WO2022226728A1 - Computing device, method and apparatus for detecting skin conditions of human subject - Google Patents


Info

Publication number
WO2022226728A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
normalized
skin
wrinkle
human subject
Prior art date
Application number
PCT/CN2021/089954
Other languages
French (fr)
Inventor
Wenna WANG
Chengda YE
Thomas Andrew STEEL
Frederic Flament
Original Assignee
L'oreal
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by L'oreal filed Critical L'oreal
Priority to CN202180097549.5A priority Critical patent/CN117241722A/en
Priority to PCT/CN2021/089954 priority patent/WO2022226728A1/en
Priority to FR2106235A priority patent/FR3122076B1/en
Publication of WO2022226728A1 publication Critical patent/WO2022226728A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/442Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/444Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet

Definitions

  • the disclosure relates to the field of cosmetics. More specifically, the disclosure relates to a computing device, a method and an apparatus for detecting skin conditions of a human subject.
  • the skin is the largest organ of the human body and one of the most important. Among its many functions, the skin provides a protective barrier against harmful substances, the harmful effects of UV radiation, and mechanical, thermal, and physical injury. The skin also acts as a sensory organ that helps perceive temperature, touch, etc. Maintaining healthy skin often requires knowledge of the state and status of several skin conditions. In the existing prior art, the presence or absence of pores, wrinkles, skin tone, spots and blackheads is often used to measure the skin conditions of a user. Though the existing prior art discloses technical solutions for detecting a user's skin conditions, those solutions are incomplete in the dimensions of skin detection and are also not accurate.
  • a significant object or appearance unit including computational circuitry configured to extract significant object or appearance data from one or more digital images of a region of skin of the human subject; a normalization unit including computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and a skin prediction unit including computational circuitry configured to predict a skin condition data based on an admix of the normalized skin characteristic data.
  • the significant object or appearance data includes acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, skin tone data, and the like.
  • the significant object or appearance data includes presence, absence, or severity of pore data; presence, absence, or severity of wrinkle data; presence, absence, or severity of blackhead data; presence, absence, or severity of acne data; and the like.
  • the significant object or appearance data includes changes in reflection data, changes in hue data, changes in wrinkle data, changes in spot data, changes in dark circle data, and the like.
  • the skin condition data includes skin texture data, skin tone data, skin tonicity data, skin translucency data, or the like.
  • the predicted skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like.
  • the predicted skin condition data includes changes in skin tone data, changes in skin translucency data, changes in skin texture data, changes in skin tonicity data, or the like.
  • the predicted skin condition data includes data indicative of the presence, absence, severity, or a change in a condition associated with skin tone, skin translucency, skin texture, skin tonicity, or the like.
  • normalized data includes normalized skin object or appearance data, normalized skin characteristic data, or the like.
  • normalized data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, normalized skin tone data, or the like.
  • the computing device further comprises a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data.
  • the normalized skin characteristic data are, for example, normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, normalized skin tone data, or the like.
  • the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data.
  • the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of P_n_reflection, P_n_hue, P_n_wrinkle, P_n_spot, and P_n_dark_circle, where
  • P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle, (equation 1)
  • P_skin_translucency represents the skin translucency data
  • P_n_reflection represents the normalized reflection data
  • P_n_hue represents the normalized hue data
  • P_n_wrinkle represents the normalized wrinkle data
  • P_n_spot represents the normalized spot data
  • P_n_dark_circle represents the normalized dark circle data
  • W1, W2, W3, W4 and W5 are predefined weights
  • the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data.
  • the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of P_n_pore, P_n_wrinkle, P_n_blackhead, and P_n_acne, where
  • P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne, (equation 2)
  • P_skin_texture represents the skin texture data
  • P_n_pore represents the normalized pore data
  • P_n_wrinkle represents the normalized wrinkle data
  • P_n_blackhead represents the normalized blackhead data
  • P_n_acne represents the normalized acne data
  • W6, W7, W8, and W9 are predefined weights
  • the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data.
  • the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of P_n_skin_tone, P_n_spot, P_n_blackhead, P_n_dark_circle, P_n_sensitivity, and P_n_acne, where
  • P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne, (equation 3)
  • P_skin_tone represents evenness data of the skin tone
  • P_n_skin_tone represents the normalized skin color data
  • P_n_spot represents the normalized spot data
  • P_n_blackhead represents the normalized blackhead data
  • P_n_dark_circle represents the normalized dark circle data
  • P_n_sensitivity represents the normalized sensitivity data
  • P_n_acne represents the normalized acne data
  • W10, W11, W12, W13, W14 and W15 are predefined weights
  • skin tone means the evenness of skin tone.
  • the melanocytes in the epidermis are the main factors that determine skin color. Skin aging, acne or other problems may increase pigmentation of the face and affect the skin color and its evenness. Dark circles under the eyes and inflammatory acne can also change the skin tone.
  • normalized skin tone means the normalized skin color
  • the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
  • the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of P_n_age, P_n_tonicity, and P_n_wrinkle, where
  • P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle, (equation 4)
  • P_skin_tonicity represents tightness data of human skin
  • P_n_age represents the normalized age data
  • P_n_tonicity represents the normalized skin sagging data on the facial contour
  • P_n_wrinkle represents the normalized wrinkle data
  • W16, W17 and W18 are predefined weights
  • skin tonicity means tightness data of human skin.
  • the loss of elastin and collagen in the dermis, changes in subcutaneous tissue, and gravity can lead to facial skin laxity, manifested as skin sagging on the cheek, jaw, and neck, nasolabial folds, eye bags, etc.
  • a method for detecting skin conditions of a human subject comprising extracting significant object or appearance data from one or more digital images of a region of skin of a human subject; generating normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and predicting a skin condition data based on an admix of the normalized skin characteristic data.
  • the method further comprises extracting acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data from the one or more digital images of the region of skin.
  • said method also comprises generating normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
  • the method further comprises extracting data indicative of a presence, absence, or severity of a skin condition from the one or more digital images of the region of skin.
  • the significant object or appearance data includes acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, skin tone data, and the like.
  • the significant object or appearance data includes presence, absence, or severity of pore data; presence, absence, or severity of wrinkle data; presence, absence, or severity of blackhead data; presence, absence, or severity of acne data; and the like.
  • the significant object or appearance data includes changes in reflection data, changes in hue data, changes in wrinkle data, changes in spot data, changes in dark circle data, and the like.
  • the skin condition data includes skin texture data, skin tone data, skin tonicity data, skin translucency data, or the like.
  • the predicted skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like.
  • the predicted skin condition data includes changes in skin tone data, changes in skin translucency data, changes in skin texture data, changes in skin tonicity data, or the like.
  • the predicted skin condition data includes data indicative of the presence, absence, severity, or a change in a condition associated with skin tone, skin translucency, skin texture, skin tonicity, or the like.
  • normalized skin characteristic data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, normalized skin tone data, or the like.
  • the method further comprises displaying on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data.
  • the skin translucency data is predicted based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data.
  • skin texture data is predicted based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data.
  • skin tone data is predicted based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data.
  • skin tonicity data is predicted based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
  • the skin translucency data is predicted by the above-mentioned equation 1.
  • the skin texture data is predicted by the above-mentioned equation 2.
  • the skin tone data is predicted by the above-mentioned equation 3.
  • the skin tonicity data is predicted by the above-mentioned equation 4.
  • a computer readable medium having stored thereon instructions that when executed cause a computing device to perform the above-mentioned method.
  • an apparatus for detecting conditions of a user's skin comprises means for performing the above-mentioned method.
  • conditions of a user's skin can be measured accurately and completely by using four specific new parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity.
  • Fig. 1 illustrates a block diagram of a computing device in accordance with a first aspect of the present disclosure
  • FIG. 2 illustrates a flowchart of a method of detecting skin conditions of a human subject in accordance with a second aspect of the present disclosure
  • FIG. 3 illustrates a flowchart of a method of predicting a skin translucency data of the human subject in accordance with a first embodiment of the second aspect of the present disclosure
  • FIG. 4 illustrates a flowchart of a method of predicting a skin texture data of the human subject in accordance with a second embodiment of the second aspect of the present disclosure
  • FIG. 5 illustrates a flowchart of a method of predicting a skin tone data of the human subject in accordance with a third embodiment of the second aspect of the present disclosure
  • FIG. 6 illustrates a flowchart of a method of predicting a skin tonicity data of the human subject in accordance with a fourth embodiment of the second aspect of the present disclosure.
  • the present technology may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.).
  • the present technology may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system.
  • a computer-usable or computer-readable medium may be any medium that may contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the inventive concept of the disclosure is to propose four new parameters related to skin conditions based on a user-perception-decoding statistical model; the four new parameters are skin texture, skin tone, skin translucency and skin tonicity, which are collectively called 4T.
  • 4T: skin texture, skin tone, skin translucency and skin tonicity
  • Fig. 1 illustrates a block diagram of a computing device 100 in accordance with a first aspect of the present disclosure.
  • the computing device 100 comprises a significant object or appearance unit 101, a normalization unit 102 and a skin prediction unit 103.
  • the significant object or appearance unit 101 is to extract significant object or appearance data from one or more digital images of a region of skin of a human subject.
  • the significant object or appearance unit 101 can be, for example, a spectrometer, a mobile device, a portable device and so on, capable of emitting five spectra to achieve five-spectrum imaging.
  • such significant object or appearance unit 101 comprises computational circuitry which is configured to emit five light sources penetrating from the epidermis to the true skin (dermis) to perform five-spectrum imaging so that underlying skin problems are found.
  • the significant object or appearance unit 101 captures images of the user's skin by scanning or photographing the user's skin.
  • images of the user's skin can also be saved beforehand.
  • the user's skin can be any skin on the user's body including, but not limited to, the face, neck, hands, feet and so on.
  • significant object or appearance data such as acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • skin tone data means skin color data
  • tonicity data means the skin sagging data on the facial contour.
  • a normalization unit 102 includes computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin.
  • the normalized skin characteristic data are, for example, normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
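  • As an illustrative sketch only (the disclosure does not specify which normalization algorithm the normalization unit 102 uses), min-max scaling is one common way to map each raw parameter onto [0, 1]; the function name and the 0-100 instrument scale below are assumptions, not part of the disclosure:

```python
def normalize(value, min_value, max_value):
    """Min-max scale a raw skin-parameter value into the range [0, 1]."""
    if max_value == min_value:
        raise ValueError("max_value must differ from min_value")
    return (value - min_value) / (max_value - min_value)

# Example: a raw wrinkle score of 30 on an assumed 0-100 scale -> 0.3.
p_n_wrinkle = normalize(30, 0, 100)
```

Any monotone scaling that brings the heterogeneous raw parameters onto a common range would serve the same purpose of making the weighted admixtures below comparable across dimensions.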
  • the skin prediction unit 103 includes computational circuitry configured to predict a skin condition data based on an admix of the normalized skin characteristic data.
  • a skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like.
  • the predicted skin tone means the predicted evenness of skin tone, the predicted skin tonicity means the predicted tightness of human skin, and the normalized skin tone means the normalized skin color.
  • a plurality of skin condition parameters data can be obtained locally or in a cloud server.
  • the images are processed by known algorithms corresponding to different skin condition parameters stored in the computing devices to obtain the plurality of skin condition parameters data.
  • the images are transmitted to the cloud server. Then, the images are processed by known algorithms corresponding to different skin condition parameters stored in the cloud server to obtain the plurality of skin condition parameters data.
  • the plurality of skin condition parameters are, for example, normalized reflection data, normalized hue data, wrinkle data, spot data, dark circle data, pore data, blackhead data, acne data, normalized skin tone data, sensitivity data, normalized age data and tonicity data. Then, the plurality of skin condition parameters are normalized to generate normalized skin condition parameters.
  • a first set of data comprising the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data is obtained; a second set of data comprising the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized acne data is obtained; a third set of data comprising the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data is obtained; a fourth set of data comprising the normalized age data, the normalized tonicity data and the normalized wrinkle data is obtained.
  • Said normalized reflection data can be obtained by a known algorithm corresponding to the normalized reflection based on captured images.
  • Said normalized hue data can be obtained by a known algorithm corresponding to the normalized hue based on captured images.
  • Said wrinkle data can be obtained by a known algorithm corresponding to the wrinkle based on captured images.
  • Said spot data can be obtained by a known algorithm corresponding to the spot based on captured images.
  • Said dark circle data can be obtained by a known algorithm corresponding to the dark circle based on captured images.
  • Said pore data can be obtained by a known algorithm corresponding to the pore based on captured images.
  • Said blackhead data can be obtained by a known algorithm corresponding to the blackhead based on captured images.
  • Said acne data can be obtained by a known algorithm corresponding to the acne based on captured images.
  • Said normalized skin tone data can be obtained by a known algorithm corresponding to the normalized skin tone based on captured images.
  • Said sensitivity data can be obtained by a known algorithm corresponding to the sensitivity based on captured images.
  • Said normalized age data can be obtained by a known algorithm corresponding to the normalized age based on captured images.
  • Said tonicity data can be obtained by a known algorithm corresponding to the tonicity based on captured images. Then, these data are further normalized to generate normalized data.
  • the skin prediction unit 103 comprises computational circuitry which is configured to predict at least one of skin translucency data based on the first set of data, skin texture data based on the second set of data, skin tone data based on the third set of data, and skin tonicity data based on the fourth set of data.
  • the skin prediction unit 103 comprises computational circuitry which is configured to predict the skin translucency data by weighting the first set of data. More particularly, the skin translucency data is determined by the following equation:
  • P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle, (equation 1)
  • P_skin_translucency represents the skin translucency data
  • P_n_reflection represents the normalized reflection data
  • P_n_hue represents the normalized hue data
  • P_n_wrinkle represents the normalized wrinkle data
  • P_n_spot represents the normalized spot data
  • P_n_dark_circle represents the normalized dark circle data
  • the weights W1, W2, W3, W4 and W5 are calculated in advance based on experience and research data.
  • the weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server.
  • W1 is 28%-45%
  • W2 is 28%-45%
  • W3 is 5%-20%
  • W4 is 2%-15%
  • W5 is 2%-15%.
  • W1, W2, W3, W4 and W5 are not limited to such value ranges.
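  • Equation 1 is a plain weighted sum, so it can be sketched as follows; the function name, dictionary keys, and the concrete weights (0.35, 0.30, 0.15, 0.10, 0.10, chosen inside the ranges stated above and summing to 1) are illustrative assumptions, not values given by the disclosure:

```python
def predict_skin_translucency(p, w=(0.35, 0.30, 0.15, 0.10, 0.10)):
    """Equation 1: weighted admixture of the first set of normalized data."""
    keys = ("reflection", "hue", "wrinkle", "spot", "dark_circle")
    return sum(wi * p[k] for wi, k in zip(w, keys))

scores = {"reflection": 0.8, "hue": 0.6, "wrinkle": 0.4,
          "spot": 0.5, "dark_circle": 0.3}
translucency = predict_skin_translucency(scores)  # approximately 0.60
```

Because the assumed weights sum to 1 and every input lies in [0, 1], the predicted score also stays in [0, 1].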
  • the skin prediction unit 103 comprises computational circuitry which is configured to also predict the skin texture data by weighting the second set of data. More particularly, the skin prediction unit 103 is to also predict the skin texture data by the following equation:
  • P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne, (equation 2)
  • P_skin_texture represents the skin texture data
  • P_n_pore represents the normalized pore data
  • P_n_wrinkle represents the normalized wrinkle data
  • P_n_blackhead represents the normalized blackhead data
  • P_n_acne represents the normalized acne data
  • the weights W6, W7, W8 and W9 are calculated in advance based on experience and research data.
  • the weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server.
  • W6 is 15%-50%
  • W7 is 15%-50%
  • W8 is 12%-35%
  • W9 is 12%-30%.
  • W6, W7, W8 and W9 are not limited to such value ranges.
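  • Equation 2 can be sketched the same way; the function name and the weights (0.30, 0.30, 0.20, 0.20) are assumed illustrative values lying inside the ranges stated above, not values from the disclosure:

```python
def predict_skin_texture(p_n_pore, p_n_wrinkle, p_n_blackhead, p_n_acne,
                         w6=0.30, w7=0.30, w8=0.20, w9=0.20):
    """Equation 2: weighted admixture of the second set of normalized data."""
    return (w6 * p_n_pore + w7 * p_n_wrinkle
            + w8 * p_n_blackhead + w9 * p_n_acne)

texture = predict_skin_texture(0.7, 0.4, 0.5, 0.2)  # approximately 0.47
```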
  • the skin prediction unit 103 comprises computational circuitry which is configured to also predict the skin tone data by weighting the third set of data. More particularly, the skin prediction unit 103 is to also determine the skin tone data by the following equation:
  • P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne, (equation 3)
  • P_skin_tone represents evenness data of the skin tone
  • P_n_skin_tone represents the normalized skin color data
  • P_n_spot represents the normalized spot data
  • P_n_blackhead represents the normalized blackhead data
  • P_n_dark_circle represents the normalized dark circle data
  • P_n_sensitivity represents the normalized sensitivity data
  • P_n_acne represents the normalized acne data
  • the weights W10, W11, W12, W13, W14 and W15 are calculated in advance based on experience and research data.
  • the weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server.
  • W10 is 22%-50%; W11 is 8%-35%; W12 is 3%-20%; W13 is 3%-20%; W14 is 3%-20%; and W15 is 3%-20%.
  • W10, W11, W12, W13, W14 and W15 are not limited to such value ranges.
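  • Equation 3 follows the same pattern; the function name, dictionary keys, and the weights (0.30, 0.20, 0.15, 0.15, 0.10, 0.10) are illustrative assumptions chosen inside the ranges stated above:

```python
def predict_skin_tone(p, w=(0.30, 0.20, 0.15, 0.15, 0.10, 0.10)):
    """Equation 3: weighted admixture of the third set of normalized data."""
    keys = ("skin_tone", "spot", "blackhead", "dark_circle",
            "sensitivity", "acne")
    return sum(wi * p[k] for wi, k in zip(w, keys))

evenness = predict_skin_tone({"skin_tone": 0.9, "spot": 0.5,
                              "blackhead": 0.4, "dark_circle": 0.6,
                              "sensitivity": 0.2, "acne": 0.3})  # approx. 0.57
```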
  • the skin prediction unit 103 comprises computational circuitry which is configured to also predict the skin tonicity data by weighting the fourth set of data. More particularly, the skin prediction unit 103 is to also predict the skin tonicity by the following equation:
  • P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle, (equation 4)
  • P_skin_tonicity represents tightness data of human skin
  • P_n_age represents the normalized age data
  • P_n_tonicity represents the normalized skin sagging data on the facial contour
  • P_n_wrinkle represents the normalized wrinkle data
  • the weights W16, W17 and W18 are calculated in advance based on experience and research data.
  • the weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server.
  • W16 is 25%-65%; W17 is 20%-50%; and W18 is 20%-50%.
  • W16, W17 and W18 are not limited to such value ranges.
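  • Equation 4 can be sketched in the same manner; the function name and the weights (0.40, 0.30, 0.30) are assumed illustrative values within the ranges stated above:

```python
def predict_skin_tonicity(p_n_age, p_n_tonicity, p_n_wrinkle,
                          w16=0.40, w17=0.30, w18=0.30):
    """Equation 4: weighted admixture of the fourth set of normalized data."""
    return w16 * p_n_age + w17 * p_n_tonicity + w18 * p_n_wrinkle

tonicity = predict_skin_tonicity(0.5, 0.6, 0.4)  # approximately 0.50
```

In a deployment matching the description above, these weights would not be hard-coded defaults but fetched from the application server so that adjusted weights take effect in real time.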
  • the computing device 100 also comprises a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the predicted skin condition data.
  • the predicted skin condition data such as skin translucency, skin texture, skin tone and skin tonicity can be indicated as a numerical value or in a visual form on the graphical user interface.
  • the computing device 100 also comprises a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the extracted significant object and the normalized skin characteristic such as normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
  • the extracted significant object and the normalized skin characteristic such as normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data can be indicated in a numerical value or a visual form on the graphical user interface.
  • the computing device 100 also comprises the significant object or appearance unit including computational circuitry configured to obtain data indicative of a presence, absence, or severity of a skin condition from the one or more digital images of the region of skin.
  • Computing device 100 can be, for example, a server of a service provider, a device associated with a client (e.g., a client device) , a system on a chip, and/or any other suitable computing device or computing system.
  • computing device 100 can take a variety of different configurations.
  • computing device 100 can be implemented as a computer-like device including a personal computer, desktop computer, multi-screen computer, laptop computer, netbook, and the like.
  • Computing device 100 can also be implemented as a mobile device-like device that includes mobile devices such as mobile phones, portable music players, portable gaming devices, tablet computers, multi-screen computers, and the like.
  • Computing device 100 can also be implemented as a television-like device that includes a device having or connected to a generally larger screen in a casual viewing environment. These devices include televisions, set-top boxes, game consoles, and the like.
  • computational circuitry includes, among other things, one or more computing devices such as a processor (e.g., a microprocessor, a quantum processor, qubit processor, etc. ) , a central processing unit (CPU) , a digital signal processor (DSP) , an application-specific integrated circuit (ASIC) , a field programmable gate array (FPGA) , and the like, or any combinations thereof, and can include discrete digital or analog circuit elements or electronics, or combinations thereof.
  • computational circuitry includes one or more ASICs having a plurality of predefined logic components.
  • computational circuitry includes one or more FPGAs, each having a plurality of programmable logic components.
  • computation circuitry includes one or more electric circuits, printed circuits, flexible circuits, electrical conductors, electrodes, cavity resonators, conducting traces, ceramic patterned electrodes, electro-mechanical components, transducers, and the like.
  • computational circuitry includes one or more components operably coupled (e.g., communicatively, electromagnetically, magnetically, ultrasonically, optically, inductively, electrically, capacitively coupled, wirelessly coupled, and the like) to each other.
  • circuitry includes one or more remotely located components.
  • remotely located components are operably coupled, for example, via wireless communication.
  • remotely located components are operably coupled, for example, via one or more communication modules, receivers, transmitters, transceivers, and the like.
  • computation circuitry includes memory that, for example, stores instructions or information.
  • memory includes volatile memory (e.g., Random Access Memory (RAM) , Dynamic Random Access Memory (DRAM) , and the like) , non-volatile memory (e.g., Read-Only Memory (ROM) , Electrically Erasable Programmable Read-Only Memory (EEPROM) , Compact Disc Read-Only Memory (CD-ROM) , and the like) , persistent memory, and the like.
  • memory is coupled to, for example, one or more computing devices by one or more instructions, information, or power buses.
  • computational circuitry includes one or more databases stored in memory.
  • computational circuitry includes one or more look-up tables stored in memory.
  • computational circuitry includes one or more computer-readable media drives, interface sockets, Universal Serial Bus (USB) ports, memory card slots, and the like, and one or more input/output components such as, for example, a graphical user interface, a display, a keyboard, a keypad, a trackball, a joystick, a touch-screen, a mouse, a switch, a dial, and the like, and any other peripheral device.
  • computational circuitry includes one or more user input/output components that are operably coupled to at least one computing device configured to control (electrical, electromechanical, software-implemented, firmware-implemented, or other control, or combinations thereof) at least one parameter associated with, for example, determining one or more tissue thermal properties responsive to detected shifts in turn-on voltage.
  • computational circuitry includes electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein) , electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc. ) ) , electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc. ) , and/or any non-electrical analog thereto, such as optical or other analogs.
  • the computing device 100 in accordance with a first aspect of the present disclosure can provide accurate and complete skin conditions measurements.
  • FIG. 2 illustrates a flowchart of a method 200 of predicting skin conditions of a human subject in accordance with a second aspect of the present disclosure.
  • the method 200 starts at block 201.
  • significant object or appearance data from one or more digital images of a region of skin of a human subject are extracted by a significant object or appearance unit.
  • the significant object or appearance unit can be a spectrometer, a mobile device, a portable device and so on, capable of emitting five light spectra to achieve five-spectrum imaging.
  • Such significant object or appearance unit can emit five light sources penetrating from the epidermis to the dermis to perform five-spectrum imaging so that underlying skin problems are found.
  • the significant object or appearance unit captures images of the user's skin by scanning or taking photos of the user's skin.
  • images of the user's skin can be saved beforehand.
  • the user's skin can be any skin on the user's body including, but not limited to, a face, neck, hand, foot and so on.
  • significant object or appearance data such as acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • skin tone data means skin color data
  • tonicity data means the skin sagging data on the facial contour.
  • normalized skin characteristic data is generated based on the significant object or appearance data extracted from the one or more digital images of the region of skin.
  • normalized skin characteristic data are such as normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
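The disclosure does not specify the normalization scheme. As one hypothetical, non-limiting sketch, min-max scaling maps each raw characteristic score into [0, 1]; the function name and the optional fixed bounds are illustrative assumptions, not part of the disclosure:

```python
def min_max_normalize(values, lo=None, hi=None):
    """Scale raw skin characteristic scores into [0, 1].

    `lo`/`hi` can be fixed reference bounds (e.g. from research data);
    if omitted, the observed minimum and maximum are used. Min-max
    scaling is an illustrative choice, not a scheme mandated by the
    disclosure.
    """
    lo = min(values) if lo is None else lo
    hi = max(values) if hi is None else hi
    if hi == lo:  # degenerate range: every score maps to 0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]
```

Normalizing all extracted characteristics onto a common scale is what makes the weighted admixtures of equations 1-4 meaningful, since the weights then express relative importance rather than unit conversions.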
  • a skin condition data is predicted based on an admix of the normalized skin characteristic data.
  • a skin condition data includes skin sensitivity, skin texture, skin tone, skin tonicity, skin translucency, or the like. Note that as mentioned above, here the predicted skin tone means predicted evenness of skin tone and the predicted skin tonicity means predicted tightness of human skin.
  • acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data from the one or more digital images of the region of skin are extracted.
  • the order of extracting the reflection data, the hue data, the wrinkle data, the spot data, the dark circle data, the pore data, the blackhead data, the acne data, the skin tone data, the sensitivity data, the age data, and the tonicity data can be changed, or the extractions can be performed simultaneously.
  • normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data is generated.
  • Said reflection data can be obtained by a known algorithm corresponding to the reflection based on captured images.
  • Said hue data can be obtained by a known algorithm corresponding to the hue based on captured images.
  • Said wrinkle data can be obtained by a known algorithm corresponding to the wrinkle based on captured images.
  • Said spot data can be obtained by a known algorithm corresponding to the spot based on captured images.
  • Said dark circle data can be obtained by a known algorithm corresponding to the dark circle based on captured images.
  • Said pore data can be obtained by a known algorithm corresponding to the pore based on captured images.
  • Said blackhead data can be obtained by a known algorithm corresponding to the blackhead based on captured images.
  • Said acne data can be obtained by a known algorithm corresponding to the acne based on captured images.
  • Said skin tone data can be obtained by a known algorithm corresponding to the skin tone based on captured images.
  • Said sensitivity data can be obtained by a known algorithm corresponding to the sensitivity based on captured images.
  • Said age data can be obtained by a known algorithm corresponding to the age based on captured images.
  • Said tonicity data can be obtained by a known algorithm corresponding to the tonicity based on captured images. Then, these data are normalized to generate normalized data.
  • the first set of data comprises the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data;
  • the second set of data comprises the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized acne data;
  • the third set of data comprises the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data;
  • the fourth set of data comprises the normalized age data, the normalized tonicity data and the normalized wrinkle data.
  • by predicting at least one of a skin translucency data, a skin texture data, a skin tone data and a skin tonicity data, skin conditions of a user can be detected completely and accurately.
  • FIG. 3 illustrates a flowchart of a method 300 of determining a skin translucency data in accordance with a first embodiment of the second aspect of the present disclosure.
  • Block 301 is similar to block 201 in Fig. 2.
  • significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • significant object or appearance data is reflection data, hue data, wrinkle data, spot data and dark circle data.
  • a normalized reflection data, a normalized hue data, a normalized wrinkle data, a normalized spot data and a normalized dark circle data are generated based on the extracted reflection data, the extracted hue data, the extracted wrinkle data, the extracted spot data and the extracted dark circle data. More particularly, the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data can be obtained according to corresponding algorithms for reflection, hue, wrinkle, spot and dark circle based on captured images of the user's skin. Blocks 301, 302 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data, and these data are normalized.
  • captured images are transmitted to the cloud server; then by corresponding algorithms stored in the cloud server, captured images are processed to obtain the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data, then these data are normalized; finally, the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data are transmitted back to the user's device.
  • the skin translucency data is predicted by the above-mentioned equation (1) .
  • P skin translucency = W 1 × P n_reflection + W 2 × P n_hue + W 3 × P n_wrinkle + W 4 × P n_spot + W 5 × P n_dark circle
  • P skin translucency represents the skin translucency data
  • P n_reflection represents the normalized reflection data
  • P n_hue represents the normalized hue data
  • P n_wrinkle represents the normalized wrinkle data
  • P n_spot represents the normalized spot data
  • P n_dark circle represents the normalized dark circle data
  • W 1 is 28%-45%; W 2 is 28%-45%; W 3 is 5%-20%; W 4 is 2%-15%; and W 5 is 2%-15%.
  • W 1 , W 2 , W 3 , W 4 and W 5 are not limited to such value ranges.
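As a non-limiting sketch, equation 1 translates directly into Python. The default weights below are hypothetical values chosen inside the disclosed ranges (W 1: 28%-45%, W 2: 28%-45%, W 3: 5%-20%, W 4: 2%-15%, W 5: 2%-15%) and sum to 1:

```python
def predict_skin_translucency(p_n_reflection, p_n_hue, p_n_wrinkle,
                              p_n_spot, p_n_dark_circle,
                              weights=(0.35, 0.35, 0.10, 0.10, 0.10)):
    """Predict skin translucency per equation 1 (weights are W1..W5 in order)."""
    w1, w2, w3, w4, w5 = weights
    if abs(w1 + w2 + w3 + w4 + w5 - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return (w1 * p_n_reflection + w2 * p_n_hue + w3 * p_n_wrinkle
            + w4 * p_n_spot + w5 * p_n_dark_circle)
```

The reflection and hue terms dominate under any weights in the disclosed ranges, which matches the intuition that translucency is driven mainly by how the skin reflects and colors light.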
  • FIG. 4 illustrates a flowchart of a method 400 of predicting a skin texture data in accordance with a second embodiment of the second aspect of the present disclosure.
  • the method 400 starts at block 401.
  • significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • significant object or appearance data is pore data, wrinkle data, blackhead data and acne data.
  • a normalized pore data, a normalized wrinkle data, a normalized blackhead data and a normalized acne data are generated. More particularly, the pore data, the wrinkle data, the blackhead data and the acne data can be obtained according to corresponding algorithms for pore, wrinkle, blackhead and acne based on captured images of the user's skin.
  • Blocks 402, 403 can be performed locally or in a cloud server.
  • captured images are processed to obtain the pore data, the wrinkle data, the blackhead data and the acne data.
  • captured images are transmitted to the cloud server; then by corresponding algorithms stored in the cloud server, captured images are processed to obtain the pore data, the wrinkle data, the blackhead data and the acne data, then these data are normalized; finally, the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized acne data are transmitted back to the user's device.
  • the skin texture data is predicted by the above-mentioned equation (2) :
  • P skin texture = W 6 × P n_pore + W 7 × P n_wrinkle + W 8 × P n_blackhead + W 9 × P n_acne
  • P skin texture represents the skin texture data
  • P n_pore represents the normalized pore data
  • P n_wrinkle represents the normalized wrinkle data
  • P n_blackhead represents the normalized blackhead data
  • P n_acne represents the normalized acne data
  • W 6 , W 7 , W 8 and W 9 are not limited to such value ranges.
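No value ranges for W 6 through W 9 are stated in this passage, so the defaults in the following non-limiting sketch are purely hypothetical; they merely satisfy the requirement that the weights sum to 1:

```python
def predict_skin_texture(p_n_pore, p_n_wrinkle, p_n_blackhead, p_n_acne,
                         weights=(0.40, 0.30, 0.15, 0.15)):
    """Predict skin texture per equation 2 (weights are W6..W9 in order).

    The default weights are invented for illustration; the disclosure
    gives no ranges for W6-W9 in this passage.
    """
    w6, w7, w8, w9 = weights
    if abs(w6 + w7 + w8 + w9 - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return (w6 * p_n_pore + w7 * p_n_wrinkle
            + w8 * p_n_blackhead + w9 * p_n_acne)
```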
  • FIG. 5 illustrates a flowchart of a method 500 of determining a skin tone data in accordance with a third embodiment of the second aspect of the present disclosure.
  • skin tone data means evenness data of skin tone.
  • Block 501 is similar to block 201 in Fig. 2.
  • significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • significant object or appearance data is a skin tone data, a spot data, an acne data, a blackhead data, a dark circle data and a sensitivity data.
  • a normalized skin tone data, a normalized spot data, a normalized acne data, a normalized blackhead data, a normalized dark circle data and a normalized sensitivity data are generated based on the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data.
  • normalized skin tone data means normalized skin color data. More particularly, the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data can be obtained according to corresponding algorithms for the skin tone, the spot, the acne, the blackhead, the dark circle and the sensitivity based on captured images of the user's skin.
  • Blocks 502, 503 can be performed locally or in a cloud server.
  • captured images are processed to obtain the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data, and these data are then normalized.
  • captured images are transmitted to the cloud server; then by corresponding algorithms stored in the cloud server, captured images are processed to obtain the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data, then these data are normalized; finally, the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data are transmitted back to the user's device.
  • the skin tone data is predicted by the above-mentioned equation (3) :
  • P skin tone = W 10 × P n_skin tone + W 11 × P n_spot + W 12 × P n_blackhead + W 13 × P n_dark circle + W 14 × P n_sensitivity + W 15 × P n_acne ;
  • P skin tone represents evenness data of the skin tone
  • P n_skin tone represents the normalized skin color data
  • P n_spot represents the normalized spot data
  • P n_blackhead represents the normalized blackhead data
  • P n_dark circle represents the normalized dark circle data
  • P n_sensitivity represents the normalized sensitivity data
  • P n_acne represents the normalized acne data
  • W 10 is 22%-50%; W 11 is 8%-35%; W 12 is 3%-20%; W 13 is 3%-20%; W 14 is 3%-20%; and W 15 is 3%-20%.
  • W 10 , W 11 , W 12 , W 13 , W 14 and W 15 are not limited to such value ranges.
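Equation 3 can likewise be sketched in Python as a non-limiting illustration. The default weights are hypothetical values chosen inside the disclosed ranges (W 10: 22%-50%, W 11: 8%-35%, W 12-W 15: 3%-20% each) and sum to 1:

```python
def predict_skin_tone_evenness(p_n_skin_tone, p_n_spot, p_n_blackhead,
                               p_n_dark_circle, p_n_sensitivity, p_n_acne,
                               weights=(0.35, 0.25, 0.10, 0.10, 0.10, 0.10)):
    """Predict evenness of skin tone per equation 3 (weights are W10..W15)."""
    w10, w11, w12, w13, w14, w15 = weights
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return (w10 * p_n_skin_tone + w11 * p_n_spot + w12 * p_n_blackhead
            + w13 * p_n_dark_circle + w14 * p_n_sensitivity
            + w15 * p_n_acne)
```

Note the distinction the disclosure draws: the P n_skin tone input is normalized skin color data, while the P skin tone output is evenness of skin tone.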
  • FIG. 6 illustrates a flowchart of a method 600 of predicting a skin tonicity data in accordance with a fourth embodiment of the second aspect of the present disclosure.
  • the skin tonicity data means tightness data of human skin.
  • Block 601 is similar to block 201 in Fig. 2.
  • significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101.
  • significant object or appearance data is an age data, a tonicity data and a wrinkle data.
  • the tonicity data means the skin sagging on the facial contour.
  • a normalized age data, a normalized tonicity data and a normalized wrinkle data are generated based on the extracted age data, the extracted tonicity data and the extracted wrinkle data. More particularly, the age data, the tonicity data and the wrinkle data can be obtained according to corresponding algorithms for age, tonicity and wrinkle based on captured images of the user's skin. Blocks 602, 603 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the age data, the tonicity data and the wrinkle data, and these data are then normalized.
  • captured images are transmitted to the cloud server; then by corresponding algorithms stored in the cloud server, captured images are processed to obtain the age data, the tonicity data and the wrinkle data, then these data are normalized; finally, the normalized age data, the normalized tonicity data and the normalized wrinkle data are transmitted back to the user's device.
  • the skin tonicity is predicted by the above-mentioned equation (4) :
  • P skin tonicity = W 16 × P n_age + W 17 × P n_tonicity + W 18 × P n_wrinkle
  • P skin tonicity represents tightness data of human skin
  • P n_age represents the normalized age data
  • P n_tonicity represents the normalized skin sagging data on the facial contour
  • P n_wrinkle represents the normalized wrinkle data
  • W 16 is 25%-65%; W 17 is 20%-50%; and W 18 is 20%-50%.
  • W 16 , W 17 and W 18 are not limited to such value ranges.
  • each of method 200, method 300, method 400, method 500, method 600 also comprises displaying on a graphical user interface one or more instances of the predicted skin condition data.
  • the predicted skin condition data such as skin translucency, skin texture, skin tone and skin tonicity can be indicated in a numerical value or a visual form on the graphical user interface.
  • each of method 200, method 300, method 400, method 500, method 600 also comprises displaying on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data such as normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data, or the predicted skin condition data.
  • the extracted significant object, the normalized skin characteristic data such as normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data, or the predicted skin condition data can be indicated in a numerical value or a visual form on the graphical user interface.
  • the wrinkle data and the normalized wrinkle data in method 300 can be used in method 400, 600; the spot data and the normalized spot data in method 300 can be used in method 500; the dark circle data and the normalized dark circle data in method 300 can be used in method 500; the acne data and the normalized acne data in method 400 can be used in method 500; the blackhead data and the normalized blackhead data in method 400 can be used in method 500.
  • the user can obtain four specific parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity, which can measure the condition of a user's skin in an accurate and complete way. Then the 4T parameters can be displayed on a user's device in a more accessible and understandable descriptive form. Finally, based on the four determined parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity, a cosmetic solution is recommended to the user in order to solve the user's specific skin problems.
  • a machine-learning algorithm is introduced to establish the 4T model.
  • An embodiment of the disclosure may be an article of manufacture in which a non-transitory machine-readable medium (such as microelectronic memory) has stored thereon instructions (e.g., computer code) which program one or more data processing components (generically referred to here as a “processor” ) to perform the operations described above.
  • some of these operations might be performed by specific hardware components that contain hardwired logic (e.g., dedicated digital filter blocks and state machines) .
  • Those operations might alternatively be performed by any combination of programmed data processing components and fixed hardwired circuit components.


Abstract

A computing device for detecting skin conditions of a human subject, the computing device comprising: a significant object or appearance unit including computational circuitry configured to extract significant object or appearance data from one or more digital images of a region of skin of a human subject; a normalization unit including computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and a skin prediction unit including computational circuitry configured to predict a skin condition data based on an admix of the normalized skin characteristic data. A method for detecting skin conditions of a human subject, the method comprising: extracting significant object or appearance data from one or more digital images of a region of skin of a human subject; generating normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and predicting a skin condition data based on an admix of the normalized skin characteristic data.

Description

[Title established by the ISA under Rule 37.2] COMPUTING DEVICE, METHOD AND APPARATUS FOR DETECTING SKIN CONDITIONS OF HUMAN SUBJECT
TECHNICAL FIELD
The disclosure relates to the field of cosmetics. More specifically, the disclosure relates to a computing device, a method and an apparatus for detecting skin conditions of a human subject.
BACKGROUND
The skin is the largest organ of the human body and one of the most important. Among its many functions, the skin provides a protective barrier against harmful substances, harmful effects of UV radiation as well as mechanical, thermal, and physical injury. The skin also acts as a sensory organ that helps perceive temperature, touch, etc. Maintaining healthy skin often requires knowledge of the state and status of several skin conditions. In the existing prior art, the presence or absence of pores, wrinkles, skin tone, spots and blackheads is often used to measure the skin conditions of a user. Though the existing prior art discloses technical solutions for detecting a user's skin conditions, it is incomplete in the dimensions of skin detection and is also not accurate.
Therefore, there is a need to detect, quantify, and classify conditions of the user's skin and changes to those conditions in order to provide more complete and more accurate detection results for the user's skin.
SUMMARY
The summary is provided to introduce a selection of concepts in a simplified form that are further described below in detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Various aspects and features of the disclosure are described in further detail below.
According to a first aspect of the disclosure, there is provided a computing device for detecting skin conditions of a human subject, comprising: a significant object or appearance unit including computational circuitry configured to extract significant object or appearance data from one or more digital images of a region of skin of the human subject; a normalization unit including computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and a skin prediction unit including computational circuitry configured to predict a skin condition data based on an admix of the normalized skin characteristic data.
In an embodiment, the significant object or appearance data includes acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, skin tone data, and the like.
In an embodiment, the significant object or appearance data includes data indicative of the presence, absence, or severity of pores, wrinkles, blackheads, acne, and the like.
In an embodiment, the significant object or appearance data includes changes in reflection data, changes in hue data, changes in wrinkle data, changes in spot data, changes in dark circle data, and the like.
In an embodiment, the skin condition data includes skin texture data, skin tone data, skin tonicity data, skin translucency data, or the like.
In an embodiment, the predicted skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like.
In an embodiment, the predicted skin condition data includes changes in skin tone data, changes in skin translucency data, changes in skin texture data, changes in skin tonicity data, or the like.
In an embodiment, the predicted skin condition data includes data indicative of the presence, absence, severity, or a change in a condition associated with skin tone, skin translucency, skin texture, skin tonicity, or the like.
In an embodiment, normalized data includes normalized skin object or appearance data, normalized skin characteristic data, or the like.
In an embodiment, normalized data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data or the like.
In an embodiment, the computing device further comprises a skin condition display  including computational circuitry configured to display on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data. In an embodiment, the normalized skin characteristic data are such as normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data or the like.
In an embodiment, the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data.
More particularly, the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of P_n_reflection, P_n_hue, P_n_wrinkle, P_n_spot, and P_n_dark_circle,
wherein P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle,  (equation 1)
wherein
P_skin_translucency represents the skin translucency data,
P_n_reflection represents the normalized reflection data,
P_n_hue represents the normalized hue data,
P_n_wrinkle represents the normalized wrinkle data,
P_n_spot represents the normalized spot data,
P_n_dark_circle represents the normalized dark circle data;
wherein W1, W2, W3, W4 and W5 are predefined weights; and
wherein W1 + W2 + W3 + W4 + W5 = 1.
In an embodiment, the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data.
More particularly, the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of P_n_pore, P_n_wrinkle, P_n_blackhead, and P_n_acne;
wherein P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne;  (equation 2)
wherein
P_skin_texture represents the skin texture data,
P_n_pore represents the normalized pore data,
P_n_wrinkle represents the normalized wrinkle data,
P_n_blackhead represents the normalized blackhead data,
P_n_acne represents the normalized acne data,
wherein W6, W7, W8, and W9 are predefined weights; and
wherein W6 + W7 + W8 + W9 = 1.
In an embodiment, the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data.
More particularly, the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of P_n_skin_tone, P_n_spot, P_n_blackhead, P_n_dark_circle, P_n_sensitivity, and P_n_acne;
wherein P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne,  (equation 3)
wherein P_skin_tone represents evenness data of the skin tone;
P_n_skin_tone represents the normalized skin color data;
P_n_spot represents the normalized spot data;
P_n_blackhead represents the normalized blackhead data;
P_n_dark_circle represents the normalized dark circle data;
P_n_sensitivity represents the normalized sensitivity data;
P_n_acne represents the normalized acne data; and
wherein W10, W11, W12, W13, W14 and W15 are predefined weights; and
wherein W10 + W11 + W12 + W13 + W14 + W15 = 1.
Note that throughout the description, a new parameter "skin tone" is proposed, and the new term "skin tone" means the evenness of the skin tone. The melanocytes in the epidermis are the main factor that determines skin color. Skin aging, acne or other problems may increase pigmentation of the face and affect the skin color and its evenness. Dark circles under the eyes and inflammatory acne can also change the skin tone.
Also note that throughout the description, the term "normalized skin tone" means the normalized skin color.
In an embodiment, the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
More particularly, the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of P_n_age, P_n_tonicity, and P_n_wrinkle;
wherein P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle,  (equation 4)
wherein P_skin_tonicity represents tightness data of human skin;
P_n_age represents the normalized age data;
P_n_tonicity represents the normalized skin sagging data on the facial contour;
P_n_wrinkle represents the normalized wrinkle data; and
wherein W16, W17 and W18 are predefined weights; and
wherein W16 + W17 + W18 = 1.
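All four predictions above share the same weighted-admixture form. A minimal sketch in Python, assuming the normalized parameter values are already available in a common range; the function name `weighted_admixture` and the example numbers are illustrative, not taken from the disclosure:

```python
def weighted_admixture(values, weights):
    """Combine normalized skin parameters with predefined weights.

    values and weights are parallel sequences; equations 1-4 require
    the weights to sum to 1.
    """
    if len(values) != len(weights):
        raise ValueError("values and weights must have the same length")
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * p for w, p in zip(weights, values))

# Three illustrative normalized parameters combined with weights summing to 1:
# 0.4*0.6 + 0.3*0.5 + 0.3*0.4 = 0.51
score = weighted_admixture(values=[0.6, 0.5, 0.4], weights=[0.4, 0.3, 0.3])
```

Because each equation constrains its weights to sum to 1, the predicted score stays in the same range as the normalized inputs.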
Note that throughout the description, a new parameter "skin tonicity" is proposed, and the new term "skin tonicity" means tightness data of human skin. As skin ages, the loss of elastin and collagen in the dermis, changes in subcutaneous tissue, and gravity can lead to facial skin laxity, manifested as skin sagging on the cheek, jaw, and neck, nasolabial folds, eye bags, etc.
Also note that throughout the description, the term "tonicity" means the skin sagging on the facial contour.
According to a second aspect of the disclosure, there is provided a method for detecting skin conditions of a human subject, the method comprising: extracting significant object or appearance data from one or more digital images of a region of skin of a human subject; generating normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and predicting skin condition data based on an admix of the normalized skin characteristic data.
In an embodiment, the method further comprises extracting acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data from the one or more digital images of the region of skin.
In an embodiment, the method further comprises generating normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
In an embodiment, the method further comprises extracting data indicative of a presence, absence, or severity of a skin condition from the one or more digital images of the region of skin.
In an embodiment, the significant object or appearance data includes acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data, and the like.
In an embodiment, the significant object or appearance data includes presence, absence, or severity of pore data; presence, absence, or severity of wrinkle data; presence, absence, or severity of blackhead data; presence, absence, or severity of acne data; and the like.
In an embodiment, the significant object or appearance data includes changes in reflection data, changes in hue data, changes in wrinkle data, changes in spot data, changes in dark circle data, and the like.
In an embodiment, the skin condition data includes skin texture data, skin tone data, skin tonicity data, skin translucency data, or the like.
In an embodiment, the predicted skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like.
In an embodiment, the predicted skin condition data includes changes in skin tone data, changes in skin translucency data, changes in skin texture data, changes in skin tonicity data, or the like.
In an embodiment, the predicted skin condition data includes data indicative of the presence, absence, severity, or a change in a condition associated with skin tone, skin translucency, skin texture, skin tonicity, or the like.
In an embodiment, the normalized skin characteristic data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data, or the like.
In an embodiment, the method further comprises displaying on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data.
In an embodiment, the skin translucency data is predicted based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data. In an embodiment, the skin texture data is predicted based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data. In an embodiment, the skin tone data is predicted based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data. In an embodiment, the skin tonicity data is predicted based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
More particularly, the skin translucency data is predicted by the above mentioned equation 1, the skin texture data is predicted by the above mentioned equation 2, the skin tone data is predicted by the above mentioned equation 3 and the skin tonicity data is predicted by the above mentioned equation 4.
According to a third aspect of the disclosure, there is provided a computer readable medium having stored thereon instructions that when executed cause a computing device to perform the above-mentioned method.
According to a fourth aspect of the disclosure, there is provided an apparatus for detecting conditions of a user's skin, the apparatus comprising means for performing the above-mentioned method.
According to the disclosure, the conditions of a user's skin can be measured accurately and completely by using four specific new parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features, and benefits of various embodiments of the disclosure will become more fully apparent, by way of example, from the following detailed description with reference to the accompanying drawings, in which like reference numerals or letters are used to designate like or equivalent elements. The drawings are illustrated to facilitate better understanding of the embodiments of the disclosure and are not necessarily drawn to scale, in which:
Fig. 1 illustrates a block diagram of a computing device in accordance with a first aspect of the present disclosure;
FIG. 2 illustrates a flowchart of a method of detecting skin conditions of a human subject in accordance with a second aspect of the present disclosure;
FIG. 3 illustrates a flowchart of a method of predicting a skin translucency data of the human subject in accordance with a first embodiment of the second aspect of the present disclosure;
FIG. 4 illustrates a flowchart of a method of predicting a skin texture data of the human subject in accordance with a second embodiment of the second aspect of the present disclosure;
FIG. 5 illustrates a flowchart of a method of predicting a skin tone data of the human subject in accordance with a third embodiment of the second aspect of the present disclosure;
FIG. 6 illustrates a flowchart of a method of predicting a skin tonicity data of the human subject in accordance with a fourth embodiment of the second aspect of the present disclosure.
DETAILED DESCRIPTION
Embodiments herein will be described in detail hereinafter with reference to the accompanying drawings, in which embodiments are shown. These embodiments herein may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. The elements of the drawings are not necessarily to scale relative to each other. Like numbers refer to like elements throughout.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including" when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meanings as commonly understood. It will be further understood that a term used herein should be interpreted as having a meaning consistent with its meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present technology is described below with reference to block diagrams and/or flowchart illustrations of methods, apparatus (systems) and/or computer program products according to the present embodiments. It is understood that blocks of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by computer program instructions. These computer program instructions may be provided to a processor, controller or controlling unit of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the present technology may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc. ) . Furthermore, the present technology may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. In the context of this document, a computer-usable or computer-readable medium may be any medium that may contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
Embodiments herein will be described below with reference to the drawings.
The inventive concept of the disclosure is to propose four new parameters related to skin conditions based on a user perception decoding statistical model. The four new parameters are skin texture, skin tone, skin translucency and skin tonicity, collectively called the 4T parameters. By using the 4T parameters, the conditions of a user's skin can be measured accurately and completely.
Fig. 1 illustrates a block diagram of a computing device 100 in accordance with a first aspect of the present disclosure.
In an embodiment, the computing device 100 comprises a significant object or appearance unit 101, a normalization unit 102 and a skin prediction unit 103. The significant object or appearance unit 101 is configured to extract significant object or appearance data from one or more digital images of a region of skin of a human subject. The significant object or appearance unit 101 can be, for example, a spectrometer, a mobile device, a portable device and so on, capable of emitting five spectra to achieve five-spectrum imaging. For example, such a significant object or appearance unit 101 comprises computational circuitry configured to emit five light sources that penetrate from the epidermis down to the underlying skin to perform five-spectrum imaging, so that underlying skin problems can be found. More particularly, the significant object or appearance unit 101 captures images of the user's skin by scanning or photographing the user's skin. As an alternative, images of the user's skin can be saved beforehand. Further, the user's skin can be any skin on the user's body, including but not limited to the face, neck, hands, feet and so on.
In an example, significant object or appearance data such as acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. Note that, as mentioned above, here skin tone data means skin color data and tonicity data means the skin sagging data on the facial contour.
The normalization unit 102 includes computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin. In an example, the normalized skin characteristic data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
The skin prediction unit 103 includes computational circuitry configured to predict skin condition data based on an admix of the normalized skin characteristic data. In an example, the skin condition data includes a predicted skin sensitivity, a predicted skin texture, a predicted skin tone, a predicted skin tonicity, a predicted skin translucency, or the like. Note that, as mentioned above, here the predicted skin tone means the predicted evenness of the skin tone, the predicted skin tonicity means the predicted tightness of human skin, and the normalized skin tone means the normalized skin color.
By using detection, identification, or classification technologies aided by Artificial Intelligence (AI), machine learning, deep learning, etc., areas with skin problems can be identified quickly or with greater accuracy. According to images of those identified areas, a plurality of skin condition parameter data can be obtained locally or in a cloud server. In the local case, the images are processed by known algorithms, corresponding to different skin condition parameters, stored in the computing device to obtain the plurality of skin condition parameter data. In the cloud server case, the images are transmitted to the cloud server, where they are processed by known algorithms, corresponding to different skin condition parameters, stored in the cloud server to obtain the plurality of skin condition parameter data. The plurality of skin condition parameters are, for example, normalized reflection data, normalized hue data, wrinkle data, spot data, dark circle data, pore data, blackhead data, acne data, normalized skin tone data, sensitivity data, normalized age data and tonicity data. The plurality of skin condition parameters are then normalized to generate normalized skin condition parameters.
In the existing prior art, factors such as pores, wrinkles, skin tone, spots and blackheads are used to measure the skin conditions of a user. However, it has been proved that such measurements of skin conditions are neither accurate nor complete. In the disclosure, a unique statistical model was trained to connect objective measurement and subjective feeling at a quantitative level, based on image analysis and consumer perception. More particularly, specific combinations of a plurality of normalized skin condition factors are considered, such as a normalized reflection, a normalized hue, a normalized wrinkle, a normalized spot, a normalized dark circle, a normalized pore, a normalized blackhead, a normalized acne, a normalized skin tone, a normalized sensitivity, a normalized age, and a normalized tonicity.
In an example, a first set of data comprising the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data is obtained; a second set of data comprising the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized  acne data is obtained; a third set of data comprising the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data is obtained; a fourth set of data comprising the normalized age data, the normalized tonicity data and the normalized wrinkle data is obtained.
Said normalized reflection data can be obtained by a known algorithm corresponding to the normalized reflection based on captured images. Said normalized hue data can be obtained by a known algorithm corresponding to the normalized hue based on captured images. Said wrinkle data can be obtained by a known algorithm corresponding to the wrinkle based on captured images. Said spot data can be obtained by a known algorithm corresponding to the spot based on captured images. Said dark circle data can be obtained by a known algorithm corresponding to the dark circle based on captured images. Said pore data can be obtained by a known algorithm corresponding to the pore based on captured images. Said blackhead data can be obtained by a known algorithm corresponding to the blackhead based on captured images. Said acne data can be obtained by a known algorithm corresponding to the acne based on captured images. Said normalized skin tone data can be obtained by a known algorithm corresponding to the normalized skin tone based on captured images. Said sensitivity data can be obtained by a known algorithm corresponding to the sensitivity based on captured images. Said normalized age data can be obtained by a known algorithm corresponding to the normalized age based on captured images. Said tonicity data can be obtained by a known algorithm corresponding to the tonicity based on captured images. Then, these data are further normalized to generate normalized data.
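The disclosure does not specify how the per-parameter scores are normalized. As one plausible sketch, min-max scaling over a set of raw scores maps each value into [0, 1]; the function name `min_max_normalize` and the reference values are assumptions for illustration only:

```python
def min_max_normalize(raw_scores):
    """Scale a sequence of raw parameter scores into the range [0, 1].

    This is only one possible normalization; the actual method used by
    the normalization unit 102 is not stated in the disclosure.
    """
    lo, hi = min(raw_scores), max(raw_scores)
    if hi == lo:
        # All scores identical: no spread to normalize over.
        return [0.0 for _ in raw_scores]
    return [(s - lo) / (hi - lo) for s in raw_scores]

# Example: hypothetical raw wrinkle scores from three images
# become normalized wrinkle data in [0, 1].
normalized_wrinkle = min_max_normalize([12.0, 30.0, 21.0])  # [0.0, 1.0, 0.5]
```

Normalizing every parameter to a common range is what makes the weighted admixtures of equations 1-4 comparable across parameters.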
In an embodiment, the skin prediction unit 103 comprises computational circuitry configured to predict at least one of: skin translucency data based on the first set of data, skin texture data based on the second set of data, skin tone data based on the third set of data, and skin tonicity data based on the fourth set of data.
In a preferred embodiment, the skin prediction unit 103 comprises computational circuitry configured to predict the skin translucency data by weighting the first set of data. More particularly, the skin translucency data is determined by the following equation:
P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle  (equation 1)
wherein P_skin_translucency represents the skin translucency data;
P_n_reflection represents the normalized reflection data;
P_n_hue represents the normalized hue data;
P_n_wrinkle represents the normalized wrinkle data;
P_n_spot represents the normalized spot data;
P_n_dark_circle represents the normalized dark circle data; and
W1, W2, W3, W4 and W5 are predefined weights, wherein W1 + W2 + W3 + W4 + W5 = 1.
The weights W 1, W 2, W 3, W 4 and W 5 are calculated in advance based on experience and research data. The weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server. In an example, W 1 is 28%-45%; W 2 is 28%-45%; W 3 is 5%-20%; W 4 is 2%-15%; and W 5 is 2%-15%. However, according to the disclosure, W 1, W 2, W 3, W 4 and W 5 are not limited to such value ranges.
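Equation 1 can be sketched in Python as follows, using illustrative weights chosen from within the example ranges above; they sum to 1 but are not the actual predefined weights, and the input values are hypothetical:

```python
# Illustrative weights: W1, W2 in 28%-45%; W3 in 5%-20%; W4, W5 in 2%-15%.
W1, W2, W3, W4, W5 = 0.35, 0.35, 0.12, 0.09, 0.09  # sum to 1

def predict_skin_translucency(n_reflection, n_hue, n_wrinkle, n_spot, n_dark_circle):
    """Equation 1: weighted admixture of five normalized parameters."""
    return (W1 * n_reflection + W2 * n_hue + W3 * n_wrinkle
            + W4 * n_spot + W5 * n_dark_circle)

# Hypothetical normalized inputs:
# 0.35*0.8 + 0.35*0.7 + 0.12*0.3 + 0.09*0.2 + 0.09*0.4 = 0.615
p_translucency = predict_skin_translucency(0.8, 0.7, 0.3, 0.2, 0.4)
```

Since the weights sum to 1, inputs of all 1.0 yield a prediction of exactly 1.0.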
In a preferred embodiment, the skin prediction unit 103 comprises computational circuitry configured to also predict the skin texture data by weighting the second set of data. More particularly, the skin prediction unit 103 predicts the skin texture data by the following equation:
P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne  (equation 2)
wherein P_skin_texture represents the skin texture data;
P_n_pore represents the normalized pore data;
P_n_wrinkle represents the normalized wrinkle data;
P_n_blackhead represents the normalized blackhead data;
P_n_acne represents the normalized acne data; and
W6, W7, W8 and W9 are predefined weights, wherein W6 + W7 + W8 + W9 = 1.
The weights W 6, W 7, W 8 and W 9 are calculated in advance based on experience and research data. The weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server. In an example, W 6 is 15%-50%; W 7 is 15%-50%; W 8 is 12%-35%; and W 9 is 12%-30%. However, according to the disclosure, W 6, W 7, W 8 and W 9 are not limited to such value ranges.
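A corresponding sketch of equation 2, again with illustrative weights taken from within the example ranges above (not the actual predefined values) and hypothetical inputs:

```python
# Illustrative weights: W6, W7 in 15%-50%; W8 in 12%-35%; W9 in 12%-30%.
W6, W7, W8, W9 = 0.40, 0.30, 0.15, 0.15  # sum to 1

def predict_skin_texture(n_pore, n_wrinkle, n_blackhead, n_acne):
    """Equation 2: weighted admixture of four normalized parameters."""
    return W6 * n_pore + W7 * n_wrinkle + W8 * n_blackhead + W9 * n_acne

# 0.40*0.5 + 0.30*0.4 + 0.15*0.6 + 0.15*0.2 = 0.44
p_texture = predict_skin_texture(0.5, 0.4, 0.6, 0.2)
```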
In a preferred embodiment, the skin prediction unit 103 comprises computational circuitry configured to also predict the skin tone data by weighting the third set of data. More particularly, the skin prediction unit 103 determines the skin tone data by the following equation:
P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne  (equation 3)
wherein P_skin_tone represents evenness data of the skin tone;
P_n_skin_tone represents the normalized skin color data;
P_n_spot represents the normalized spot data;
P_n_blackhead represents the normalized blackhead data;
P_n_dark_circle represents the normalized dark circle data;
P_n_sensitivity represents the normalized sensitivity data;
P_n_acne represents the normalized acne data; and
W10, W11, W12, W13, W14 and W15 are predefined weights, wherein W10 + W11 + W12 + W13 + W14 + W15 = 1.
The weights W 10, W 11, W 12, W 13, W 14 and W 15 are calculated in advance based on experience and research data. The weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server. In an example, W 10 is 22%-50%; W 11 is 8%-35%; W 12 is 3%-20%; W 13 is 3%-20%; W 14 is 3%-20%; and W 15 is 3%-20%. However, according to the disclosure, W 10, W 11, W 12, W 13, W 14 and W 15 are not limited to such value ranges.
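Equation 3 can be sketched in the same way; the weights below fall within the example ranges above and sum to 1, but they and the inputs are illustrative assumptions:

```python
# Illustrative weights: W10 in 22%-50%; W11 in 8%-35%; W12-W15 in 3%-20%.
W10, W11, W12, W13, W14, W15 = 0.40, 0.20, 0.10, 0.10, 0.10, 0.10  # sum to 1

def predict_skin_tone(n_skin_tone, n_spot, n_blackhead,
                      n_dark_circle, n_sensitivity, n_acne):
    """Equation 3: weighted admixture of six normalized parameters,
    predicting the evenness of the skin tone."""
    return (W10 * n_skin_tone + W11 * n_spot + W12 * n_blackhead
            + W13 * n_dark_circle + W14 * n_sensitivity + W15 * n_acne)

# 0.40*0.9 + 0.20*0.3 + 0.10*0.2 + 0.10*0.4 + 0.10*0.1 + 0.10*0.5 = 0.54
p_tone = predict_skin_tone(0.9, 0.3, 0.2, 0.4, 0.1, 0.5)
```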
In a preferred embodiment, the skin prediction unit 103 comprises computational circuitry configured to also predict the skin tonicity data by weighting the fourth set of data. More particularly, the skin prediction unit 103 predicts the skin tonicity data by the following equation:
P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle  (equation 4)
wherein P_skin_tonicity represents tightness data of human skin;
P_n_age represents the normalized age data;
P_n_tonicity represents the normalized skin sagging data on the facial contour;
P_n_wrinkle represents the normalized wrinkle data; and
W16, W17 and W18 are predefined weights, wherein W16 + W17 + W18 = 1.
The weights W 16, W 17 and W 18 are calculated in advance based on experience and research data. The weights can be adjusted according to the user’s perception of the skin dimension and the needs of the research results, and the adjusted weights of each dimension are updated in real time on the application server. In an example, W 16 is 25%-65%; W 17 is 20%-50%; and W 18 is 20%-50%. However, according to the disclosure, W 16, W 17 and W 18 are not limited to such value ranges.
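Finally, equation 4 can be sketched as follows, again with illustrative weights from within the example ranges above (not the actual predefined values) and hypothetical inputs:

```python
# Illustrative weights: W16 in 25%-65%; W17, W18 in 20%-50%.
W16, W17, W18 = 0.40, 0.30, 0.30  # sum to 1

def predict_skin_tonicity(n_age, n_tonicity, n_wrinkle):
    """Equation 4: weighted admixture of three normalized parameters,
    predicting the tightness of human skin."""
    return W16 * n_age + W17 * n_tonicity + W18 * n_wrinkle

# 0.40*0.7 + 0.30*0.5 + 0.30*0.6 = 0.61
p_tonicity = predict_skin_tonicity(0.7, 0.5, 0.6)
```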
In an embodiment, the computing device 100 also comprises a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the predicted skin condition data. In an example, the predicted skin condition data, such as the skin translucency, skin texture, skin tone and skin tonicity, can be indicated in a numerical value or in a visual form on the graphical user interface.
In an embodiment, the computing device 100 also comprises a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the extracted significant object or appearance data and the normalized skin characteristic data, such as normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data. In an example, these data can be indicated in a numerical value or in a visual form on the graphical user interface.
In an embodiment, the computing device 100 also comprises the significant object or appearance unit including computational circuitry configured to obtain data indicative of a presence, absence, or severity of a skin condition from the one or more digital images of the region of skin.
Computing device 100 can be, for example, a server of a service provider, a device associated with a client (e.g., a client device), a system on a chip, and/or any other suitable computing device or computing system. In various implementations, computing device 100 can take a variety of different configurations. For example, computing device 100 can be implemented as a computer-like device, including a personal computer, desktop computer, multi-screen computer, laptop computer, netbook, and the like. Computing device 100 can also be implemented as a mobile device, including mobile phones, portable music players, portable gaming devices, tablet computers, multi-screen computers, and the like. Computing device 100 can also be implemented as a television-like device, that is, a device having or connected to a generally larger screen in a casual viewing environment. These devices include televisions, set-top boxes, game consoles, and the like.
In an embodiment, computational circuitry includes, among other things, one or more computing devices such as a processor (e.g., a microprocessor, a quantum processor, qubit processor, etc. ) , a central processing unit (CPU) , a digital signal processor (DSP) , an application-specific integrated circuit (ASIC) , a field programmable gate array (FPGA) , and the like, or any combinations thereof, and can include discrete digital or analog circuit elements or electronics, or combinations thereof. In an embodiment, computational circuitry includes one or more ASICs having a plurality of predefined logic components. In an embodiment, computational circuitry includes one or more FPGAs, each having a plurality of programmable logic components.
In an embodiment, computation circuitry includes one or more electric circuits, printed circuits, flexible circuits, electrical conductors, electrodes, cavity resonators, conducting traces, ceramic patterned electrodes, electro-mechanical components, transducers, and the like.
In an embodiment, computational circuitry includes one or more components operably coupled (e.g., communicatively, electromagnetically, magnetically, ultrasonically, optically, inductively, electrically, capacitively coupled, wirelessly coupled, and the like) to each other. In an embodiment, circuitry includes one or more remotely located components. In an embodiment, remotely located components are operably coupled, for example, via wireless communication. In an embodiment, remotely located components are operably coupled, for example, via one or more communication modules, receivers, transmitters, transceivers, and the like.
In an embodiment, computational circuitry includes memory that, for example, stores instructions or information. Non-limiting examples of memory include volatile memory (e.g., Random Access Memory (RAM), Dynamic Random Access Memory (DRAM), and the like), non-volatile memory (e.g., Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM), and the like), persistent memory, and the like. Further non-limiting examples of memory include Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like. In an embodiment, memory is coupled to, for example, one or more computing devices by one or more instructions, information, or power buses. In an embodiment, computational circuitry includes one or more databases stored in memory. In an embodiment, computational circuitry includes one or more look-up tables stored in memory.
In an embodiment, computational circuitry includes one or more computer-readable media drives, interface sockets, Universal Serial Bus (USB) ports, memory card slots, and the like, and one or more input/output components such as, for example, a graphical user interface, a display, a keyboard, a keypad, a trackball, a joystick, a touch-screen, a mouse, a switch, a dial, and the like, and any other peripheral device. In an embodiment, computational circuitry includes one or more user input/output components that are operably coupled to at least one computing device configured to control (electrical, electromechanical, software-implemented, firmware-implemented, or other control, or combinations thereof) at least one parameter associated with, for example, determining one or more tissue thermal properties responsive to detected shifts in turn-on voltage.
In an embodiment, computational circuitry includes electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein) , electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc. ) ) , electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc. ) , and/or any non-electrical analog thereto, such as optical or other analogs.
The computing device 100 in accordance with a first aspect of the present disclosure can provide accurate and complete skin conditions measurements.
FIG. 2 illustrates a flowchart of a method 200 of predicting skin conditions of a  human subject in accordance with a second aspect of the present disclosure.
With the method 200, the above and other potential deficiencies in the conventional approaches can be overcome. The method 200 starts at block 201. At block 201, significant object or appearance data are extracted from one or more digital images of a region of skin of a human subject by a significant object or appearance unit. The significant object or appearance unit can be a spectrometer, a mobile device, a portable device or the like, capable of emitting five spectra to achieve five-spectrum imaging. Such a significant object or appearance unit can emit five light sources that penetrate from the epidermis to the dermis to perform five-spectrum imaging so that underlying skin problems are found. More particularly, the significant object or appearance unit captures images of the user's skin by scanning or photographing the user's skin. As an alternative, images of the user's skin can be saved beforehand. Further, the user's skin can be any skin on the user's body including, but not limited to, the face, neck, hands, feet and so on.
In an example, significant object or appearance data such as acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. Note that as mentioned above, here skin tone data means skin color data and tonicity data means the skin sagging data on the facial contour.
At block 202, normalized skin characteristic data is generated based on the significant object or appearance data extracted from the one or more digital images of the region of skin. In an example, the normalized skin characteristic data includes normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
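The disclosure does not prescribe a particular normalization scheme; as one hedged illustration (the function name and the 0-100 raw scale are assumptions, not part of the disclosure), min-max scaling maps each raw characteristic score into the range [0, 1]:

```python
def normalize(value, min_value, max_value):
    """Min-max scale a raw characteristic score into [0, 1].

    min_value and max_value are the expected bounds of the raw score,
    e.g. the range produced by the underlying detection algorithm.
    """
    if max_value <= min_value:
        raise ValueError("max_value must exceed min_value")
    # Clamp so that out-of-range raw scores still map into [0, 1].
    clamped = max(min_value, min(value, max_value))
    return (clamped - min_value) / (max_value - min_value)

# Example: a raw wrinkle score of 35 on an assumed 0-100 scale.
normalized_wrinkle = normalize(35, 0, 100)
```

Any monotone scaling that places the twelve characteristics on a common range would serve the same purpose in the admixtures below.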
At block 203, a skin condition data is predicted based on an admix of the normalized skin characteristic data. In an example, the skin condition data includes skin sensitivity, skin texture, skin tone, skin tonicity, skin translucency, or the like. Note that as mentioned above, here the predicted skin tone means predicted evenness of skin tone and the predicted skin tonicity means predicted tightness of human skin.
As mentioned above, in the existing prior art, factors such as pores, wrinkles, skin tone, spots and blackheads are used to measure the skin conditions of a user. However, such measurements of skin conditions have proven to lack accuracy and completeness. In the disclosure, specific combinations of a plurality of skin condition factors such as reflection, hue, wrinkle, spot, dark circle, pore, blackhead, acne, skin tone, sensitivity, age and tonicity are considered. For this, in an example, acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data are extracted from the one or more digital images of the region of skin.
Note that the order of extracting the reflection data, the hue data, the wrinkle data, the spot data, the dark circle data, the pore data, the blackhead data, the acne data, the skin tone data, the sensitivity data, the age data, and the tonicity data can be changed, or these extractions can be performed simultaneously.
Then, normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data is generated.
Each of said reflection data, hue data, wrinkle data, spot data, dark circle data, pore data, blackhead data, acne data, skin tone data, sensitivity data, age data and tonicity data can be obtained by a known algorithm corresponding to that characteristic based on the captured images. Then, these data are normalized to generate normalized data.
Further, in an example, a first set of data, a second set of data, a third set of data and a fourth set of data can be defined. The first set of data comprises the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data; the second set of data comprises the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized acne data; the third set of data comprises the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data; and the fourth set of data comprises the normalized age data, the normalized tonicity data and the normalized wrinkle data.
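The four sets described above can be restated directly in code form; the dictionary keys below are illustrative labels only, not terms defined by the disclosure:

```python
# The four sets of normalized characteristic data described above.
DATA_SETS = {
    "first": ["reflection", "hue", "wrinkle", "spot", "dark_circle"],
    "second": ["pore", "wrinkle", "blackhead", "acne"],
    "third": ["skin_tone", "spot", "acne", "blackhead", "dark_circle", "sensitivity"],
    "fourth": ["age", "tonicity", "wrinkle"],
}

# The wrinkle data appears in three of the four sets, so it only needs
# to be extracted and normalized once and can then be shared.
wrinkle_uses = sum("wrinkle" in members for members in DATA_SETS.values())
```

This overlap is why shared parameters need only be computed once across the methods.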
According to method 200, by predicting at least one of skin translucency data, skin texture data, skin tone data and skin tonicity data, the skin conditions of a user can be detected completely and accurately.
Details regarding how to determine at least one of a skin translucency data, a skin texture data, a skin tone data and a skin tonicity data will be described below.
FIG. 3 illustrates a flowchart of a method 300 of determining a skin translucency data in accordance with a first embodiment of the second aspect of the present disclosure.
The method 300 starts at block 301. Block 301 is similar to block 201 in FIG. 2. At block 301, significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. In an example, the significant object or appearance data includes reflection data, hue data, wrinkle data, spot data and dark circle data.
At block 302, normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data and normalized dark circle data are generated based on the extracted reflection data, hue data, wrinkle data, spot data and dark circle data. More particularly, the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data can be obtained according to corresponding algorithms for reflection, hue, wrinkle, spot and dark circle based on captured images of the user's skin. Blocks 301, 302 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data, which are then normalized. As an alternative, captured images are transmitted to the cloud server; then, by corresponding algorithms stored in the cloud server, captured images are processed to obtain the reflection data, the hue data, the wrinkle data, the spot data and the dark circle data, which are then normalized; finally, the normalized reflection data, the normalized hue data, the normalized wrinkle data, the normalized spot data and the normalized dark circle data are transmitted back to the user's device.
At block 303, the skin translucency data is predicted by the above-mentioned equation (1):
P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle
wherein P_skin_translucency represents the skin translucency data;
P_n_reflection represents the normalized reflection data;
P_n_hue represents the normalized hue data;
P_n_wrinkle represents the normalized wrinkle data;
P_n_spot represents the normalized spot data;
P_n_dark_circle represents the normalized dark circle data; and
W1, W2, W3, W4 and W5 are predefined weights and W1 + W2 + W3 + W4 + W5 = 1.
In an example, W1 is 28%-45%; W2 is 28%-45%; W3 is 5%-20%; W4 is 2%-15%; and W5 is 2%-15%. However, according to the disclosure, W1, W2, W3, W4 and W5 are not limited to such value ranges.
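Equation (1) is a direct weighted sum, so it can be sketched in a few lines. The weights below are illustrative values chosen from the example ranges above, and the function name is hypothetical rather than part of the disclosure:

```python
# Illustrative weights within the example ranges above; they must sum to 1.
W1, W2, W3, W4, W5 = 0.35, 0.35, 0.10, 0.10, 0.10
assert abs(W1 + W2 + W3 + W4 + W5 - 1.0) < 1e-9

def predict_skin_translucency(n_reflection, n_hue, n_wrinkle, n_spot, n_dark_circle):
    """Equation (1): weighted admixture of five normalized characteristics."""
    return (W1 * n_reflection + W2 * n_hue + W3 * n_wrinkle
            + W4 * n_spot + W5 * n_dark_circle)

# With all inputs normalized into [0, 1], the prediction stays in [0, 1].
score = predict_skin_translucency(0.8, 0.6, 0.4, 0.5, 0.7)
```

Because the weights sum to 1, the prediction remains on the same scale as the normalized inputs.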
By the method 300, skin translucency data of the user can be determined accurately and completely.
FIG. 4 illustrates a flowchart of a method 400 of predicting a skin texture data in accordance with a second embodiment of the second aspect of the present disclosure.
The method 400 starts at block 401. At block 401, significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. In an example, the significant object or appearance data includes pore data, wrinkle data, blackhead data and acne data. At block 402, normalized pore data, normalized wrinkle data, normalized blackhead data and normalized acne data are generated. More particularly, the pore data, the wrinkle data, the blackhead data and the acne data can be obtained according to corresponding algorithms for pore, wrinkle, blackhead and acne based on captured images of the user's skin. Blocks 401, 402 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the pore data, the wrinkle data, the blackhead data and the acne data, which are then normalized. As an alternative, captured images are transmitted to the cloud server; then, by corresponding algorithms stored in the cloud server, captured images are processed to obtain the pore data, the wrinkle data, the blackhead data and the acne data, which are then normalized; finally, the normalized pore data, the normalized wrinkle data, the normalized blackhead data and the normalized acne data are transmitted back to the user's device.
At block 403, the skin texture data is predicted by the above-mentioned equation (2):
P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne
wherein P_skin_texture represents the skin texture data;
P_n_pore represents the normalized pore data;
P_n_wrinkle represents the normalized wrinkle data;
P_n_blackhead represents the normalized blackhead data;
P_n_acne represents the normalized acne data; and
W6, W7, W8 and W9 are predefined weights and W6 + W7 + W8 + W9 = 1.
In an example, W6 is 15%-50%; W7 is 15%-50%; W8 is 12%-35%; and W9 is 12%-30%. However, according to the disclosure, W6, W7, W8 and W9 are not limited to such value ranges.
By the method 400, skin texture data of the user can be determined accurately and completely.
FIG. 5 illustrates a flowchart of a method 500 of determining a skin tone data in accordance with a third embodiment of the second aspect of the present disclosure. Note that as mentioned above, here skin tone data means evenness data of skin tone.
The method 500 starts at block 501. Block 501 is similar to block 201 in FIG. 2. At block 501, significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. In an example, the significant object or appearance data includes skin tone data, spot data, acne data, blackhead data, dark circle data and sensitivity data.
At block 502, normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data and normalized sensitivity data are generated based on the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data. Note that as mentioned above, here normalized skin tone data means normalized skin color data. More particularly, the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data can be obtained according to corresponding algorithms for skin tone, spot, acne, blackhead, dark circle and sensitivity based on captured images of the user's skin. Blocks 501, 502 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data, which are then normalized. As an alternative, captured images are transmitted to the cloud server; then, by corresponding algorithms stored in the cloud server, captured images are processed to obtain the skin tone data, the spot data, the acne data, the blackhead data, the dark circle data and the sensitivity data, which are then normalized; finally, the normalized skin tone data, the normalized spot data, the normalized acne data, the normalized blackhead data, the normalized dark circle data and the normalized sensitivity data are transmitted back to the user's device.
At block 503, the skin tone data is predicted by the above-mentioned equation (3):
P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne
wherein P_skin_tone represents evenness data of the skin tone;
P_n_skin_tone represents the normalized skin color data;
P_n_spot represents the normalized spot data;
P_n_blackhead represents the normalized blackhead data;
P_n_dark_circle represents the normalized dark circle data;
P_n_sensitivity represents the normalized sensitivity data;
P_n_acne represents the normalized acne data; and
W10, W11, W12, W13, W14 and W15 are predefined weights and W10 + W11 + W12 + W13 + W14 + W15 = 1.
In an example, W10 is 22%-50%; W11 is 8%-35%; W12 is 3%-20%; W13 is 3%-20%; W14 is 3%-20%; and W15 is 3%-20%. However, according to the disclosure, W10, W11, W12, W13, W14 and W15 are not limited to such value ranges.
By the method 500, skin tone data can be determined accurately and completely.
FIG. 6 illustrates a flowchart of a method 600 of predicting a skin tonicity data in accordance with a fourth embodiment of the second aspect of the present disclosure. Note that as mentioned above, here the skin tonicity data means tightness data of human skin.
The method 600 starts at block 601. Block 601 is similar to block 201 in FIG. 2. At block 601, significant object or appearance data is extracted from the one or more digital images of the region of skin by the significant object or appearance unit 101. In an example, the significant object or appearance data includes age data, tonicity data and wrinkle data. Note that as mentioned above, here the tonicity data means the skin sagging data on the facial contour.
At block 602, normalized age data, normalized tonicity data and normalized wrinkle data are generated based on the extracted age data, tonicity data and wrinkle data. More particularly, the age data, the tonicity data and the wrinkle data can be obtained according to corresponding algorithms for age, tonicity and wrinkle based on captured images of the user's skin. Blocks 601, 602 can be performed locally or in a cloud server. More particularly, by corresponding algorithms stored in a local device, captured images are processed to obtain the age data, the tonicity data and the wrinkle data, which are then normalized. As an alternative, captured images are transmitted to the cloud server; then, by corresponding algorithms stored in the cloud server, captured images are processed to obtain the age data, the tonicity data and the wrinkle data, which are then normalized; finally, the normalized age data, the normalized tonicity data and the normalized wrinkle data are transmitted back to the user's device.
At block 603, the skin tonicity is predicted by the above-mentioned equation (4):
P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle
wherein P_skin_tonicity represents tightness data of human skin;
P_n_age represents the normalized age data;
P_n_tonicity represents the normalized skin sagging data on the facial contour;
P_n_wrinkle represents the normalized wrinkle data; and
W16, W17 and W18 are predefined weights and W16 + W17 + W18 = 1.
In an example, W16 is 25%-65%; W17 is 20%-50%; and W18 is 20%-50%. However, according to the disclosure, W16, W17 and W18 are not limited to such value ranges.
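Equations (2), (3) and (4) share the same weighted-admixture form as equation (1), so a single helper can serve all of them. The sketch below applies it to equation (4); the helper name and the concrete weight values are illustrative assumptions chosen from the example ranges above:

```python
def weighted_admixture(values, weights):
    """Generic weighted admixture shared by equations (1)-(4).

    values and weights are parallel sequences; the weights are
    validated to sum to 1 as required by each equation.
    """
    if len(values) != len(weights):
        raise ValueError("values and weights must align")
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(v * w for v, w in zip(values, weights))

# Equation (4): skin tonicity from normalized age, tonicity and wrinkle data.
W16, W17, W18 = 0.40, 0.30, 0.30  # within the example ranges above
tonicity = weighted_admixture([0.5, 0.6, 0.7], [W16, W17, W18])
```

Passing the second or third set of normalized data with the corresponding weights yields equations (2) and (3) in the same way.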
By the method 600, skin tonicity can be determined accurately and completely.
In an embodiment, each of method 200, method 300, method 400, method 500 and method 600 also comprises displaying on a graphical user interface one or more instances of the predicted skin condition data. In an example, the predicted skin condition data such as skin translucency, skin texture, skin tone and skin tonicity can be indicated in a numerical value or a visual form on the graphical user interface.
In an embodiment, each of method 200, method 300, method 400, method 500 and method 600 also comprises displaying on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data such as normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data, or the predicted skin condition data. In an example, these data can be indicated in a numerical value or a visual form on the graphical user interface.
Note that the same parameters used in methods 200, 300, 400, 500 and 600 can be shared. For example, the wrinkle data and the normalized wrinkle data in method 300 can be used in methods 400 and 600; the spot data and the normalized spot data in method 300 can be used in method 500; the dark circle data and the normalized dark circle data in method 300 can be used in method 500; the acne data and the normalized acne data in method 400 can be used in method 500; and the blackhead data and the normalized blackhead data in method 400 can be used in method 500.
According to another aspect of the disclosure, the user can obtain four specific parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity, which can best measure the condition of a user's skin in an accurate and complete way. These 4T parameters can then be displayed on the user's device in a more accessible and understandable descriptive form. Finally, based on the four determined parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity, a cosmetic solution is recommended to the user in order to solve the user's specific skin problems.
According to yet another aspect of the disclosure, in order to improve the speed and accuracy of detection of the four specific parameters, i.e. skin texture, skin tone, skin translucency and skin tonicity, a machine-learning algorithm is introduced to establish the 4T model.
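The disclosure does not specify the machine-learning algorithm, so the following is only a sketch of one plausible approach under stated assumptions: fitting the weights of a 4T equation from graded examples by least squares over normalized characteristic data. The variable names and the synthetic data are illustrative, not taken from the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data for equation (4): rows of normalized
# (age, tonicity, wrinkle) characteristics, graded with tonicity scores
# generated from known weights so the fit can be checked.
true_w = np.array([0.4, 0.3, 0.3])
X = rng.random((200, 3))
y = X @ true_w

# Fit weights by unconstrained least squares; a production model would
# additionally constrain the weights to be non-negative and sum to 1.
w_fit, *_ = np.linalg.lstsq(X, y, rcond=None)
```

On noiseless data the recovered weights match the generating weights; with real expert gradings the fit would instead trade off noise against the constraint that the weights sum to 1.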
An embodiment of the disclosure may be an article of manufacture in which a non-transitory machine-readable medium (such as microelectronic memory) has stored thereon instructions (e.g., computer code) which program one or more data processing  components (generically referred to here as a “processor” ) to perform the operations described above. In other embodiments, some of these operations might be performed by specific hardware components that contain hardwired logic (e.g., dedicated digital filter blocks and state machines) . Those operations might alternatively be performed by any combination of programmed data processing components and fixed hardwired circuit components.
While the embodiments have been illustrated and described herein, it will be understood by those skilled in the art that various changes and modifications may be made, and equivalents may be substituted for elements thereof without departing from the true scope of the present technology. In addition, many modifications may be made to adapt to a particular situation and the teaching herein without departing from its central scope. Therefore it is intended that the present embodiments not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out the present technology, but that the present embodiments include all embodiments falling within the scope of the appended claims.

Claims (36)

  1. A computing device for detecting skin conditions of a human subject, the computing device comprising:
    a significant object or appearance unit including computational circuitry configured to extract significant object or appearance data from one or more digital images of a region of skin of the human subject;
    a normalization unit including computational circuitry configured to generate normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and
    a skin prediction unit including computational circuitry configured to predict a skin condition data based on an admix of the normalized skin characteristic data.
  2. The computing device for detecting skin conditions of the human subject of claim 1, further comprising:
    a skin condition display including computational circuitry configured to display on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data.
  3. The computing device for detecting skin conditions of the human subject of one of claims 1-2, wherein the significant object or appearance unit includes computational circuitry configured to extract acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data from the one or more digital images of the region of skin.
  4. The computing device for detecting skin conditions of the human subject of one of claims 1-3 wherein the normalization unit includes computational circuitry configured to generate normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
  5. The computing device for detecting skin conditions of the human subject of one of claims 1-4, wherein the significant object or appearance unit includes computational circuitry configured to obtain data indicative of a presence, absence, or severity of a skin  condition from the one or more digital images of the region of skin.
  6. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data.
  7. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin translucency data based on a weighted admixture of P_n_reflection, P_n_hue, P_n_wrinkle, P_n_spot, and P_n_dark_circle,
    wherein P_skin_translucency = W1×P_n_reflection + W2×P_n_hue + W3×P_n_wrinkle + W4×P_n_spot + W5×P_n_dark_circle,
    wherein
    P_skin_translucency represents the skin translucency data,
    P_n_reflection represents the normalized reflection data,
    P_n_hue represents the normalized hue data,
    P_n_wrinkle represents the normalized wrinkle data,
    P_n_spot represents the normalized spot data,
    P_n_dark_circle represents the normalized dark circle data;
    wherein W1, W2, W3, W4 and W5 are predefined weights; and
    wherein W1 + W2 + W3 + W4 + W5 = 1.
  8. The computing device for detecting skin conditions of the human subject of claim 7, wherein W1 is 28%-45%; W2 is 28%-45%; W3 is 5%-20%; W4 is 2%-15%; and W5 is 2%-15%.
  9. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data.
  10. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin texture data based on a weighted admixture of P_n_pore, P_n_wrinkle, P_n_blackhead, and P_n_acne;
    wherein P_skin_texture = W6×P_n_pore + W7×P_n_wrinkle + W8×P_n_blackhead + W9×P_n_acne;
    wherein
    P_skin_texture represents the skin texture data,
    P_n_pore represents the normalized pore data,
    P_n_wrinkle represents the normalized wrinkle data,
    P_n_blackhead represents the normalized blackhead data,
    P_n_acne represents the normalized acne data,
    wherein W6, W7, W8, and W9 are predefined weights; and
    wherein W6 + W7 + W8 + W9 = 1.
  11. The computing device for detecting skin conditions of the human subject of claim 10, wherein W6 is 15%-50%; W7 is 15%-50%; W8 is 12%-35%; and W9 is 12%-30%.
  12. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data.
  13. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin tone data based on a weighted admixture of P_n_skin_tone, P_n_spot, P_n_blackhead, P_n_dark_circle, P_n_sensitivity, and P_n_acne;
    wherein P_skin_tone = W10×P_n_skin_tone + W11×P_n_spot + W12×P_n_blackhead + W13×P_n_dark_circle + W14×P_n_sensitivity + W15×P_n_acne,
    wherein P_skin_tone represents evenness data of the skin tone;
    P_n_skin_tone represents the normalized skin color data;
    P_n_spot represents the normalized spot data;
    P_n_blackhead represents the normalized blackhead data;
    P_n_dark_circle represents the normalized dark circle data;
    P_n_sensitivity represents the normalized sensitivity data;
    P_n_acne represents the normalized acne data;
    wherein W10, W11, W12, W13, W14 and W15 are predefined weights; and
    wherein W10 + W11 + W12 + W13 + W14 + W15 = 1.
  14. The computing device for detecting skin conditions of the human subject of claim 13, wherein W10 is 22%-50%; W11 is 8%-35%; W12 is 3%-20%; W13 is 3%-20%; W14 is 3%-20%; and W15 is 3%-20%.
  15. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
  16. The computing device for detecting skin conditions of the human subject of claim 4, wherein the skin prediction unit includes computational circuitry configured to predict skin tonicity data based on a weighted admixture of P_n_age, P_n_tonicity, and P_n_wrinkle;
    wherein P_skin_tonicity = W16×P_n_age + W17×P_n_tonicity + W18×P_n_wrinkle,
    wherein P_skin_tonicity represents tightness data of human skin;
    P_n_age represents the normalized age data;
    P_n_tonicity represents the normalized skin sagging data on the facial contour;
    P_n_wrinkle represents the normalized wrinkle data;
    wherein W16, W17 and W18 are predefined weights; and
    wherein W16 + W17 + W18 = 1.
  17. The computing device for detecting skin conditions of the human subject of claim 16, wherein W16 is 25%-65%; W17 is 20%-50%; and W18 is 20%-50%.
  18. A method for detecting skin conditions of a human subject, the method comprising:
    extracting significant object or appearance data from one or more digital images of a region of skin of the human subject;
    generating normalized skin characteristic data based on the significant object or appearance data extracted from the one or more digital images of the region of skin; and
    predicting skin condition data based on an admixture of the normalized skin characteristic data.
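The three steps of claim 18 (extract, normalize, predict) can be sketched as a generic pipeline. The `extract` and `normalize` callables stand in for implementations the claim leaves unspecified; the weighted admixture in the final step mirrors the formulas of the dependent claims.

```python
def detect_skin_condition(images, extract, normalize, weights):
    """Illustrative pipeline for claim 18.

    extract: maps a characteristic name to a function that pulls raw
             object/appearance data from the digital images (hypothetical).
    normalize: maps a raw value to a normalized value (hypothetical).
    weights: predefined weights for the admixture; must sum to 1.
    """
    raw = {name: fn(images) for name, fn in extract.items()}
    norm = {name: normalize(value) for name, value in raw.items()}
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[name] * norm[name] for name in weights)
```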
  19. The method for detecting skin conditions of the human subject of claim 18, further comprising:
    displaying on a graphical user interface one or more instances of the extracted significant object or appearance data, the normalized skin characteristic data or the predicted skin condition data.
  20. The method for detecting skin conditions of the human subject of one of claims 18-19, also comprising:
    extracting acne data, blackhead data, dark circle data, pore data, skin sensitivity data, spot data, wrinkle data, tonicity data, age data, hue data, reflection data, or skin tone data from the one or more digital images of the region of skin.
  21. The method for detecting skin conditions of the human subject of one of claims 18-20, also comprising:
    generating normalized acne data, normalized blackhead data, normalized dark circle data, normalized pore data, normalized skin sensitivity data, normalized spot data, normalized wrinkle data, normalized tonicity data, normalized age data, normalized hue data, normalized reflection data, or normalized skin tone data.
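Claim 21 does not specify how the normalized characteristic data are generated; a common assumption is min-max scaling into [0, 1], sketched below for illustration.

```python
def min_max_normalize(values):
    """Min-max normalize raw characteristic scores to the [0, 1] range.

    This scheme is an assumption for illustration; the claims leave the
    normalization method open. A constant input list maps to all zeros
    to avoid division by zero.
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]
```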
  22. The method for detecting skin conditions of the human subject of one of claims 18-21, also comprising:
    obtaining data indicative of a presence, absence, or severity of a skin condition from the one or more digital images of the region of skin.
  23. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin translucency data based on a weighted admixture of normalized reflection data, normalized hue data, normalized wrinkle data, normalized spot data, and normalized dark circle data.
  24. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin translucency data based on a weighted admixture of P n_reflection, P n_hue, P n_wrinkle, P n_spot, and P n_dark circle,
    wherein P skin translucency = W 1×P n_reflection+W 2×P n_hue+W 3×P n_wrinkle+W 4×P n_spot+W 5×P n_dark circle;
    wherein
    P skin translucency represents the skin translucency data,
    P n_reflection represents the normalized reflection data,
    P n_hue represents the normalized hue data,
    P n_wrinkle represents the normalized wrinkle data,
    P n_spot represents the normalized spot data,
    P n_dark circle represents the normalized dark circle data;
    wherein W 1, W 2, W 3, W 4 and W 5 are predefined weights; and
    wherein W 1+W 2+W 3+W 4+W 5 = 1.
  25. The method for detecting skin conditions of the human subject of claim 24, wherein W 1 is 28%-45%; W 2 is 28%-45%; W 3 is 5%-20%; W 4 is 2%-15%; and W 5 is 2%-15%.
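The translucency formula of claims 24-25 can be sketched the same way. The weights below are hypothetical values inside the claimed bands (W 1 and W 2: 28%-45%; W 3: 5%-20%; W 4 and W 5: 2%-15%) and sum to 1.

```python
# Hypothetical example weights within the bands of claim 25; must sum to 1.
TRANSLUCENCY_WEIGHTS = {
    "reflection": 0.40,   # W1
    "hue": 0.35,          # W2
    "wrinkle": 0.10,      # W3
    "spot": 0.08,         # W4
    "dark_circle": 0.07,  # W5
}

def skin_translucency(normalized):
    """P_skin_translucency per claim 24: a weighted sum of the
    normalized reflection, hue, wrinkle, spot, and dark circle data."""
    assert abs(sum(TRANSLUCENCY_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(w * normalized[k] for k, w in TRANSLUCENCY_WEIGHTS.items())
```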
  26. The method for detecting skin conditions of the human subject of claim 21, also comprising: predicting skin texture data based on a weighted admixture of normalized pore data, normalized wrinkle data, normalized blackhead data, and normalized acne data.
  27. The method for detecting skin conditions of the human subject of claim 21, also comprising: predicting skin texture data based on a weighted admixture of P n_pore, P n_wrinkle, P n_blackhead, and P n_acne;
    wherein P skin texture=W 6×P n_pore+W 7×P n_wrinkle+W 8×P n_blackhead+W 9×P n_acne;
    wherein
    P skin texture represents the skin texture data,
    P n_pore represents the normalized pore data,
    P n_wrinkle represents the normalized wrinkle data,
    P n_blackhead represents the normalized blackhead data,
    P n_acne represents the normalized acne data,
    wherein W 6, W 7, W 8, and W 9 are predefined weights; and
    wherein W 6+W 7+W 8+W 9= 1.
  28. The method for detecting skin conditions of the human subject of claim 27, wherein W 6 is 15%-50%; W 7 is 15%-50%; W 8 is 12%-35%; and W 9 is 12%-30%.
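The texture prediction of claims 27-28 follows the same four-term pattern. The default weights are hypothetical choices inside the claimed bands (W 6 and W 7: 15%-50%; W 8: 12%-35%; W 9: 12%-30%) summing to 1.

```python
def skin_texture(n_pore, n_wrinkle, n_blackhead, n_acne,
                 w6=0.35, w7=0.30, w8=0.20, w9=0.15):
    """P_skin_texture per claim 27: a weighted sum of the normalized
    pore, wrinkle, blackhead, and acne data. Default weights are
    illustrative, not prescribed by the claims."""
    assert abs(w6 + w7 + w8 + w9 - 1.0) < 1e-9
    return w6 * n_pore + w7 * n_wrinkle + w8 * n_blackhead + w9 * n_acne
```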
  29. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin tone data based on a weighted admixture of normalized skin tone data, normalized spot data, normalized acne data, normalized blackhead data, normalized dark circle data, and normalized sensitivity data.
  30. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin tone data based on a weighted admixture of P n_skin tone, P n_spot, P n_blackhead, P n_dark circle, P n_sensitivity, and P n_acne;
    P skin tone=W 10×P n_skin tone+W 11×P n_spot+W 12×P n_blackhead+W 13×P n_dark circle+W 14×P n_sensitivity+W 15×P n_acne,
    wherein P skin tone represents evenness data of the skin tone;
    P n_skin tone represents the normalized skin color data;
    P n_spot represents the normalized spot data;
    P n_blackhead represents the normalized blackhead data;
    P n_dark circle represents the normalized dark circle data;
    P n_sensitivity represents the normalized sensitivity data;
    P n_acne represents the normalized acne data; and
    wherein W 10, W 11, W 12, W 13, W 14 and W 15 are predefined weights and
    wherein W 10+W 11+W 12+W 13+W 14+W 15=1.
  31. The method for detecting skin conditions of the human subject of claim 30, wherein W 10 is 22%-50%; W 11 is 8%-35%; W 12 is 3%-20%; W 13 is 3%-20%; W 14 is 3%-20%; and W 15 is 3%-20%.
  32. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin tonicity data based on a weighted admixture of normalized age data, normalized tonicity data, and normalized wrinkle data.
  33. The method for detecting skin conditions of the human subject of claim 21, also comprising:
    predicting skin tonicity data based on a weighted admixture of P n_age, P n_tonicity, and P n_wrinkle;
    P skin tonicity=W 16×P n_age+W 17×P n_tonicity+W 18×P n_wrinkle,
    wherein P skin tonicity represents tightness data of human skin;
    P n_age represents the normalized age data;
    P n_tonicity represents the normalized skin sagging data on the facial contour;
    P n_wrinkle represents the normalized wrinkle data; and
    wherein W 16, W 17 and W 18 are predefined weights; and
    wherein W 16+W 17+W 18=1.
  34. The method for detecting skin conditions of the human subject of claim 33, wherein W 16 is 25%-65%; W 17 is 20%-50%; and W 18 is 20%-50%.
  35. A computer-readable medium having stored thereon instructions that, when executed, cause a computing device to perform the method according to one of claims 18-34.
  36. An apparatus for detecting conditions of a user's skin, the apparatus comprising means for performing the method according to one of claims 18-34.
PCT/CN2021/089954 2021-04-26 2021-04-26 Computing device, method and apparatus for detecting skin conditions of human subject WO2022226728A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202180097549.5A CN117241722A (en) 2021-04-26 2021-04-26 Computing device, method and apparatus for detecting skin condition of human subject
PCT/CN2021/089954 WO2022226728A1 (en) 2021-04-26 2021-04-26 Computing device, method and apparatus for detecting skin conditions of human subject
FR2106235A FR3122076B1 (en) 2021-04-26 2021-06-14 COMPUTER device, method and apparatus for DETECTING skin conditions of a human subject

Publications (1)

Publication Number Publication Date
WO2022226728A1 true WO2022226728A1 (en) 2022-11-03

Family

ID=83723830

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/089954 WO2022226728A1 (en) 2021-04-26 2021-04-26 Computing device, method and apparatus for detecting skin conditions of human subject

Country Status (3)

Country Link
CN (1) CN117241722A (en)
FR (1) FR3122076B1 (en)
WO (1) WO2022226728A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120288168A1 (en) * 2011-05-09 2012-11-15 Telibrahma Convergent Communications Pvt. Ltd. System and a method for enhancing appeareance of a face
US20160135730A1 (en) * 2013-06-28 2016-05-19 Panasonic Intellectual Property Corporation Of America Skin function evaluation device and skin evaluation method
US20170270350A1 (en) * 2016-03-21 2017-09-21 Xerox Corporation Method and system for assessing facial skin health from a mobile selfie image
CN111814520A (en) * 2019-04-12 2020-10-23 虹软科技股份有限公司 Skin type detection method, skin type grade classification method, and skin type detection device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2897768B1 (en) * 2006-02-24 2008-05-09 Oreal METHOD FOR EVALUATING THE SCRATCH OF THE DYE.
FR2983328B1 (en) * 2011-11-29 2022-07-29 Developpement Industrialisation Et Promotion De Tech Avancees METHOD FOR MANUFACTURING AND/OR SELECTING COSMETIC PRODUCTS SUITABLE FOR A PARTICULAR CUSTOMER AND COMPUTER PROGRAM IMPLEMENTING SUCH METHOD
US20150099947A1 (en) * 2013-10-04 2015-04-09 Access Business Group International Llc Skin youthfulness index, methods and applications thereof

Also Published As

Publication number Publication date
FR3122076A1 (en) 2022-10-28
FR3122076B1 (en) 2024-02-02
CN117241722A (en) 2023-12-15

Similar Documents

Publication Publication Date Title
EP3000386B1 (en) Skin function evaluation device and skin evaluation method
KR101800102B1 (en) Skin diagnostic and image processing methods
US20150099947A1 (en) Skin youthfulness index, methods and applications thereof
CN114502061A (en) Image-based automatic skin diagnosis using deep learning
KR102485256B1 (en) Customized Skin diagnostic and Managing System
KR20150141989A (en) Skin diagnostic and image processing systems, apparatus and articles
JP6473401B2 (en) Skin gloss evaluation apparatus, gloss evaluation method, and gloss evaluation program
JP2023052849A (en) Apparatus and method for determining cosmetic skin attributes
Chang et al. Automatic facial skin defect detection system
TWI452998B (en) System and method for establishing and analyzing skin parameters using digital image multi-area analysis
KR101949152B1 (en) Method and Appartus for Skin Condition Diagnosis and System for Providing Makeup Information suitable Skin Condition Using the Same
WO2022226728A1 (en) Computing device, method and apparatus for detecting skin conditions of human subject
KR20180110842A (en) Customized semi-permanent make-up recommendation system based on virtual experience and its service method
KR102239575B1 (en) Apparatus and Method for skin condition diagnosis
US20230144089A1 (en) Smart system for skin testing and customised formulation and manufacturing of cosmetics
WO2021120152A1 (en) Computing device, method and apparatus for noninvasively diagnosing a hair
KR20200121692A (en) Method and apparatus for estimating age of skin
WO2023184221A1 (en) Computing device, method and apparatus for predicting acne propoerties for keratin material of human subject
JP2022078936A (en) Skin image analysis method
KR20200110512A (en) Device for skin measurement of led mask
KR20210069495A (en) Electronic apparatus and controlling method thereof
CN113160224B (en) Artificial intelligence-based skin aging degree identification method, system and device
TWI555507B (en) Electrocardiograph (ecg)-based identity identifying system
WO2024021000A1 (en) System and method for evaluating dynamic wrinkles on keratin material of a user to be tested
WO2023217626A1 (en) Detection and visualization of cutaneous signs using a heat map

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21938219

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE