CN117379005A - Skin detection control method, device, equipment and storage medium of beauty instrument - Google Patents
- Publication number: CN117379005A (application CN202311564404.9A)
- Authority
- CN
- China
- Prior art keywords
- parameter
- skin
- image information
- parameters
- treatment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B5/442—Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/443—Evaluating skin constituents, e.g. elastin, melanin, water
- A61B5/444—Evaluating skin marks, e.g. mole, nevi, tumour, scar
- A61B5/445—Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data involving training the classification device
- A61F7/00—Heating or cooling appliances for medical or therapeutic treatment of the human body
- A61N5/0616—Radiation therapy using light; skin treatment other than tanning
- A61N7/00—Ultrasound therapy
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/084—Backpropagation, e.g. using gradient descent
- G06T7/0012—Biomedical image inspection
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
- G06V10/764—Image or video recognition using pattern recognition or machine learning using classification, e.g. of video objects
- G06V10/82—Image or video recognition using pattern recognition or machine learning using neural networks
- A61F2007/0003—Face
- A61N2005/0626—Monitoring, verifying, controlling systems and methods
- A61N2007/0034—Skin treatment
- G06T2207/20024—Filtering details
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30088—Skin; Dermal
Abstract
The invention relates to the technical field of beauty instruments, and in particular to a skin detection control method, device, equipment and storage medium for a beauty instrument. The method comprises: acquiring first image information of a user's face; inputting the first image information into a trained model and converting it into a first parameter set, the first parameter set comprising a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter; inputting the first parameter set into a preset database for analysis and matching to obtain second parameters for skin treatment; comparing the second parameters with normal facial parameters to determine target treatment data for the skin, the target treatment data comprising electric signal adjustment data; and controlling the beauty instrument based on the electric signal adjustment data. Facial treatment thus becomes more targeted, and the treatment mode is adapted to the user's particular skin state.
Description
Technical Field
The present application relates to the technical field of beauty treatment apparatuses, and in particular, to a method, an apparatus, a device, and a storage medium for controlling skin detection of a beauty treatment apparatus.
Background
A beauty instrument is a machine that adjusts and improves the body and face according to the physiological functions of the human body. By working principle, beauty instruments can be divided into laser beauty instruments, radio-frequency beauty instruments and the like, and a beauty instrument is usually provided with multiple working gears to suit the needs of different users. During use, a user selects a working gear according to the actual condition of the skin in order to achieve the desired effect.
Patent document CN115845251A discloses a power control device for a cosmetic instrument. The cosmetic instrument is a radio-frequency cosmetic instrument comprising a radio-frequency module with a radio-frequency electrode, and the control device includes: a temperature determining module for determining the current temperature of the user's facial skin; a distance measuring module for obtaining the distance between the user's face and the radio-frequency electrode; and a processing module for determining a transmit power compensation value and a transmit voltage compensation value corresponding to the distance based on the distance and a preset first mapping relation, and adjusting the transmit power and transmit voltage of the radio-frequency cosmetic instrument based on the determined compensation values and the current temperature. That patent can thus determine the transmit power and voltage compensation values from the distance between the face and the radio-frequency electrode. However, it cannot adapt the working mode to different skin states of the user, even though the absorption of radio-frequency signals and the resulting care effects differ across skin areas and skin states. With only one or a few radio-frequency signals at fixed frequencies, the treatment mode cannot be adapted to different skin states; moreover, when the skin does not need treatment, a user who does not know the skin condition can easily over-treat it.
Accordingly, the prior art has drawbacks and needs improvement.
Disclosure of Invention
In order to solve one or more problems in the prior art, a main object of the present application is to provide a skin detection control method, apparatus, device and storage medium for a cosmetic instrument.
In order to achieve the above object, the present application proposes a skin detection control method of a cosmetic instrument, the method comprising:
acquiring first image information of a user face;
inputting the first image information into a trained model, and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter;
inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data;
and controlling the beauty instrument based on the electric signal adjustment data.
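The five steps above can be read as a single control routine. The sketch below is illustrative only, not the patented implementation: the callables `predict` and `match` stand in for the trained model and the preset database, and the parameter names are assumptions, since the patent discloses no code.

```python
def detect_and_control(image, predict, match, normal_params):
    """Hypothetical sketch of steps S2-S5 (S1 supplies `image`)."""
    first_params = predict(image)        # S2: model converts the image into parameters
    second_params = match(first_params)  # S3: database analysis and matching
    # S4: compare with normal facial parameters to derive adjustment data
    adjustment = {k: round(v - normal_params.get(k, 0.0), 3)
                  for k, v in second_params.items()}
    return adjustment                    # S5: would drive the instrument's electric signal
```

In use, `adjustment` would be translated by the device firmware into concrete drive-signal settings.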
Further, the inputting the first image information into the trained model, converting the first image information into a first parameter set, includes:
Preprocessing the first image information, removing noise of the image, adjusting brightness and contrast of the image, extracting a skin area through color information and texture features, and clustering pixels similar to skin hues into a skin area;
inputting the preprocessed image into a trained deep convolutional neural network, extracting image features through filters, and pooling and classifying the features to obtain the first parameter set;
the feature extraction formula is: R = I(x, y) - a·L(x, y), where I(x, y) is the original pixel value of the preprocessed image, a is a weight coefficient, and L(x, y) is the pixel value output by the filter.
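The formula R = I(x, y) - a·L(x, y) amounts to subtracting a weighted filtered image from the original, which emphasizes local detail. The pure-Python sketch below assumes L is a 3x3 mean filter; the patent does not fix the filter type, so that choice is illustrative.

```python
def mean_filter(img, x, y):
    """L(x, y): average of the 3x3 neighbourhood of (x, y), clamped at borders."""
    h, w = len(img), len(img[0])
    vals = [img[min(max(x + dx, 0), h - 1)][min(max(y + dy, 0), w - 1)]
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    return sum(vals) / len(vals)

def response(img, a):
    """R(x, y) = I(x, y) - a * L(x, y) over the whole image."""
    return [[img[x][y] - a * mean_filter(img, x, y)
             for y in range((len(img[0])))] for x in range(len(img))]
```

On a uniform image the response is zero everywhere, as expected for a detail-emphasizing operator with a = 1.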
Further, the beauty instrument comprises a macro shooting part and a light supplementing part, wherein the light supplementing part is used for supplementing light to the macro shooting part;
the acquiring the first image information of the face of the user comprises:
acquiring brightness of the environment and reflection characteristics on the face of a user;
judging whether the brightness of the environment meets the brightness compensation condition;
when the brightness of the environment meets the brightness compensation condition, acquiring a brightness target compensation quantity;
controlling the light supplementing part based on the brightness target compensation amount;
judging whether the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition or not;
When the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition, acquiring a radiation intensity target compensation quantity;
and controlling the macro shooting part to acquire image information of the face of the user based on the radiation intensity target compensation amount.
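The brightness-compensation branch above can be illustrated with a simple proportional rule. The target level and the proportional mapping below are assumptions made for illustration; the patent only states that a target compensation amount is obtained when the compensation condition is met.

```python
def fill_light_duty(ambient_lux, target_lux=300.0, max_duty=1.0):
    """Illustrative brightness compensation: if ambient brightness is below the
    target, raise the fill-light duty cycle in proportion to the shortfall."""
    if ambient_lux >= target_lux:  # brightness compensation condition not met
        return 0.0
    return min(max_duty, (target_lux - ambient_lux) / target_lux)
```

The same shape of rule could be applied to the radiation-intensity compensation for the reflection characteristic before the macro shooting part captures the image.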
Further, the step of inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment includes:
inputting the first parameter set into a preset database;
searching and matching the first parameter set through a database;
judging whether any one parameter of the first parameter set exceeds a preset parameter threshold range;
when at least one parameter in the first parameter set exceeds its preset parameter threshold range, matching all out-of-range parameters against the database to obtain second parameters for skin treatment, wherein the second parameters comprise skin state parameters;
when no parameter in the first parameter set exceeds its preset parameter threshold range, the beauty instrument exits the beauty mode and switches to a routine skin care mode, wherein the routine care mode comprises a soothing massage mode and a moisturizing mode.
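The threshold test and mode switch described above might look as follows. The threshold ranges here are invented for illustration; the patent does not disclose concrete values.

```python
# Illustrative threshold ranges (low, high) per parameter; not from the patent.
THRESHOLDS = {"tone": (40, 60), "wrinkle": (0, 5), "moisture": (50, 80)}

def select_mode(first_params, thresholds=THRESHOLDS):
    """Return ("treatment", out_of_range_params) if any parameter exceeds its
    threshold range, else ("routine_care", {}) for massage/moisturizing."""
    abnormal = {k: v for k, v in first_params.items()
                if not thresholds[k][0] <= v <= thresholds[k][1]}
    if not abnormal:
        return "routine_care", {}
    return "treatment", abnormal
```

Only the out-of-range parameters are forwarded to the database for matching, mirroring the claim.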
Further, the target treatment data further includes backup electric signal adjustment data, and after controlling the cosmetic instrument based on the electric signal adjustment data, the method further includes:
After a preset working time, acquiring second image information of the face of the user during treatment;
comparing and analyzing the second image information with the first image information;
judging whether the second image information meets the condition for invoking the backup electric signal adjustment data;
when the change difference of the second image information relative to the first image information is smaller than a preset standard value, determining that the second image information meets the condition for invoking the backup electric signal adjustment data;
and controlling the beauty instrument based on the backup electric signal adjustment data.
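The decision to fall back to the backup electric signal adjustment data can be sketched as comparing a change measure against the preset standard value. The metric used here (mean absolute pixel difference) is an assumption, since the patent does not define the change difference precisely.

```python
def choose_adjustment(first_img, second_img, primary, backup, min_change=5.0):
    """If the mean pixel change after the preset working time is below the
    standard value, switch to the backup adjustment data; else keep primary."""
    diffs = [abs(a - b) for row_a, row_b in zip(first_img, second_img)
             for a, b in zip(row_a, row_b)]
    change = sum(diffs) / len(diffs)
    return backup if change < min_change else primary
```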
Further, after the cosmetic treatment is completed, the method further comprises:
acquiring third image information of the face of the user after treatment is completed;
inputting the third image information into a trained model, and converting the third image information into a third parameter set;
judging whether any parameter of the third parameter set exceeds a preset parameter threshold range;
if no parameter of the third parameter set exceeds the preset parameter threshold range, inputting the current target treatment data into the preset database for storage;
comparing and analyzing the third image information with the first image information and the second image information to obtain a change rule and change time;
Generating an analysis report based on the change rule and the change time;
and sending the analysis report and change effect information to a cloud server, and forwarding them to a third-party platform through the cloud server.
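The change rule and change time extracted from the successive detections can be summarized as per-parameter deltas and rates, as in this illustrative sketch (the field names are not from the patent):

```python
def change_report(params_t0, params_t2, minutes_elapsed):
    """Summarize per-parameter change and rate of change between the first
    and third detections; a report generator would render this as text."""
    return {k: {"delta": round(params_t2[k] - params_t0[k], 3),
                "per_minute": round((params_t2[k] - params_t0[k]) / minutes_elapsed, 4)}
            for k in params_t0}
```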
Further, the method further comprises: judging whether historical detection parameters are stored for the user;
if historical detection parameters are stored for the user, inputting the historical skin parameters together with the currently detected parameters into the preset database for analysis and matching.
The embodiment of the application also provides a skin detection control device of a beauty instrument, comprising:
the acquisition module is used for acquiring first image information of the face of the user;
the conversion module is used for inputting the first image information into a trained model and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter;
the analysis module is used for inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
the determining module is used for comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data;
And the control module is used for controlling the beauty instrument based on the electric signal adjustment data.
The present application also provides a computer device comprising a memory storing a computer program and a processor which, when executing the computer program, implements the steps of any of the methods described above.
The present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the above.
With the skin detection control method, device, equipment and storage medium of the beauty instrument described above, the first image information of the user is acquired so that the user's facial state can be judged from it. The acquired first image information is input into a trained model for processing and analysis and, through recognition and conversion, is converted into a first parameter set; turning the image into parameters allows skin problems to be analyzed more intuitively. The first parameter set is input into a pre-trained database for matching, and the database returns corresponding second treatment parameters for the different parameters. The second parameters are compared with normal facial parameters to judge which facial parameters of the user need adjustment, and corresponding target treatment data are generated. The beauty instrument is then controlled to perform the corresponding treatment according to the electric signal adjustment data in the target treatment data. Facial treatment thus becomes more targeted, and even a user who does not know how to use the beauty instrument can fully benefit from it through treatment based on the detected facial condition.
Drawings
Fig. 1 is a flow chart of a skin detection control method of a cosmetic instrument according to an embodiment of the present application;
fig. 2 is a flow chart of a skin detection control method of a cosmetic instrument according to an embodiment of the present application;
fig. 3 is a schematic block diagram of a skin detection control device of a cosmetic instrument according to an embodiment of the present application;
fig. 4 is a block diagram schematically illustrating a structure of a computer device according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
Referring to fig. 1, in an embodiment of the present application, there is provided a skin detection control method of a cosmetic apparatus, the method including:
s1, acquiring first image information of a user face;
s2, inputting the first image information into a trained model, and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter;
S3, inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
s4, comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data;
s5, controlling the beauty instrument based on the electric signal adjustment data.
As described in step S1, by acquiring the first image information of the user, the user's facial state can be judged from the first image information; for example, a facial photograph may be acquired by a macro camera device. The beauty instrument may include a treatment structure, which may be provided with an ultrasonic ice-head assembly, a spectrum assembly or a water-replenishing heating assembly, a camera detection assembly and a data processing assembly.
As described in step S2, the terminal inputs the obtained first image information into the trained model for processing and analysis, and converts it into the first parameter set, which includes a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter. The skin layer tone parameter reflects the overall tone of the skin, such as brightness or darkness and uniformity or non-uniformity; the skin layer wrinkle parameter reflects the number and depth of fine lines and wrinkles; the skin layer pore density parameter reflects the surface smoothness and feel of the skin, such as roughness or enlarged pores; the skin layer moisture parameter evaluates the moisture content and moisture retention of the skin, such as dryness or desquamation; the skin layer grease parameter reflects the oiliness of the skin; and the skin layer foreign matter parameter reflects blemishes on the skin, such as acne, closed comedones, inflammation or scabs. For example, the normal range of the skin wrinkle parameter may be set to levels 5-7; when the detected value falls outside this range, the wrinkle state is judged abnormal, and a corresponding treatment is generated for every parameter that exceeds its standard range according to the detected skin appearance.
As described in step S3, the first parameter set is input into a pre-trained database for matching. The database is continuously trained with historical treatments so that it gives corresponding treatment parameters for different input parameters; after the first parameter set is input, matching second parameters for treatment can therefore be obtained quickly. There may be two or more matched groups of second parameters: the closest group is first used to control the beauty instrument, the change of the user's face is continuously detected during treatment, and when the effect does not meet expectations, another group of second parameters can be substituted to control the beauty instrument.
As described in the above steps S4-S5, the second parameters are compared with normal facial parameters to determine the parameters that need to be adjusted on the user's face, and corresponding target treatment data are generated; the beauty instrument is then controlled to perform the corresponding treatment according to the electric signal adjustment data of the target treatment data. By identifying the specific condition of the user's face and making a corresponding treatment scheme, the facial treatment becomes more targeted, and the effect of the beauty instrument can be fully exerted through the treatment made after detecting the user's facial condition, even when the user does not know how to use the instrument. Depending on current user needs, the generated parameter flow of the beauty instrument may comprise one or more skin management flows, i.e. one or more different skin care effects, and the achieved skin care results are correspondingly targeted; for example, a plurality of solutions to different problems, i.e. skin management flows, are generated according to the skin data and historical records, and the targeted solutions may include cleaning, moisturizing, lifting and tightening, and may also exist at different intensities.
According to the method, the first image information of the user is obtained so that the facial state of the user can be judged from it. The obtained first image information is input into a trained model for processing and analysis and is converted, through identification and conversion, into the first parameter set; converting the image into parameters allows the problems existing in the skin to be analyzed more intuitively. The first parameter set is input into a pre-trained database for matching, and the database gives corresponding second treatment parameters for different parameters; the second parameters are then compared with normal facial parameters to judge which parameters of the user's face need to be adjusted, corresponding target treatment data are generated, and the beauty instrument is controlled to perform the corresponding treatment according to the electric signal adjustment data of the target treatment data. The facial treatment thus becomes more targeted, and the effect of the beauty instrument can be fully exerted through the treatment made after the user's facial condition is detected, even when the user does not know how to use the instrument.
In a possible embodiment, with the large-scale growth of the internet of things, skin detection control over the internet of things plays an increasingly important role in beauty instruments. The internet of things refers to collecting, in real time, any object or process that needs to be monitored, connected or interacted with, through various devices and technologies such as information sensors, radio frequency identification, global positioning systems, infrared sensors and laser scanners; gathering the required information such as sound, light, heat, electricity, mechanics, chemistry, biology and position; and, through every possible network access, realizing ubiquitous connection between objects and between objects and people, as well as intelligent sensing, identification and management of objects and processes. The terminal electric control end of the beauty instrument can interact with a cloud platform, and the cloud platform then connects to a third-party platform (i.e. a mobile phone end), so that treatment data or results can be obtained and operations can be shared and synchronized.
In one embodiment, the inputting the first image information into the trained model, converting the first image information into a first set of parameters, includes:
preprocessing the first image information, removing noise of the image, adjusting brightness and contrast of the image, extracting a skin area through color information and texture features, and clustering pixels similar to skin hues into a skin area;
inputting the preprocessed image into a trained deep convolutional neural network, extracting the characteristics of the image through a filter, and pooling the classified image to obtain a first parameter set;
the formula of the feature extraction is as follows: R = I(x, y) − a·L(x, y), where I(x, y) is the original pixel value of the preprocessed image, a is a weight coefficient, and L(x, y) is the pixel value output by the filter.
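As a hedged illustration, the feature-extraction formula R = I(x, y) − a·L(x, y) can be computed as below with NumPy; the 3×3 mean filter standing in for the unspecified filter L and the weight a = 0.8 are assumptions of this sketch:

```python
import numpy as np

def extract_features(image: np.ndarray, a: float = 0.8) -> np.ndarray:
    """R = I(x, y) - a * L(x, y): subtract a weighted filter output from the
    original pixels, leaving high-frequency skin detail (fine lines, pores).
    A 3x3 mean filter with edge padding stands in for the filter L."""
    padded = np.pad(image.astype(float), 1, mode="edge")
    # L(x, y): mean of the 3x3 neighbourhood around every pixel
    L = sum(
        padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    return image - a * L
```

On a perfectly flat region the result is simply (1 − a) times the original value, i.e. smooth areas are suppressed while edges and texture are preserved.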
As described above, preprocessing the first image information optimizes all of it so that the trained model can process it more conveniently. During preprocessing, noise of the image is removed; a noise point is not essentially spatial, i.e. a point is not a noise point merely because it is abrupt relative to the surrounding points. Rather, if its error over time is large, the point can be called a noise point, i.e. the nature of noise is in the time domain. When calculating a certain area of an image, the signal-to-noise ratio of the area is sometimes calculated using a flat image of that area; for example, the individual points of a flat area can be considered as a set of center points over continuous time. The brightness of the image is then adjusted, and the model identifies the image under a certain brightness standard so as to ensure the accuracy of identification. Clustering pixels with similar skin hues into the skin area improves the speed of model identification.
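The skin-area clustering step might, purely for illustration, be approximated with a classic rule-based RGB skin heuristic; the thresholds below are textbook example values, not the embodiment's actual clustering method:

```python
import numpy as np

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    """Group pixels whose hue resembles skin into one region.
    A simple RGB rule-based heuristic stands in for the clustering
    of similar skin hues described in the preprocessing step."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # Illustrative thresholds: reddish, sufficiently bright, R dominant
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (abs(r - g) > 15)
```

The resulting boolean mask restricts later feature extraction to the skin region, which is what speeds up model identification.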
The preprocessed image is input into a trained deep convolutional neural network. In the specific training of the convolutional neural network, the network to be trained is trained using a loss function value: back propagation is carried out according to the final loss function value, and the network parameters of the network to be trained, including the learning rate and the weight matrix, are updated. The training process comprises multiple iterations; the larger the difference between the final loss function values obtained in two adjacent iterations, the faster the network parameters are updated. Whether the number of back propagations exceeds a propagation count threshold is judged, and if so, training is stopped and the trained network is obtained. Features of the image are extracted through a filter, i.e. a convolution layer, which is the core component of the convolutional neural network; the features in the image are extracted and classified through the filter, the classified features are output and flattened through a pooling layer to obtain feature vector parameters, and the feature vectors are combined to obtain the first parameter set.
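The stopping rule and the loss-difference-scaled update described above can be sketched as follows; a linear least-squares model stands in for the deep network, and the learning rate, scale cap and propagation threshold are illustrative constants only:

```python
import numpy as np

def train(x, y, w, base_lr=0.05, max_propagations=200):
    """Sketch of the training rule above: back-propagate each iteration,
    scale the update by the loss change between adjacent iterations
    (larger change -> faster parameter update), and stop once the
    back-propagation count reaches the propagation-count threshold."""
    prev_loss = None
    for _ in range(max_propagations):           # propagation-count threshold
        pred = x @ w
        loss = float(np.mean((pred - y) ** 2))
        grad = 2 * x.T @ (pred - y) / len(y)    # back-propagation (MSE gradient)
        # larger loss difference between adjacent iterations -> larger step
        scale = 1.0 if prev_loss is None else min(2.0, 1.0 + abs(prev_loss - loss))
        w = w - base_lr * scale * grad          # network parameter update
        prev_loss = loss
    return w
```

With a well-conditioned problem the loop converges long before the propagation threshold is hit, which is exactly when the stopping rule ends training.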
Referring to fig. 2, in an embodiment, the beauty instrument includes a macro shooting part and a light supplementing part for supplementing light to the macro shooting part;
the acquiring the first image information of the face of the user comprises:
s11, acquiring brightness of the environment and reflection characteristics on the face of the user;
s12, judging whether the brightness of the environment meets the brightness compensation condition;
s13, when the brightness of the environment meets the brightness compensation condition, acquiring a brightness target compensation quantity;
s14, controlling the light supplementing part based on the brightness target compensation quantity;
s15, judging whether the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition;
s16, when the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition, acquiring a radiation intensity target compensation quantity;
and S17, controlling the macro shooting part to acquire image information of the face of the user based on the radiation intensity target compensation amount.
As described in the above steps S11-S14, the beauty instrument includes a macro shooting part and a light supplementing part. The skin of the user's face can be photographed through the macro shooting part; since a detailed image of the user's face is needed, macro shooting captures the feature information more clearly. Because macro shooting requires good lighting, the light supplementing part is provided to supply a suitable light source to the macro shooting part. The brightness of the environment and the reflection feature of the user's face are obtained, and whether the light source is sufficient is judged from the ambient brightness; when the ambient brightness is weak, a brightness target compensation amount is obtained, the corresponding brightness is compensated according to the different ambient brightness, and the light supplementing part emits light at the target brightness to supplement the light source for the macro shooting part.
As described in the above steps S15-S17, the reflection characteristic is the degree of reflection of the face, which is affected by skin color and the oily layer of the skin. When the degree of reflected radiation is too high, the photographed image will be over-exposed; when it is too low, the photographed image will show shadows. The reflection characteristic is therefore detected, whether its light radiation intensity meets the radiation intensity compensation condition is judged, and when the condition is met, the macro shooting part is controlled to photograph the user's face based on the compensation amount. In yet another embodiment, when the light radiation intensity is detected to be too high, the brightness compensation of the light supplementing part is reduced and the exposure of the macro shooting part is turned down, so that the macro shooting part can acquire clear image information.
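Steps S11-S17 can be summarized as the following control sketch; the brightness threshold, radiation threshold and the linear compensation amounts are hypothetical placeholders, since the embodiment does not specify them:

```python
def control_capture(ambient_brightness: float, reflection_intensity: float,
                    brightness_threshold: float = 300.0,
                    radiation_threshold: float = 0.7):
    """Sketch of steps S11-S17: derive a fill-light compensation from the
    ambient brightness, then an exposure compensation from the facial
    reflection intensity (all thresholds are illustrative placeholders)."""
    fill_light = 0.0
    if ambient_brightness < brightness_threshold:            # S12-S13
        # brightness target compensation amount: make up the deficit
        fill_light = brightness_threshold - ambient_brightness
    exposure_comp = 0.0
    if reflection_intensity > radiation_threshold:           # S15-S16
        # strong reflection -> reduce exposure to avoid blown highlights
        exposure_comp = -(reflection_intensity - radiation_threshold)
    return fill_light, exposure_comp
```

The two returned amounts would drive the light supplementing part (S14) and the macro shooting part (S17) respectively.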
In an embodiment, the inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment includes:
inputting the first parameter set into a preset database;
searching and matching the first parameter set through a database;
judging whether any one parameter of the first parameter set exceeds a preset parameter threshold range;
When at least one parameter of the first parameter set exceeds the preset parameter threshold range, matching all parameters exceeding the preset parameter range with the database to obtain a second parameter of skin treatment, wherein the second parameter comprises a skin state parameter;
when any one parameter of the first parameter set exceeds the preset parameter threshold range abnormally, the beauty instrument exits the beauty mode and switches to a conventional skin care mode, wherein the conventional care mode comprises a soothing massage mode and a moisturizing mode.
As described above, the first parameter set is input into a preset database and is analyzed and matched through the database. The database can give different skin state parameters, namely second parameters, for different parameters according to the historical treatment schemes. When the database judges the first parameter set, by judging which parameters exceed the parameter threshold range, each exceeding parameter can be quickly matched to an abnormal skin parameter, so that the second parameters are given. When no abnormal value appears in the first parameter set, the first parameter set is matched with the skin treatment schemes in the database according to a preset algorithm, so as to obtain the second parameters most suitable for the skin condition of the user. The second parameters include skin state parameters and treatment scheme parameters. When an abnormal value exists in the first parameter set or no proper skin treatment scheme exists, the beauty instrument exits the beauty mode, switches to the conventional skin care mode and provides basic skin care services. According to the specific skin condition of each user, the most suitable treatment scheme is provided through database matching and algorithm calculation, realizing personalized customization. When the data input by the user contains abnormal values, the beauty instrument can find this in time and exit the beauty mode, avoiding adverse effects on the user's skin; the user only needs to input basic personal information and skin condition to be automatically matched with a suitable treatment scheme, and can thus conveniently and quickly enjoy professional beauty services.
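One possible reading of this matching logic, sketched with hypothetical parameter names, thresholds and database entries (none of which come from the embodiment):

```python
def match_second_parameters(first_params: dict, thresholds: dict, database: dict):
    """Sketch of the matching step above: parameters outside their threshold
    range are treated as abnormal and looked up in the treatment database;
    if an abnormal parameter has no proper treatment scheme, the instrument
    exits the beauty mode and falls back to the conventional care mode."""
    abnormal = [k for k, v in first_params.items()
                if not (thresholds[k][0] <= v <= thresholds[k][1])]
    if any(k not in database for k in abnormal):
        # abnormal value with no proper scheme -> conventional skin care
        return {"mode": "routine_care",
                "programs": ["soothing_massage", "moisturizing"]}
    # second parameters: one matched treatment entry per abnormal parameter
    return {"mode": "treatment",
            "second_parameters": {k: database[k] for k in abnormal}}
```

A normal first parameter set simply yields an empty set of second parameters, which the preset algorithm would then refine into the best overall scheme.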
In an embodiment, after the controlling the cosmetic instrument based on the result of the target treatment data, further comprising:
after a preset working time, acquiring second image information of the face of the user during treatment;
comparing and analyzing the second image information with the first image information;
judging whether the second image information meets the condition of calling the standby electric signal adjustment data or not;
when the variation difference value of the second image information is smaller than a preset standard value, judging that the second image information meets the condition of calling the standby electric signal adjustment data;
and controlling the beauty instrument based on the standby electric signal adjustment data.
After the beauty instrument has worked for a certain time, the second image information of the user's face during treatment is acquired again, the first image information and the second image information are compared and analyzed, and whether the change trend of the second image information reaches the preset change is judged. If it does not, the second parameters need to be adjusted: the standby electric signal adjustment data are used to control the beauty instrument and change the treatment scheme. In this way the beauty instrument can adjust the treatment data, the treatment effect on the user's face is continuously detected while the instrument operates, and the treatment effect is improved by changing the parameters.
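Reduced to scalar scores for illustration, the decision to call the standby electric signal adjustment data might look like the following sketch (the score reduction and standard value are assumptions):

```python
def needs_standby_adjustment(first_score: float, second_score: float,
                             preset_standard: float) -> bool:
    """Sketch of the re-check above: when the change between the first and
    second image information (reduced here to scalar scores) is smaller
    than the preset standard value, the condition for calling the standby
    electric-signal adjustment data is met."""
    return abs(second_score - first_score) < preset_standard
```

A True result switches the instrument to the standby data; a False result means the treatment is progressing as expected and the current scheme is kept.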
In one embodiment, after the cosmetic treatment is completed, the method further comprises:
acquiring third image information of the face of the user after treatment is completed;
inputting the third image information into a trained model, and converting the third image information into a third parameter;
judging whether any one of the third parameters exceeds a preset parameter threshold range;
if any one of the third parameters does not exceed the preset parameter threshold range, inputting the target treatment data into the preset database for storage;
comparing and analyzing the third image information with the first image information and the second image information to obtain a change rule and change time;
generating an analysis report based on the change rule and the change time;
and sending the change effect information to a cloud server, and sending the change effect information to a third-party platform through the cloud server.
After the treatment of the beauty instrument is finished, the third image information of the user's face is obtained. Through the third image information, the image information before treatment and the photographs during treatment can be compared to obtain the change rule and change time, and whether the treatment effect has been achieved is judged. Specifically, after the third image information is obtained, it is input into the trained model and converted into the third parameters; comparing parameters allows the treatment effect to be evaluated more accurately. When none of the sub-parameters of the third parameters exceeds the preset range, the current treatment has achieved its effect, and the obtained target treatment data are stored in the database so that effective treatment data can be conveniently called later. The change rule and change time are obtained through the comparative analysis of the third image information with the first and second image information, an analysis report is generated based on them and sent to the user's third-party platform (mobile phone end), and the user can observe the change and effect of the current treatment, thereby improving the user experience.
It should be noted that if any one of the third parameters exceeds the preset parameter threshold range, the parameters exceeding the preset range are obtained for analysis and adjusted to be within the preset range, and the adjusted third parameters are stored in a to-be-tested database of the preset database. When the last used target treatment data is called again, whether it can still achieve the treatment effect is tested; if the test is not passed, the last used target treatment data is deleted from the database.
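A minimal sketch of this archive-or-retest rule, with all data structures and the clamping step assumed purely for illustration:

```python
def archive_treatment(third_params: dict, thresholds: dict,
                      target_data: dict, database: list, pending: list) -> str:
    """Sketch of the archival rule above: store the target treatment data
    when every third parameter is inside its threshold range; otherwise
    clamp the offending values into range and park the record in the
    to-be-tested database for re-testing on the next call."""
    out_of_range = [k for k, v in third_params.items()
                    if not (thresholds[k][0] <= v <= thresholds[k][1])]
    if not out_of_range:
        database.append(target_data)        # proven effective -> keep
        return "stored"
    clamped = {k: min(max(v, thresholds[k][0]), thresholds[k][1])
               for k, v in third_params.items()}
    pending.append({"params": clamped, "data": target_data})
    return "pending_retest"
```

Records in the pending list would be deleted if they fail the re-test, matching the deletion rule in the paragraph above.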
In one embodiment, the method further comprises: judging whether the user has stored historical detection parameters;
if the user stores the historical detection parameters, the historical skin parameters and the current detection parameters are input into a preset database for analysis and matching.
As described above, whether the user has stored historical detection parameters is judged; if so, the user has a previous usage record. Calling the previous treatment parameters for analysis and matching against the current treatment parameters improves detection efficiency, and comparing whether the current detection parameters have changed relative to the previous ones allows the weight coefficient of the historical treatment parameters to be adjusted. When the change of a parameter is lower than the preset change range, the weight coefficient of the user's historical parameter is reduced; when the weight coefficient falls below a certain range, the historical treatment parameters are deleted and treatment parameters are re-matched for the user.
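The weight-coefficient rule described above, sketched with illustrative constants (the decay factor, change range and deletion cutoff are not specified by the embodiment):

```python
def update_history_weight(weight: float, param_change: float,
                          min_change: float = 0.5, decay: float = 0.8,
                          drop_below: float = 0.2):
    """Sketch of the history-weight rule above: shrink the weight coefficient
    of the user's historical parameters when they barely changed since the
    last detection, and signal deletion of the historical record once the
    weight falls below the cutoff (all constants are illustrative)."""
    if param_change < min_change:
        weight *= decay
    # second element: delete-history-and-rematch flag
    return weight, weight < drop_below
```

When the flag becomes True, the historical treatment parameters would be deleted and fresh treatment parameters matched for the user.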
According to the skin detection control method of the beauty instrument, the first image information of the user is obtained so that the facial state of the user can be judged from it. The obtained first image information is input into a trained model for processing and analysis and is converted, through identification and conversion, into the first parameter set; converting the image into parameters allows the problems existing in the skin to be analyzed more intuitively. The first parameter set is input into a database trained in advance for matching, and the database provides, for different parameters, a plurality of groups of corresponding second treatment parameters; the second parameters are compared with normal facial parameters to judge which parameters of the user's face need to be adjusted, corresponding target treatment data are generated, and the beauty instrument is controlled to perform the corresponding treatment according to the electric signal adjustment data of the target treatment data. The facial treatment thus becomes more targeted, and the effect of the beauty instrument can be fully exerted through the treatment made after the user's facial condition is detected, even when the user does not know how to use the instrument. From the above analysis, the embodiments of the present application can effectively adapt the corresponding treatment modes to different skin states of the user.
Referring to fig. 3, in an embodiment of the present application, there is further provided a skin detection control device of a cosmetic apparatus, including:
an acquisition module 1, configured to acquire first image information of a face of a user;
a conversion module 2, configured to input the first image information into a trained model, and convert the first image information into a first parameter set, where the first parameter set includes a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter, and a skin layer foreign matter parameter;
the analysis module 3 is used for inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
a determining module 4, configured to compare the second parameter with the normal facial parameter, and determine target treatment data of the skin, where the target treatment data includes electrical signal adjustment data;
and the control module 5 is used for controlling the beauty instrument based on the electric signal adjustment data.
As described above, it may be understood that each component of the skin detection control device of the beauty instrument set forth in the present application may implement the function of any one of the skin detection control methods of the beauty instrument as described above, and the specific structure will not be described again.
Referring to fig. 4, a computer device is further provided in an embodiment of the present application; the computer device may be a server, and its internal structure may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface and a database connected by a system bus, wherein the processor is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing data such as monitoring data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a skin detection control method of a beauty instrument.
The processor executes the skin detection control method of the beauty instrument, and the method comprises the following steps: acquiring first image information of a user face; inputting the first image information into a trained model, and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter; inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment; comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data; and controlling the beauty instrument based on the electric signal adjustment data.
Through the skin detection control method of the beauty instrument, the first image information of the user is acquired so that the facial state of the user can be judged from it; the acquired first image information is input into a trained model for processing and analysis and is converted, through identification and conversion, into the first parameter set, and converting the image into parameters allows the problems existing in the skin to be analyzed more intuitively. The first parameter set is input into a pre-trained database for matching; the database gives corresponding treatment parameters for different parameters, so that after the first parameter set is input, the second parameters for treatment can be matched quickly. The second parameters may have two or more matched groups; the closest group is first used to control the beauty instrument, and is combined with the first parameter set to determine the target treatment data of the skin. By identifying the specific condition of the user's face and making a corresponding treatment scheme, the facial treatment becomes more targeted, and the effect of the beauty instrument can be fully exerted through the treatment made after the user's facial condition is detected, even when the user does not know how to use the instrument.
An embodiment of the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a skin detection control method of a cosmetic instrument, including the steps of: acquiring first image information of a user face; inputting the first image information into a trained model, and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter; inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment; comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data; and controlling the beauty instrument based on the electric signal adjustment data.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database or other medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM), among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article or method that comprises a list of elements includes not only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article or method. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, apparatus, article or method that comprises the element.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.
Claims (10)
1. A skin detection control method of a cosmetic instrument, characterized by comprising:
acquiring first image information of a user face;
inputting the first image information into a trained model, and converting the first image information into a first parameter set, wherein the first parameter set comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter;
Inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data;
and controlling the beauty instrument based on the electric signal adjustment data.
2. The skin detection control method of a cosmetic instrument according to claim 1, wherein the inputting the first image information into a trained model, converting the first image information into a first parameter set, comprises:
preprocessing the first image information, removing noise of the image, adjusting brightness and contrast of the image, extracting a skin area through color information and texture features, and clustering pixels similar to skin hues into a skin area;
inputting the preprocessed image into a trained deep convolutional neural network, extracting the characteristics of the image through a filter, and converting the extracted characteristics into a first parameter set through a pooling layer;
the formula of the feature extraction is as follows: R = I(x, y) − a·L(x, y), where I(x, y) is the original pixel value of the preprocessed image, a is a weight coefficient, and L(x, y) is the pixel value output by the filter.
3. The skin detection control method of a cosmetic instrument according to claim 1, wherein the cosmetic instrument includes a macro-photographing section and a light supplementing section for supplementing light to the macro-photographing section;
the acquiring the first image information of the face of the user comprises:
acquiring brightness of the environment and reflection characteristics on the face of a user;
judging whether the brightness of the environment meets the brightness compensation condition;
when the brightness of the environment meets the brightness compensation condition, acquiring a brightness target compensation quantity;
controlling the light supplementing section based on a brightness target compensation amount;
judging whether the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition or not;
when the light radiation intensity of the reflection characteristic meets the radiation intensity compensation condition, acquiring a radiation intensity target compensation quantity;
and controlling the macro shooting part to acquire image information of the face of the user based on the radiation intensity target compensation amount.
4. The method for controlling skin detection of a cosmetic apparatus according to claim 1, wherein inputting the first parameter set into a preset database for analysis and matching, to obtain a second parameter of skin treatment, comprises:
inputting the first parameter set into a preset database;
searching and matching the first parameter set in the database;
judging whether any one parameter of the first parameter set exceeds a preset parameter threshold range;
when at least one parameter in the first parameter set exceeds the preset parameter threshold range, matching all out-of-range parameters with the database to obtain a second parameter of skin treatment, wherein the second parameter comprises a skin state parameter;
when no parameter in the first parameter set exceeds the preset parameter threshold range, the beauty instrument exits the beauty mode and switches to a conventional skin nursing mode, wherein the conventional nursing mode comprises a soothing massage mode and a moisturizing mode.
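The routing logic of claim 4 amounts to a threshold filter over the first parameter set; the parameter names and (low, high) ranges below are illustrative, not values fixed by the claim:

```python
def match_parameters(params: dict, thresholds: dict):
    """Route the first parameter set: any out-of-range parameter is
    forwarded for database matching (treatment); if every parameter
    is in range, fall back to the conventional nursing modes."""
    out_of_range = {k: v for k, v in params.items()
                    if not (thresholds[k][0] <= v <= thresholds[k][1])}
    if out_of_range:
        # only the out-of-range parameters are matched for treatment
        return "treatment", out_of_range
    return "routine_care", {"modes": ["soothing_massage", "moisturizing"]}
```
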
5. The skin detection control method of a cosmetic instrument according to claim 1, wherein the target treatment data further includes standby electric signal adjustment data, and further comprising, after controlling the cosmetic instrument based on the electric signal adjustment data:
after a preset working time, acquiring second image information of the face of the user during treatment;
comparing and analyzing the second image information with the first image information;
judging whether the second image information meets the condition for calling the standby electric signal adjustment data;
when the change difference value of the second image information is smaller than a preset standard value, judging that the second image information meets the condition for calling the standby electric signal adjustment data;
and controlling the beauty instrument based on the standby electric signal adjustment data.
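The standby decision in claim 5 needs some scalar "change difference value" between the first and second images; the mean absolute pixel difference used here is one illustrative choice, as is the standard value of 5.0:

```python
import numpy as np

def needs_standby(first_img: np.ndarray, second_img: np.ndarray,
                  standard: float = 5.0) -> bool:
    """After the preset working time, switch to the standby electric
    signal adjustment data when the mid-treatment image has changed
    too little relative to the first image (i.e. the current settings
    are not producing a visible effect)."""
    diff = np.mean(np.abs(second_img.astype(float) - first_img.astype(float)))
    return bool(diff < standard)
```
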
6. The skin detection control method of a cosmetic apparatus according to any one of claims 1 to 5, characterized in that after the cosmetic apparatus treatment is completed, the method further comprises:
acquiring third image information of the face of the user after treatment is completed;
inputting the third image information into a trained model, and converting the third image information into a third parameter;
judging whether any of the third parameters exceeds the preset parameter threshold range;
if none of the third parameters exceeds the preset parameter threshold range, inputting the current target treatment data into the preset database for storage;
comparing and analyzing the third image information with the first image information and the second image information to obtain a change rule and change time;
generating an analysis report based on the change rule and the change time;
and sending the change effect information to a cloud server, and sending the change effect information to a third-party platform through the cloud server.
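The "change rule and change time" of claim 6 compare three measurements taken before, during, and after treatment; the per-parameter summary below (total change plus the time of the largest step) is one illustrative way to fill such a report, with hypothetical parameter names:

```python
def change_report(snapshots):
    """Derive a change rule and change time from the first, second and
    third measurements. `snapshots` is [(minute, {param: value}), ...]
    in chronological order, three entries long."""
    (t0, p0), (t1, p1), (t2, p2) = snapshots
    report = {}
    for k in p0:
        # step changes between consecutive measurements, tagged with
        # the time at which each step was observed
        deltas = [(p1[k] - p0[k], t1), (p2[k] - p1[k], t2)]
        _, when = max(deltas, key=lambda d: abs(d[0]))
        report[k] = {"total_change": p2[k] - p0[k],
                     "largest_change_at_min": when}
    return report
```
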
7. The skin detection control method of a cosmetic instrument according to any one of claims 1 to 5, characterized in that the method further comprises:
judging whether the user has stored history detection parameters;
if the user has stored history detection parameters, inputting the historical skin parameters and the current detection parameters into the preset database for analysis and matching.
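Claim 7 is a single branch on whether a history record exists; a minimal sketch, where a plain dict stands in for the user's record in the preset database:

```python
def select_analysis_input(current_params: dict, user_store: dict) -> list:
    """If the user has stored history detection parameters, both the
    historical and current parameters feed the matching step;
    otherwise only the current parameters do."""
    history = user_store.get("history")
    if history is not None:
        return [history, current_params]
    return [current_params]
```
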
8. A skin detection control device of a beauty instrument, characterized by comprising:
the acquisition module is used for acquiring first image information of the face of the user;
the conversion module is used for inputting the first image information into a trained model and converting the first image information into a first parameter group, wherein the first parameter group comprises a skin layer tone parameter, a skin layer wrinkle parameter, a skin layer pore density parameter, a skin layer moisture parameter, a skin layer grease parameter and a skin layer foreign matter parameter;
the analysis module is used for inputting the first parameter set into a preset database for analysis and matching to obtain a second parameter of skin treatment;
the determining module is used for comparing the second parameter with the normal facial parameter to determine target treatment data of the skin, wherein the target treatment data comprises electric signal adjustment data;
and the control module is used for controlling the beauty instrument based on the electric signal adjustment data.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311564404.9A CN117379005A (en) | 2023-11-21 | 2023-11-21 | Skin detection control method, device, equipment and storage medium of beauty instrument |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117379005A true CN117379005A (en) | 2024-01-12 |
Family
ID=89470251
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311564404.9A Pending CN117379005A (en) | 2023-11-21 | 2023-11-21 | Skin detection control method, device, equipment and storage medium of beauty instrument |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117379005A (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109806513A (en) * | 2019-03-27 | 2019-05-28 | 广州市圣洁美美容科技有限公司 | A kind of cold and hot supersonic face care instrument |
CN111048209A (en) * | 2019-12-28 | 2020-04-21 | 安徽硕威智能科技有限公司 | Health assessment method and device based on living body face recognition and storage medium thereof |
CN111150369A (en) * | 2020-01-02 | 2020-05-15 | 京东方科技集团股份有限公司 | Medical assistance apparatus, medical assistance detection apparatus and method |
KR20200057666A (en) * | 2019-12-18 | 2020-05-26 | 주식회사 웨이웨어러블 | Method for analyze user's skin and construct bigdata using skin measurement and care process |
CN111297326A (en) * | 2020-02-19 | 2020-06-19 | 广州小鹏汽车科技有限公司 | Control method, skin care system, vehicle, and storage medium |
KR102403012B1 (en) * | 2021-10-27 | 2022-05-30 | (주)이지템 | Beauty Mirror for Providing Personalized Beauty Solution and Driving Method Thereof |
CN217119130U (en) * | 2022-04-15 | 2022-08-05 | 欣颜时代(广州)技术有限公司 | Beauty treatment leading-in instrument |
CN115547460A (en) * | 2022-09-29 | 2022-12-30 | 深圳市沃特沃德信息有限公司 | Intelligent skin care method, equipment and medium |
KR20230054286A (en) * | 2021-10-14 | 2023-04-24 | (주)아모레퍼시픽 | System and method for diagnosing skin based on analysis of image using deep learning |
CN115999059A (en) * | 2023-01-31 | 2023-04-25 | 深圳市太美亚电子科技有限公司 | Intelligent control method and system of beauty instrument |
CN116842330A (en) * | 2023-08-31 | 2023-10-03 | 庆云县人民医院 | Health care information processing method and device capable of comparing histories |
CN117064338A (en) * | 2023-08-17 | 2023-11-17 | 普希斯(广州)科技股份有限公司 | Skin data processing method and device, skin beauty instrument and skin beauty system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108537152B (en) | Method and apparatus for detecting living body | |
CN108416324B (en) | Method and apparatus for detecting living body | |
US11058209B2 (en) | Beauty counseling information providing device and beauty counseling information providing method | |
CN108038466B (en) | Multi-channel human eye closure recognition method based on convolutional neural network | |
CN111274860A (en) | Machine vision-based online automatic tobacco leaf grade sorting identification method | |
CN107106020A (en) | For analyzing and transmitting the data relevant with mammal skin damaged disease, image and the System and method for of video | |
CN108197546A (en) | Photo-irradiation treatment method, apparatus, computer equipment and storage medium in recognition of face | |
DE102020213128A1 (en) | INFERENCE OF HEAT COMFORT OF A USER USING BODY SHAPE INFORMATION | |
CN111178331A (en) | Radar image recognition system, method, apparatus, and computer-readable storage medium | |
CN116205910B (en) | Injection molding temperature self-adaptive learning regulation and control system for power adapter | |
CN113610844A (en) | Intelligent skin care method, device, equipment and storage medium | |
CN111493836A (en) | Postoperative acute pain prediction system based on brain-computer interface and deep learning and application | |
CN111178187A (en) | Face recognition method and device based on convolutional neural network | |
CN117379005A (en) | Skin detection control method, device, equipment and storage medium of beauty instrument | |
Bae et al. | Robust skin-roughness estimation based on co-occurrence matrix | |
CN113255802A (en) | Intelligent skin tendering system based on infrared laser | |
CN112944611A (en) | Control method and device of air conditioner, storage medium and processor | |
CN111369559A (en) | Makeup evaluation method, makeup evaluation device, makeup mirror, and storage medium | |
CN115988714A (en) | Artificial intelligence-based intelligent light control method, system, equipment and medium | |
CN115813409A (en) | Ultra-low-delay moving image electroencephalogram decoding method | |
CN116250849A (en) | Electroencephalogram signal identification method based on information separator and regional convolution network | |
KR102402684B1 (en) | Method for providing an information of self skin condition based on big data in non-facing environment | |
CN112686145A (en) | Facial skin type identification method and intelligent terminal thereof | |
CN112364713A (en) | Intelligent makeup suggestion method and system | |
CN117503062B (en) | Neural detection control method, device, equipment and storage medium of beauty instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||