CN113397480B - Control method, device and equipment of beauty instrument and storage medium - Google Patents

Control method, device and equipment of beauty instrument and storage medium

Info

Publication number
CN113397480B
Authority
CN
China
Prior art keywords
user
beauty instrument
skin
beauty
face
Prior art date
Legal status
Active
Application number
CN202110507390.1A
Other languages
Chinese (zh)
Other versions
CN113397480A (en)
Inventor
韩诗瑶
周鲁平
Current Assignee
Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Original Assignee
Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Shuliantianxia Intelligent Technology Co Ltd
Priority to CN202110507390.1A
Publication of CN113397480A
Application granted
Publication of CN113397480B
Active legal status
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H23/00 Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms
    • A61H23/02 Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms with electric or magnetic drive
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/06 Radiation therapy using light
    • A61N5/0613 Apparatus adapted for a specific treatment
    • A61N5/0616 Skin treatment other than tanning
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5007 Control means thereof computer controlled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M37/00 Other apparatus for introducing media into the body; Percutany, i.e. introducing medicines into the body by diffusion through the skin
    • A61M2037/0007 Other apparatus for introducing media into the body; Percutany, i.e. introducing medicines into the body by diffusion through the skin having means for enhancing the permeation of substances through the epidermis, e.g. using suction or depression, electric or magnetic fields, sound waves or chemical agents
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/06 Radiation therapy using light
    • A61N2005/0626 Monitoring, verifying, controlling systems and methods

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Dermatology (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The application provides a control method, apparatus, device and storage medium for a beauty instrument. The method comprises: acquiring a facial skin measurement result generated based on an image of a user's face; acquiring, according to the facial skin measurement result, skin data corresponding to different parts of the user's face, wherein each part corresponds to one position area of the user's face; configuring the beauty instrument operating parameters corresponding to each part according to the skin data corresponding to that part; and controlling the beauty instrument to perform a beauty operation on each part of the user's face using the operating parameters configured for that part. The user's current skin state is determined from the skin data in the facial skin measurement result, and the operating parameters are determined separately for the different parts of the face, so that the beauty operation is performed region by region. The operating parameters therefore match the user's skin state closely, the beauty effect is good, and the risk of damaging the skin is low.

Description

Control method, device and equipment of beauty instrument and storage medium
Technical Field
The application belongs to the technical field of intelligent equipment, and particularly relates to a control method, a control device, control equipment and a storage medium of a beauty instrument.
Background
Because of greater life stress, the slow excretion of toxins deposited in the body, and the skin's slow absorption of skin-care products, modern people suffer from a series of skin problems such as sallow, dull skin. As living standards rise, the desire to pursue beauty and health grows day by day, and people use beauty instruments to beautify and maintain their skin.
A beauty instrument is a device that uses physical, electronic and optical techniques to beautify the body. Beauty instruments, such as introduction instruments and massage instruments, are among the common devices used for facial skin care. An existing beauty instrument usually determines its working parameters by having the user select a function mode, but the selectable function modes are very limited, so the instrument cannot accurately and automatically adapt its working parameters to the user's current skin condition. This greatly reduces the beauty effect and even carries a risk of skin injury.
Disclosure of Invention
In view of this, embodiments of the present application provide a control method, apparatus, device and storage medium for a beauty instrument, which can automatically adapt the instrument's operating parameters to the user's current skin condition, improving the beauty effect and reducing the risk of skin injury.
A first aspect of the embodiments of the present application provides a control method for a beauty instrument, comprising:
acquiring a facial skin measurement result generated based on an image of the user's face;
acquiring, according to the facial skin measurement result, skin data corresponding to different parts of the user's face, wherein each part corresponds to one position area of the user's face;
configuring the beauty instrument operating parameters corresponding to each part according to the skin data corresponding to that part; and
controlling the beauty instrument to perform a beauty operation on each part of the user's face using the beauty instrument operating parameters corresponding to that part.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the beauty instrument operating parameters include a vibration duration and/or a vibration frequency, and the step of configuring the beauty instrument operating parameters corresponding to each part according to the skin data corresponding to that part comprises:
acquiring, from the facial skin measurement result, target skin data reflecting the skin state of a target part, the target part being any part of the user's face;
querying a preset skin data-to-beauty instrument operating parameter ratio table according to the target skin data, and obtaining from the table the operating parameter ratio matched with the target skin data; and
multiplying the operating parameter ratios matched with the target skin data by the user's corresponding base operating parameters, respectively, to obtain the beauty instrument operating parameters corresponding to the target part, wherein the base operating parameters include a base vibration duration and/or a base vibration frequency.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the beauty instrument operating parameters include a vibration duration, and the step of multiplying the operating parameter ratios matched with the target skin data by the user's corresponding base operating parameters to obtain the beauty instrument operating parameters corresponding to the target part comprises:
acquiring the current time at which the user is using the beauty instrument;
traversing a time-to-vibration duration threshold correspondence table according to the current time, and obtaining from the table a first vibration duration threshold matched with the current time as the user's base vibration duration; and
multiplying the vibration duration ratio matched with the target skin data by the user's base vibration duration to obtain the beauty instrument vibration duration corresponding to the target part.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, after the step of obtaining the first vibration duration threshold matched with the current time as the user's base vibration duration, the method further comprises:
acquiring the amount of skin-care product currently applied by the user; and
calculating the ratio between the currently applied amount and a standard application amount, and adjusting the first vibration duration threshold according to that ratio to obtain a second vibration duration threshold matched with the currently applied amount, which is used as the user's base vibration duration.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the beauty instrument operating parameters include a vibration frequency, and the step of multiplying the operating parameter ratios matched with the target skin data by the user's corresponding base operating parameters to obtain the beauty instrument operating parameters corresponding to the target part comprises:
acquiring ingredient data of the skin-care product currently used by the user;
traversing a skin-care product ingredient-to-vibration frequency threshold correspondence table according to the ingredient data, and obtaining from the table a vibration frequency threshold matched with the ingredient data as the user's base vibration frequency; and
multiplying the vibration frequency ratio matched with the target skin data by the user's base vibration frequency to obtain the beauty instrument vibration frequency corresponding to the target part.
With reference to the first possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, the beauty instrument operating parameters further include illumination parameters, and after the step of obtaining the target skin data reflecting the skin state of the target part from the facial skin measurement result, the method further comprises:
determining, according to the target skin data, the illumination parameters of the beauty instrument when performing the beauty operation on the target part, wherein the illumination parameters include an illumination wavelength and an illumination duration.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, the step of controlling the beauty instrument to perform the beauty operation on each part of the user's face using the beauty instrument operating parameters corresponding to that part comprises:
identifying the position of each part on the user's face based on the image of the user's face, labelling the beauty instrument operating parameters corresponding to each part onto the image based on those positions, and generating an operation e-book, so that while using the beauty instrument the user performs the beauty operation on each part of the face with the corresponding operating parameters according to the operation e-book; and/or
sorting the beauty instrument operating parameters corresponding to the parts by part, generating a part-by-part usage order corresponding to that sorting, and, while the user uses the beauty instrument in that usage order, controlling the beauty instrument to adjust its operating parameters in the same order and perform the beauty operation; and/or
acquiring, in real time, a position image representing the current position of the beauty instrument to determine that position, and, according to the current position, controlling the beauty instrument to adjust its current operating parameters to the beauty instrument operating parameters corresponding to the current position and perform the beauty operation.
A second aspect of the embodiments of the present application provides a control apparatus for a beauty instrument, comprising:
a skin measurement result acquisition module, configured to acquire a facial skin measurement result generated based on an image of the user's face;
a skin state determination module, configured to acquire, according to the facial skin measurement result, skin data corresponding to different parts of the user's face, wherein each part corresponds to one position area of the user's face;
a beauty instrument operating parameter configuration module, configured to configure the beauty instrument operating parameters corresponding to each part according to the skin data corresponding to that part; and
a beauty operation control module, configured to control the beauty instrument to perform the beauty operation on each part of the user's face using the beauty instrument operating parameters corresponding to that part.
A third aspect of the embodiments of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the control method of the beauty instrument according to any one of the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium in which a computer program is stored, which, when executed by a processor, implements the steps of the control method of the beauty instrument according to any one of the first aspect.
Compared with the prior art, the embodiments of the present application have the following beneficial effects:
The method acquires a facial skin measurement result generated based on an image of the user's face; acquires, according to that result, skin data corresponding to different parts of the user's face, each part corresponding to one position area of the face; configures the beauty instrument operating parameters corresponding to each part according to the skin data of that part; and controls the beauty instrument to perform the beauty operation on each part of the face with the corresponding operating parameters. The user's current skin state is determined from the skin data in the facial skin measurement result, and the operating parameters are determined separately for the different parts of the face, so that the beauty operation is performed on the face region by region. Because the operating parameters closely match the skin state, the risk of skin damage from inappropriate skin-care operation is reduced and the beauty effect is effectively improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of the basic method of the control method of the beauty instrument according to an embodiment of the present application;
Fig. 2 is a schematic flowchart of a method for configuring the operating parameters of the beauty instrument in the control method according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of a method for configuring the vibration duration parameter of the beauty instrument in the control method according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of another method for determining the user's base vibration duration in the control method of the beauty instrument according to an embodiment of the present application;
Fig. 5 is a schematic flowchart of a method for configuring the vibration frequency parameter of the beauty instrument in the control method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of the control apparatus of the beauty instrument according to an embodiment of the present application;
Fig. 7 is a schematic diagram of an electronic device for implementing the control method of the beauty instrument according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
The control method of the beauty instrument provided by the embodiments of the present application is mainly applied to an intelligent beauty mirror system. Based on the photographing skin-measurement function and/or the quantitative skin-care product dispensing function of the intelligent beauty mirror system, the operating parameters of the beauty instrument are configured according to the user's current skin measurement result data and the amount of skin-care product applied, so that the instrument's operating parameters are automatically adapted to the user's current skin state, improving the beauty effect and reducing the risk of skin injury. It can be understood that the control method can also be applied directly in the beauty instrument itself: an external device obtains the user's skin measurement result data and skin-care product usage data and transmits them to the beauty instrument, and the beauty instrument configures its operating parameters according to the received data.
In some embodiments of the present application, please refer to fig. 1. Fig. 1 is a flowchart of the basic method of the control method of the beauty instrument provided in the embodiments of the present application, detailed as follows:
step S11: a face skin measurement result generated based on an image of the user's face is obtained.
In this embodiment, the intelligent beauty mirror is provided with a camera and has a photographing skin-measurement function. When the user uses the intelligent beauty mirror, the mirror photographs and measures the user's facial skin, and the facial skin measurement result is generated based on the image of the user's face captured by the mirror's camera. In this embodiment, when capturing the image of the user's face, the camera first captures at least two images with different fill-light parameters; the user's skin tone is determined by analysing the brightness difference between the images; the ambient light brightness in the current shooting scene is then deduced from the skin tone; the fill-light parameters and camera parameters are adjusted according to that ambient brightness; and finally the adjusted camera captures the image of the user's face used for the skin measurement.
Illustratively, when shooting the image of the user's face, a first face image can be captured directly by the camera, and the fill-light parameters corresponding to that image are recorded. Then, based on the brightness of the first face image (for example its average brightness) and the corresponding fill-light parameters, the camera's fill-light parameters are adaptively adjusted, and a second face image is captured together with its adjusted fill-light parameters. In this embodiment the fill-light adjustment may follow a preset rule: the first face image is preliminarily judged to be dark or bright; if it is too dark, the fill-light parameter is raised by a preset adjustment value, and if it is too bright, it is lowered by an adjustment value. It can be understood that, depending on the user's precision requirement, more than two face images may be acquired. In that case, if the preliminary judgments of the two most recently captured images differ, the fill-light adjustment for the next image is half the previous adjustment value and in the opposite direction. For example, if the first face image is judged dark and the fill-light parameter is raised by one adjustment value before the second image is captured, and the second image is then judged bright, the fill-light parameter for the third image is lowered by half an adjustment value. In this embodiment, because different facial skin tones produce different brightness differences between two face images captured under two kinds of ambient light, faces of various skin tones are photographed under different ambient light to obtain a large number of face image samples, and images of A4 paper are photographed as a reference standard. Analysing these experimental data yields the corresponding skin tone intervals, from which a mathematical model is constructed in advance; the model establishes the correspondence between the image brightness difference and the skin tone type. When the model is constructed, skin tone types are preset manually, for example by dividing skin tones according to the colour values of a skin tone card; the number of skin tone types is configurable, and for each type an image brightness difference that characterises it is obtained. In the correspondence between image brightness difference and skin tone type, the brightness difference may be expressed as a numerical range.
Therefore, the image brightness difference between any two captured face images can be obtained by comparing their brightness values; the mathematical model is then queried with that difference, and the user's skin tone type is determined from the correspondence between brightness difference and skin tone type in the model. In this embodiment, for each preset skin tone type, a large number of faces of that skin tone are photographed under several known ambient light levels, and the correspondence between image brightness difference and ambient light brightness is analysed from those images to build a table or curve representing that correspondence; each skin tone type corresponds to one such brightness difference-to-ambient brightness correspondence. Thus, once the user's skin tone type has been determined, the corresponding table or curve can be found for that type, and the ambient light brightness in the current shooting scene is determined by querying it with the measured image brightness difference. On the other hand, the beauty mirror presets an optimal imaging brightness for capturing the skin measurement image, that is, a brightness threshold set according to the requirements of the skin measurement image. After the ambient light brightness in the current scene is determined, it is compared with this preset threshold to calculate the fill-light parameters and camera parameters to be adjusted, where the camera parameters include shutter speed, gain and the like.
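The two-shot skin tone and ambient light estimation above can be summarised, purely as an illustrative sketch, as a lookup followed by a proportional fill-light adjustment; the tables, thresholds and function names below are assumptions, not values from the embodiment:

```python
# Minimal sketch of the two-shot ambient-light estimation described above.
# All tables and thresholds here are hypothetical placeholders.

SKIN_TONE_BY_DIFF = [            # (min_diff, max_diff, skin_tone_label)
    (0.00, 0.10, "deep"),
    (0.10, 0.20, "medium"),
    (0.20, 1.00, "light"),
]

# One ambient-light curve per skin tone: brightness difference -> ambient brightness.
AMBIENT_BY_TONE = {
    "deep":   [(0.05, 80),  (0.15, 160), (0.30, 320)],
    "medium": [(0.05, 100), (0.15, 200), (0.30, 400)],
    "light":  [(0.05, 120), (0.15, 240), (0.30, 480)],
}

TARGET_BRIGHTNESS = 200          # preset threshold required for a usable skin-measurement image


def classify_skin_tone(diff: float) -> str:
    for lo, hi, tone in SKIN_TONE_BY_DIFF:
        if lo <= diff < hi:
            return tone
    return "medium"


def estimate_ambient(diff: float, tone: str) -> float:
    # Piecewise-linear interpolation over the per-tone curve.
    curve = AMBIENT_BY_TONE[tone]
    for (d0, a0), (d1, a1) in zip(curve, curve[1:]):
        if d0 <= diff <= d1:
            return a0 + (a1 - a0) * (diff - d0) / (d1 - d0)
    return curve[-1][1]


def adjust_capture(brightness_img1: float, brightness_img2: float):
    """Return (skin_tone, ambient, fill_light_gain) for the final skin-measurement shot."""
    diff = abs(brightness_img1 - brightness_img2) / 255.0
    tone = classify_skin_tone(diff)
    ambient = estimate_ambient(diff, tone)
    # Simple placeholder rule: the dimmer the ambient light, the more fill light
    # (normalised gain in [0, 1]).
    fill_light_gain = max(0.0, (TARGET_BRIGHTNESS - ambient) / TARGET_BRIGHTNESS)
    return tone, ambient, fill_light_gain


print(adjust_capture(90, 140))
```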
In this embodiment, the skin measurement function of the intelligent beauty mirror can be implemented by evaluating the captured image with a pre-trained skin measurement model. For example, the model may be a neural network trained to convergence, which compares the features extracted from the image of the user's face with the skin data, learned during training, that reflect various skin states, determines the skin state reflected by each feature, and obtains the corresponding skin data. The model then outputs the position of each feature in the image together with the skin data obtained from it, and the facial skin measurement result comprises the position information of all features in the image and the skin data reflecting their respective skin states. The skin data may include skin tone data, wrinkle data, spot data, sensitivity data, dark circle data, eye bag data, lying-silkworm data, moisture and oil data, and the like.
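For illustration only, the facial skin measurement result can be represented as a list of detected features, each carrying its position in the image and the skin data it reflects; the field names below are assumed, not prescribed by the embodiment:

```python
# Hypothetical representation of a facial skin measurement result: each detected
# feature carries its position in the image plus the skin data it reflects.
from dataclasses import dataclass

@dataclass
class SkinFeature:
    kind: str            # e.g. "acne", "wrinkle", "dark_circle", "skin_tone"
    position: tuple      # (x, y) pixel coordinates in the face image
    data: dict           # measurement values for this feature

measurement_result = [
    SkinFeature("acne",        (412, 530), {"severity_grade": 3}),
    SkinFeature("dark_circle", (300, 360), {"darkness": 0.62}),
    SkinFeature("skin_tone",   (0, 0),     {"tone_value": 71, "coverage": "whole_face"}),
]

# Pull out one feature type, e.g. all acne features, from the result.
acne_features = [f for f in measurement_result if f.kind == "acne"]
print(acne_features)
```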
Step S12: and acquiring skin data corresponding to different parts of the face of the user according to the face skin measuring result, wherein one part corresponds to one position area of the face of the user.
In the present embodiment, the image of the face of the user for performing the skin measurement operation is divided into a plurality of regions, and for different regions, the skin data corresponding to each region is extracted from the face skin measurement result obtained in the past for each region according to the position of each region in the image, so that the skin state of each region is reflected by the skin data corresponding to each region.
For example, in one specific implementation, the image of the user's face may be pre-divided into a periocular region, a forehead region, a cheek region, a nose region, a chin region, and the like. For the periocular region, the skin data belonging to it, such as eye bag, lying-silkworm, dark circle, tear trough and crow's-feet data, are extracted according to the position information of each feature in the facial skin measurement result, so that the skin state of the periocular region is reflected by these data. For the forehead, skin data such as forehead line data are extracted in the same way; for the cheek, the skin data belonging to the cheek region are extracted; for the nose, skin data such as blackhead data are extracted; and for the chin, skin data such as milia (fat granule) data are extracted, so that the skin state of each region is reflected by the skin data belonging to it.
For example, in another specific implementation, the image of the user's face may instead be divided according to the skin problems found in the facial skin measurement result; that is, the parts are generated automatically from the measurement result, such as an acne part, a pigmented spot part, a wrinkle part, and so on. Since the facial skin measurement result contains the position information of all features in the image together with the skin data reflecting their skin states, taking the acne part as an example, the acne data can be taken from the measurement result, the positions of the features corresponding to the acne data determined, and a region box set at those positions; the area enclosed by the region box in the image is then the acne part. In this embodiment, when the facial skin measurement result contains several pieces of acne data, whether the corresponding features are placed in one region box may be decided by the distance between their positions: if the distance between the features corresponding to two pieces of acne data is greater than a preset distance value, the two features are placed in separate region boxes, forming two acne parts; if the distance is smaller than the preset distance value, the two features are placed in one region box, forming a single acne part. It is to be understood that pigmented spot parts, wrinkle parts and the like may be formed in a similar manner.
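The distance-based grouping of acne features into region boxes might look like the following sketch; the distance threshold, box padding and helper names are assumptions:

```python
# Sketch of the acne-region grouping described above: features closer than a preset
# distance share one region box, otherwise they form separate regions.
import math

DISTANCE_THRESHOLD = 80    # pixels; hypothetical preset distance value
BOX_PADDING = 30           # pixels added around the grouped feature positions

def group_features(points):
    """Single-linkage grouping: points within DISTANCE_THRESHOLD end up in one group."""
    groups = []
    for p in points:
        merged = None
        for g in groups:
            if any(math.dist(p, q) < DISTANCE_THRESHOLD for q in g):
                if merged is None:
                    g.append(p)
                    merged = g
                else:                      # p bridges two groups: merge them
                    merged.extend(g)
                    g.clear()
        if merged is None:
            groups.append([p])
    return [g for g in groups if g]

def region_boxes(points):
    boxes = []
    for g in group_features(points):
        xs, ys = zip(*g)
        boxes.append((min(xs) - BOX_PADDING, min(ys) - BOX_PADDING,
                      max(xs) + BOX_PADDING, max(ys) + BOX_PADDING))
    return boxes

acne_points = [(410, 520), (450, 560), (700, 300)]
print(region_boxes(acne_points))   # two boxes: one for the close pair, one for the far point
```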
Step S13: and configuring the operation parameters of the beauty instrument corresponding to each part according to the skin data corresponding to each part.
In this embodiment, after the face of the user is divided into a plurality of parts, the skin state of each part is represented by the skin data corresponding to each part, after the skin data of each part is obtained by the intelligent beauty mirror, the operating parameters corresponding to each part are configured according to the skin data of each part, and the operating parameters corresponding to each part obtained by the configuration are synchronized into the beauty instrument, so that the beauty instrument performs beauty operation on each part of the face of the user by using the corresponding operating parameters.
Fig. 2 is a schematic flowchart of a method for configuring the operating parameters of the beauty instrument in the control method according to an embodiment of the present application, detailed as follows:
Step S21: acquire, from the facial skin measurement result, target skin data reflecting the skin state of a target part, the target part being any part of the user's face;
Step S22: query a preset skin data-to-beauty instrument operating parameter ratio table according to the target skin data, and obtain from the table the operating parameter ratio matched with the target skin data;
Step S23: multiply the operating parameter ratios matched with the target skin data by the user's corresponding base operating parameters, respectively, to obtain the beauty instrument operating parameters corresponding to the target part, wherein the base operating parameters include a base vibration duration and/or a base vibration frequency.
In this embodiment, the facial skin measurement result contains the position information of all features in the image and the skin data reflecting their skin states, so after the regions have been divided, the skin data reflecting the skin state of each region can be obtained from the position information of that region. For example, for the periocular region, the eye bag, lying-silkworm, dark circle, tear trough and crow's-feet data within its position range can be obtained from the region's position range in the image; these are the target skin data of the periocular region. Similarly, for an acne part, the acne data within its position range can be obtained from the part's position range in the image as the target skin data. When configuring the beauty instrument operating parameters for a certain part of the user's face, that part is taken as the target part, and the target skin data reflecting its skin state are acquired based on its position.
In this embodiment, two correspondence tables are established in advance for the two operating parameters of the beauty instrument: a correspondence table between skin data and the vibration duration ratio, and a correspondence table between skin data and the vibration frequency ratio; together they form the skin data-to-beauty instrument operating parameter ratio table. In both tables the operating parameter ratio (vibration duration ratio or vibration frequency ratio) is expressed as a percentage. Illustratively, for each of the skin measurement indices contained in the skin measurement result data, several score intervals measuring the skin state under that index are preset, and each interval is assigned a percentage value, thereby establishing the correspondence between the skin data of that index and the beauty instrument operating parameter ratio.
In the preset skin data-to-operating parameter ratio table of this embodiment, a correspondence between skin data and operating parameter ratio can be configured for each skin measurement index contained in the measurement result data: skin tone data correspond to one set of operating parameter ratios, moisture data to another, and spot data, wrinkle data, acne data and so on each to their own. After the target skin data of the target part have been obtained, the correspondence for the relevant index is queried with those data to obtain the corresponding percentage value, and multiplying that percentage by the base operating parameter gives the operating parameter that currently needs to be configured for the target part. The base operating parameters include a base vibration duration and/or a base vibration frequency. It can be understood that, if the target part contains several pieces of target skin data, the percentage value for each is obtained and the lowest of them is multiplied by the base operating parameter to obtain the operating parameter for the target part. For example, if the target skin data of a part include wrinkle data and acne data, and the wrinkle correspondence yields 120% while the acne correspondence yields 70%, the 70% value is selected and multiplied by the base operating parameter to obtain the operating parameter currently required for that part.
For example, in the correspondence between acne data and operating parameter ratio, the acne data are divided into five grades from light to severe, each grade corresponding to one ratio: grade one corresponds to 100%, grade two to 80%, grade three to 60%, grade four to 40% and grade five to 20%. If the obtained target skin data show that the acne severity of the user's part is grade three, the corresponding operating parameter ratio obtained from the table is 60%, and multiplying 60% by the base operating parameter gives the operating parameter currently required for that part.
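A minimal sketch of the per-part configuration: look up a ratio for each kind of target skin data, keep the lowest ratio, and multiply it by the base parameters. The acne grades follow the example above; the wrinkle table and base values are assumed for illustration:

```python
# Sketch of configuring per-region operating parameters from the ratio tables above.

ACNE_RATIO_BY_GRADE = {1: 1.00, 2: 0.80, 3: 0.60, 4: 0.40, 5: 0.20}
WRINKLE_RATIO_BY_GRADE = {1: 1.20, 2: 1.10, 3: 1.00}          # hypothetical values

def ratio_for(feature_kind: str, grade: int) -> float:
    table = {"acne": ACNE_RATIO_BY_GRADE, "wrinkle": WRINKLE_RATIO_BY_GRADE}[feature_kind]
    return table[grade]

def configure_region(target_skin_data: dict, base_duration_s: float, base_frequency_hz: float):
    """target_skin_data maps feature kind -> severity grade for one face region."""
    ratios = [ratio_for(kind, grade) for kind, grade in target_skin_data.items()]
    ratio = min(ratios)                    # the lower percentage wins when data conflict
    return {"vibration_duration_s": base_duration_s * ratio,
            "vibration_frequency_hz": base_frequency_hz * ratio}

# Mirrors the example above: the lower of the two ratios (here 60% for grade-3 acne
# versus 120% for grade-1 wrinkles) is applied to the base parameters.
print(configure_region({"wrinkle": 1, "acne": 3}, base_duration_s=180, base_frequency_hz=100))
```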
In this embodiment, the base vibration duration and base vibration frequency may be obtained from the skin sensitivity of the user's whole face. Illustratively, skin sensitivity is divided into several levels, and a corresponding base vibration duration and base vibration frequency are set for each level, so that the user's base values can be obtained by judging the user's skin sensitivity.
In some embodiments of the present application, the beauty instrument operating parameters may further include illumination parameters. After the target skin data reflecting the skin state of each part have been obtained, the illumination parameters required by the target part may be determined from its target skin data, the illumination parameters including an illumination wavelength and an illumination duration. The beauty instrument provides three kinds of light: blue, yellow and red. Blue light, with a wavelength of 460 nm, is associated with acne data; it helps repair rough skin, makes the skin finer, and can inhibit acne regrowth, that is, it helps eliminate acne. Yellow light, with a wavelength of 590 nm, is associated with skin tone data; it enhances lymphatic flow, promotes skin cell repair, activates skin tissue, restores brightness and makes the skin tender and translucent. Red light, with a wavelength of 640 nm, is associated with pigmented spot and wrinkle data; it promotes collagen generation, addresses wrinkles, dull pigment, spots and similar problems, relieves skin fatigue and restores elasticity and lustre. Accordingly, if the target skin data of the target part contain acne data, the required illumination is determined to be blue light, that is, the wavelength parameter corresponding to blue light is configured for that part; if the target skin data contain pigmented spot and fine line data, the required illumination is red light, and the corresponding wavelength parameter is configured. Whether the target part is irradiated with yellow light is determined by judging whether the skin tone value in its target skin data exceeds a threshold; if it does, yellow light is used, and the yellow light duration is determined from the part's specific skin tone value, with darker skin configured a longer duration. In this embodiment, choosing different light for the massage according to the skin state of each part effectively promotes skin repair and maintenance and improves the beauty effect.
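A minimal sketch of the light-selection rules: only the wavelengths (460 nm, 590 nm, 640 nm) and their associations come from the description above; the skin tone threshold and duration scaling are assumed:

```python
# Sketch of the illumination configuration: blue 460 nm for acne, red 640 nm for
# pigmented spots / fine lines, yellow 590 nm when the skin tone value exceeds a
# threshold, with a longer yellow duration for darker skin.

SKIN_TONE_THRESHOLD = 60        # hypothetical; higher value = darker skin tone

def illumination_for(target_skin_data: dict):
    plan = []
    if "acne" in target_skin_data:
        plan.append({"wavelength_nm": 460, "duration_s": 120})
    if "pigmented_spot" in target_skin_data or "fine_line" in target_skin_data:
        plan.append({"wavelength_nm": 640, "duration_s": 120})
    tone = target_skin_data.get("skin_tone_value")
    if tone is not None and tone > SKIN_TONE_THRESHOLD:
        # Darker tone -> longer yellow-light duration (simple linear scaling).
        plan.append({"wavelength_nm": 590,
                     "duration_s": 60 + 3 * (tone - SKIN_TONE_THRESHOLD)})
    return plan

print(illumination_for({"acne": 3, "skin_tone_value": 75}))
```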
Step S14: and controlling the beauty instrument to respectively adopt the beauty instrument operation parameters corresponding to each part to execute beauty operation at each part of the face of the user.
In this embodiment, the modes for controlling the beauty instrument to perform the beauty operation include, but are not limited to, a manual mode, a semi-automatic mode, and a fully-automatic mode. Wherein:
First, in the manual mode, the position of each part on the user's face can be identified from the image of the user's face, and the beauty instrument operating parameters corresponding to each part are labelled onto the image at those positions to generate an operation e-book, so that while using the beauty instrument the user performs the beauty operation on each part of the face with the corresponding operating parameters according to the e-book.
For example, the operation e-book is generated from the image of the user's face: the position regions of the various parts are marked in the image, and the operating parameter information corresponding to each part, such as vibration duration, vibration frequency, illumination wavelength and illumination duration, is annotated within its position region. When using the beauty instrument, the user can then manually adjust its operating parameters by referring to the parameter information annotated for each part in the e-book.
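For illustration, the operation e-book can be thought of as a set of per-part annotations attached to the face image; the region boxes, parameter values and helper names below are assumptions:

```python
# Minimal sketch of the "operation e-book": the face image is annotated, per region,
# with the configured parameters so the user can adjust the instrument manually.

regions = {
    "forehead":   {"box": (200, 80, 600, 200),
                   "params": {"vibration_duration_s": 180, "vibration_frequency_hz": 90,
                              "wavelength_nm": 640}},
    "periocular": {"box": (250, 220, 550, 300),
                   "params": {"vibration_duration_s": 120, "vibration_frequency_hz": 60,
                              "wavelength_nm": 590}},
}

def build_ebook(regions: dict):
    """Return a list of annotation entries: where to draw each label and what it says."""
    entries = []
    for name, info in regions.items():
        x0, y0, x1, y1 = info["box"]
        label = ", ".join(f"{k}={v}" for k, v in info["params"].items())
        entries.append({"region": name, "anchor": ((x0 + x1) // 2, y0), "label": label})
    return entries

for entry in build_ebook(regions):
    print(entry)
```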
Second, in the semi-automatic mode, the beauty instrument operating parameters corresponding to the parts can be sorted by part, and a part-by-part usage order corresponding to that sorting is generated, so that while the user uses the beauty instrument in that order, the instrument is controlled to adjust its operating parameters in the same order and perform the beauty operation.
Illustratively, suppose that according to the user's facial skin measurement result the face is divided into five parts — periocular, forehead, cheek, nose and chin — and the usage order of the parts is, for example, forehead, periocular, nose, cheek, chin. When the user performs the beauty operation with the instrument, the instrument adjusts its operating parameters part by part in this order, the switching time between two parts being the end of the previous part's vibration duration. For example, if the vibration duration of the forehead is 3 minutes and the user starts the beauty operation at 06:23, the beauty instrument switches to the operating parameters configured for the periocular part at 06:26.
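A sketch of the semi-automatic schedule, built by accumulating each part's vibration duration from the start time; the part order and durations are assumed, and the 06:23 start mirrors the example above:

```python
# Sketch of the semi-automatic mode: the parts are ordered, and the parameter switch
# for the next part is scheduled at the end of the previous part's vibration duration.
from datetime import datetime, timedelta

PART_ORDER = ["forehead", "periocular", "nose", "cheek", "chin"]
DURATION_S = {"forehead": 180, "periocular": 120, "nose": 90, "cheek": 150, "chin": 90}

def build_schedule(start: datetime):
    schedule, t = [], start
    for part in PART_ORDER:
        schedule.append((t.strftime("%H:%M"), part))
        t += timedelta(seconds=DURATION_S[part])     # switch when this part's vibration ends
    return schedule

print(build_schedule(datetime(2021, 5, 10, 6, 23)))
# -> forehead at 06:23, periocular at 06:26, ...
```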
Third, in the fully automatic mode, a position image representing the current position of the beauty instrument can be acquired in real time to determine that position, and according to the current position the instrument is controlled to adjust its current operating parameters to the beauty instrument operating parameters corresponding to that position and perform the beauty operation.
Illustratively, a communication connection is established between the intelligent beauty mirror and the beauty instrument. The mirror acquires, through real-time shooting by its camera, a position image representing the current position of the beauty instrument in order to determine where on the user's face the instrument is; alternatively, the position image is sent to the beauty instrument so that the instrument itself determines its position on the user's face. Specifically, when determining the instrument's position on the face, all viewing angles of the beauty instrument can be recorded in the system by modelling in advance. When the user performs the beauty operation on the face, the camera first extracts facial key points and then compares them with the position image to judge which part of the user's face the instrument is currently on. The instrument then automatically adjusts its operating parameters to those corresponding to that part and performs the beauty operation with the adjusted parameters.
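A sketch of the fully automatic mode as a lookup from the detected part to its configured parameters; the detector below is a stub standing in for the key-point comparison described above, and all names and values are assumptions:

```python
# Sketch of the fully automatic mode: once the instrument's current face part has been
# identified from the camera image, the operating parameters for that part are applied.

PARAMS_BY_REGION = {
    "forehead":   {"vibration_duration_s": 180, "vibration_frequency_hz": 90},
    "periocular": {"vibration_duration_s": 120, "vibration_frequency_hz": 60},
}

def detect_current_region(frame) -> str:
    # Placeholder for the key-point comparison step; always reports the forehead here.
    return "forehead"

def update_instrument(frame, apply_params):
    region = detect_current_region(frame)
    apply_params(PARAMS_BY_REGION[region])   # push the region's parameters to the instrument
    return region

print(update_instrument(frame=None, apply_params=lambda p: None))
```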
The control method of the beauty instrument provided by this embodiment determines the user's current skin state from the skin data in the facial skin measurement result and determines the corresponding beauty instrument operating parameters separately for different parts of the user's face, so that the beauty operation is performed on the face region by region. The operating parameters configured in this way closely match the user's skin state, which reduces the risk of skin injury from inappropriate skin-care operation and improves the beauty effect.
In some embodiments of the present application, please refer to fig. 3. Fig. 3 is a flowchart of a method for configuring the vibration duration parameter of the beauty instrument in the control method according to the embodiment of the present application, detailed as follows:
Step S31: acquire the current time at which the user is using the beauty instrument;
Step S32: traverse the time-to-vibration duration threshold correspondence table according to the current time, and obtain from the table a first vibration duration threshold matched with the current time as the user's base vibration duration;
Step S33: multiply the vibration duration ratio matched with the target skin data by the user's base vibration duration to obtain the beauty instrument vibration duration corresponding to the target part.
In this embodiment, the current time at which the user uses the beauty instrument is the time at which the user performs the beauty operation with the instrument. A configuration rule for the vibration duration is preset on the time dimension: illustratively, the day is divided into several time periods, each with a corresponding vibration duration threshold, to build the time-to-vibration duration threshold correspondence table. For example, the day may be divided into 6:00-10:00, 10:00-20:00, and 20:00 to 6:00 of the next day; the vibration duration threshold for the 6:00-10:00 period is configured as A1, that for the 10:00-20:00 period as B1, and that for the 20:00 to next-day 6:00 period as C1. When configuring these thresholds, considering that in the morning the user is usually in a hurry to go to work and the vibration duration should be reduced appropriately, while in the evening the user has more rest time and the duration can be increased appropriately, the thresholds are configured such that A1 < B1 < C1. Therefore, after the current time of using the beauty instrument is obtained, the time period it falls into can be determined by traversing the correspondence table, and the first vibration duration threshold (A1, B1 or C1) matched with the current time is obtained from the table; this first threshold is the base vibration duration for the user's beauty operation.
For example, assume that the user is currently using the beauty instrument at 6:50. The current time falls within the 6:00-10:00 period, so the first vibration duration threshold matched with the current time is obtained from the time-vibration duration threshold correspondence table as A1, that is, A1 is taken as the basic vibration duration of the user's beauty operation.
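The lookup described above can be illustrated with a short sketch. The period boundaries, the concrete values behind A1, B1 and C1, and the helper name base_vibration_duration are assumptions made for illustration only; only the ordering A1 < B1 < C1 follows the description.

```python
from datetime import time

# Hypothetical time-vibration duration threshold table (minutes); the values
# A1 < B1 < C1 follow the example above but are assumed, not specified.
A1, B1, C1 = 3, 5, 8
TIME_THRESHOLD_TABLE = [
    (time(6, 0), time(10, 0), A1),   # morning: shorter vibration
    (time(10, 0), time(20, 0), B1),  # daytime
    (time(20, 0), time(6, 0), C1),   # evening to next-day 6:00: longer vibration
]

def base_vibration_duration(now: time) -> int:
    """Traverse the table and return the first vibration duration threshold
    matched with the time at which the user is currently using the beauty instrument."""
    for start, end, threshold in TIME_THRESHOLD_TABLE:
        if start <= end:
            if start <= now < end:
                return threshold
        else:  # period wraps past midnight
            if now >= start or now < end:
                return threshold
    raise ValueError("time not covered by the correspondence table")

print(base_vibration_duration(time(6, 50)))  # -> A1 (3), matching the 6:50 example
```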
In this embodiment, after the basic vibration duration of the user is obtained, a skin data-vibration duration parameter ratio correspondence table is queried according to the target skin data of the target part to obtain the vibration duration parameter ratio of the target part, and the obtained ratio is multiplied by the basic vibration duration of the user to obtain the vibration duration of the beauty instrument corresponding to the target part.
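Continuing the sketch above (and reusing base_vibration_duration from it), the vibration duration of the target part is the product of an assumed skin-data-dependent ratio and the basic vibration duration; the skin-data keys and ratio values are illustrative.

```python
# Hypothetical skin data -> vibration duration parameter ratio table (assumed values).
DURATION_RATIO_TABLE = {"dry": 1.2, "normal": 1.0, "oily": 0.9, "sensitive": 0.8}

def vibration_duration(target_skin_data: str, now: time) -> float:
    """Vibration duration of the beauty instrument for the target part:
    (vibration duration parameter ratio) x (basic vibration duration)."""
    return DURATION_RATIO_TABLE[target_skin_data] * base_vibration_duration(now)

print(vibration_duration("dry", time(6, 50)))  # -> 1.2 * A1 = 3.6
```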
It can be understood that, in this embodiment, the time-vibration duration threshold correspondence table may also be set according to skin sensitivity. For example, a different time-vibration duration threshold correspondence table is set for each skin sensitivity level, so that both the time at which the user performs the beauty operation with the beauty instrument and the user's current skin sensitivity are taken into account when determining the basic vibration duration.
In this embodiment, different basic vibration durations can be set according to the time at which the user uses the beauty instrument, so that an appropriate vibration duration is configured for the user and the user experience is enhanced.
In some embodiments of the present application, please refer to fig. 4, and fig. 4 is a flowchart illustrating another method for determining the basic vibration duration of the user in the control method of the cosmetic apparatus according to the embodiment of the present application. The details are as follows:
step S41: acquiring the user's current care product application amount value;
step S42: calculating the ratio between the user's current care product application amount value and a standard application amount, and adjusting the first vibration duration threshold according to this ratio to obtain a second vibration duration threshold matched with the user's current care product application amount value as the basic vibration duration of the user.
In this embodiment, the intelligent beauty mirror is configured with a skin product usage calculation function, so that the amount of care product required by the user in the current skin state can be calculated from that state, and the skin product reservoir built into the intelligent beauty mirror is triggered to dispense the corresponding amount; the dispensed amount is backed up and recorded in the intelligent beauty mirror each time. The built-in reservoir stores general skin care products, and the massage is used to promote the introduction and absorption of the care product; therefore, in this embodiment, before the massage, the amount of care product required in the user's skin state has already been calculated and dispensed, so the user's current care product application amount value can be read locally from the intelligent beauty mirror. The basic vibration duration of the user can then be adjusted according to this current application amount value, which effectively helps the user promote the introduction and absorption of the care product. For example, a corresponding standard application amount is preset for each vibration duration threshold configured per time period; the standard application amount is the amount of care product whose absorption can be assisted under that vibration duration threshold, and it can be determined once the first vibration duration threshold has been obtained from the time at which the user is currently using the beauty instrument. In this embodiment, the vibration duration is positively correlated with the amount of care product whose absorption can be assisted, that is, the longer the vibration duration, the larger that amount. Therefore, after the user's current care product application amount value is obtained, the ratio between that value and the standard application amount is calculated, and the first vibration duration threshold is adjusted according to the proportional relation that this ratio equals the ratio between the vibration duration actually required by the user and the standard vibration duration, so as to obtain a second vibration duration threshold matched with the user's current care product application amount value as the basic vibration duration of the user. Specifically, the second vibration duration threshold is calculated as: standard application amount / user's current application amount value = first vibration duration threshold / second vibration duration threshold.
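A minimal sketch of this adjustment is given below; the function name and the example amounts are assumptions, and only the proportion standard amount / current amount = first threshold / second threshold comes from the description.

```python
def second_vibration_duration_threshold(first_threshold: float,
                                        current_amount: float,
                                        standard_amount: float) -> float:
    """Adjust the first threshold so that
    standard_amount / current_amount == first_threshold / second_threshold."""
    return first_threshold * current_amount / standard_amount

# A user who applied 1.5 units against a standard of 1.0 gets a 1.5x longer base duration.
print(second_vibration_duration_threshold(first_threshold=3.0,
                                          current_amount=1.5,
                                          standard_amount=1.0))  # -> 4.5
```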
In some embodiments of the present application, please refer to fig. 5, and fig. 5 is a flowchart illustrating a method for configuring a vibration frequency parameter of a cosmetic apparatus in a control method of the cosmetic apparatus according to an embodiment of the present application. The details are as follows:
step S51: acquiring component data of the care product currently used by the user;
step S52: traversing a care product component-vibration frequency threshold correspondence table according to the component data of the care product currently used by the user, and acquiring from the table a vibration frequency threshold matched with that component data as the basic vibration frequency of the user;
step S53: and multiplying the vibration frequency parameter matching value matched with the target skin data by the basic vibration frequency of the user to obtain the vibration frequency of the beauty instrument corresponding to the target part.
In this embodiment, when the skin product reservoir built into the intelligent beauty mirror stores a care product, the information of the care product, including its efficacy information, is recorded locally in the intelligent beauty mirror, so that when the component data of the care product currently used by the user needs to be acquired, it can be read directly from the local intelligent beauty mirror. A configuration rule for the vibration frequency of the beauty instrument is preset based on the care product component dimension. For example, a standard vibration frequency is preset, and the care product components are divided into six efficacy categories such as whitening, moisturizing, spot lightening, anti-aging, stability maintenance (skin soothing) and acne removal. A corresponding vibration frequency ratio is configured for each efficacy category, the ratio configured for each category is multiplied by the standard vibration frequency to obtain the corresponding vibration frequency threshold, and each category is mapped to its vibration frequency threshold to obtain the care product component-vibration frequency threshold correspondence table. In this table, the vibration frequency ratio configured for the stability-maintaining components is less than 1, so the corresponding vibration frequency threshold is the standard vibration frequency multiplied by a ratio less than 1; the vibration frequency is thus reduced when helping the user soothe the skin, which enhances the soothing effect of the beauty operation. For the anti-aging components, the configured vibration frequency ratio is greater than 1, so the corresponding vibration frequency threshold is the standard vibration frequency multiplied by a ratio greater than 1. For the four categories of whitening, moisturizing, spot lightening and acne removal, the standard vibration frequency is used directly as the corresponding vibration frequency threshold. Thus, after the component data of the care product currently used by the user is obtained, the care product component-vibration frequency threshold correspondence table is traversed according to that component data, and the vibration frequency threshold matched with it is obtained from the table as the basic vibration frequency of the user.
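Construction of such a care product component-vibration frequency threshold correspondence table might look as follows; the standard frequency of 100 Hz and the concrete ratio values are assumptions, chosen only so that the soothing ratio is below 1 and the anti-aging ratio is above 1, as the description requires.

```python
STANDARD_FREQUENCY_HZ = 100.0  # assumed standard vibration frequency

# Hypothetical vibration frequency ratios per efficacy category.
FREQUENCY_RATIOS = {
    "whitening": 1.0,
    "moisturizing": 1.0,
    "spot_lightening": 1.0,
    "acne_removal": 1.0,
    "soothing": 0.8,     # stability maintenance: ratio < 1 lowers the frequency
    "anti_aging": 1.2,   # ratio > 1 raises the frequency
}

# Care product component -> vibration frequency threshold correspondence table.
COMPONENT_FREQUENCY_TABLE = {
    category: STANDARD_FREQUENCY_HZ * ratio
    for category, ratio in FREQUENCY_RATIOS.items()
}

def base_vibration_frequency(component_category: str) -> float:
    """Basic vibration frequency matched with the care product currently used."""
    return COMPONENT_FREQUENCY_TABLE[component_category]
```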
In this embodiment, after the basic vibration frequency of the user is obtained, a skin data-vibration frequency parameter ratio correspondence table is queried according to the target skin data of the target part to obtain the vibration frequency parameter ratio of the target part, and the obtained ratio is multiplied by the basic vibration frequency of the user to obtain the vibration frequency of the beauty instrument corresponding to the target part.
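Reusing base_vibration_frequency from the sketch above, the vibration frequency of the target part is then an assumed skin-data ratio multiplied by the user's basic vibration frequency:

```python
# Hypothetical skin data -> vibration frequency parameter ratio table (assumed values).
FREQUENCY_RATIO_TABLE = {"dry": 0.9, "normal": 1.0, "oily": 1.1, "sensitive": 0.8}

def vibration_frequency(target_skin_data: str, component_category: str) -> float:
    """Vibration frequency of the beauty instrument for the target part:
    (vibration frequency parameter ratio) x (basic vibration frequency)."""
    return FREQUENCY_RATIO_TABLE[target_skin_data] * base_vibration_frequency(component_category)

print(vibration_frequency("sensitive", "soothing"))  # -> 0.8 * 80.0 = 64.0 Hz
```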
The embodiment can effectively enhance the beauty effect and improve the user experience by configuring different vibration frequencies for different components of the care products.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution, and the order of execution of the processes should be determined by their functions and inherent logic, and should not limit the implementation processes of the embodiments of the present application.
In some embodiments of the present application, please refer to fig. 6, fig. 6 is a schematic structural diagram of a control device of a beauty instrument according to an embodiment of the present application, which is detailed as follows:
The control device of the beauty instrument comprises: a skin measurement result acquisition module 61, a skin state determination module 62, a beauty instrument operation parameter configuration module 63 and a beauty operation control module 64. The skin measurement result acquisition module 61 is configured to acquire a face skin measurement result generated based on the image of the user's face. The skin state determination module 62 is configured to acquire skin data corresponding to different parts of the user's face according to the face skin measurement result, where one part corresponds to one position region of the user's face. The beauty instrument operation parameter configuration module 63 is configured to configure the beauty instrument operation parameters corresponding to each part according to the skin data corresponding to that part. The beauty operation control module 64 is configured to control the beauty instrument to perform the beauty operation at each part of the user's face using the beauty instrument operation parameters corresponding to that part.
The control device of the beauty instrument corresponds one-to-one with the control method of the beauty instrument described above.
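As a rough sketch under the above division, the four modules could be grouped as below; every class, method and parameter name is hypothetical, and the parameter computation is reduced to assumed constants rather than the tables of the earlier embodiments.

```python
from typing import Any, Dict

class BeautyInstrumentControlDevice:
    """Illustrative grouping of the four modules 61-64 described above."""

    def acquire_skin_measurement(self, face_image: Any) -> Dict[str, Any]:
        """Module 61: face skin measurement result generated from the user's face image."""
        raise NotImplementedError  # e.g. delegate to a skin-analysis model

    def determine_skin_state(self, measurement: Dict[str, Any]) -> Dict[str, str]:
        """Module 62: skin data per face part; one part per position region of the face."""
        raise NotImplementedError

    def configure_parameters(self, skin_by_part: Dict[str, str]) -> Dict[str, Dict[str, float]]:
        """Module 63: per-part operation parameters, each obtained as
        (matching value for the part's skin data) x (user's basic value)."""
        basic_duration, basic_frequency = 5.0, 100.0       # assumed basic values
        ratios = {"dry": 1.2, "normal": 1.0, "oily": 0.9}  # assumed matching values
        return {part: {"duration": ratios.get(skin, 1.0) * basic_duration,
                       "frequency": ratios.get(skin, 1.0) * basic_frequency}
                for part, skin in skin_by_part.items()}

    def control_operation(self, face_image: Any, instrument: Any) -> None:
        """Module 64: control the instrument to treat each face part with its own parameters."""
        skin_by_part = self.determine_skin_state(self.acquire_skin_measurement(face_image))
        for part, params in self.configure_parameters(skin_by_part).items():
            instrument.apply(part, params)  # hypothetical instrument API
```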
In some embodiments of the present application, please refer to fig. 7, and fig. 7 is a schematic diagram of an electronic device for implementing the control method of the beauty instrument according to an embodiment of the present application. As shown in fig. 7, the electronic device 7 of this embodiment includes: a processor 71, a memory 72, and a computer program 73, such as a control program of the beauty instrument, stored in the memory 72 and executable on the processor 71. The processor 71 implements the steps in the above-described embodiments of the control method of the beauty instrument when executing the computer program 73. Alternatively, the processor 71 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 73.
Illustratively, the computer program 73 may be partitioned into one or more modules/units, which are stored in the memory 72 and executed by the processor 71 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 73 in the electronic device 7. For example, the computer program 73 may be divided into:
the skin measurement result acquisition module is used for acquiring a face skin measurement result generated based on the image of the face of the user;
the skin state determining module is used for acquiring skin data corresponding to different parts of the face of the user according to the face skin measuring result, wherein one part corresponds to a position area of the face of the user;
the beauty instrument operating parameter configuration module is used for configuring the beauty instrument operating parameters corresponding to each part according to the skin data corresponding to each part;
and the beauty operation control module is used for controlling the beauty instrument to perform the beauty operation at each part of the face of the user by respectively adopting the beauty instrument operation parameters corresponding to each part.
The electronic device may include, but is not limited to, the processor 71 and the memory 72. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the electronic device 7 and does not constitute a limitation of the electronic device 7, which may include more or fewer components than those shown, combine certain components, or have different components; for example, the electronic device may also include input and output devices, network access devices, buses, etc.
The processor 71 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor, etc.
The memory 72 may be an internal storage unit of the electronic device 7, such as a hard disk or a memory of the electronic device 7. The memory 72 may also be an external storage device of the electronic device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 7. Further, the memory 72 may also include both an internal storage unit and an external storage device of the electronic device 7. The memory 72 is used for storing the computer programs and other programs and data required by the electronic device. The memory 72 may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units and modules is only used for illustration, and in practical applications, the above function distribution may be performed by different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunication signals according to legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. A control device of a beauty instrument is characterized by comprising:
the skin measuring result acquisition module is used for acquiring a face skin measuring result generated based on the image of the face of the user;
the skin state determining module is used for acquiring skin data corresponding to different parts of the face of the user according to the face skin measuring result, wherein one part corresponds to a position area of the face of the user;
the beauty instrument operating parameter configuration module is used for configuring the beauty instrument operation parameters corresponding to each part according to the skin data corresponding to each part, and the process of configuring the beauty instrument operation parameters corresponding to each part is as follows: acquiring target skin data reflecting the skin condition of a target part from the face skin measurement result, querying a preset skin data-beauty instrument operation parameter matching relation table according to the target skin data, acquiring from the table a beauty instrument operation parameter matching value matched with the target skin data, and correspondingly multiplying the beauty instrument operation parameter matching value matched with the target skin data by a basic operation parameter of the user to obtain the beauty instrument operation parameter corresponding to the target part, wherein the beauty instrument operation parameter comprises a vibration duration and/or a vibration frequency; if the beauty instrument operation parameter is the vibration duration, acquiring the time at which the user is currently using the beauty instrument, traversing a time-vibration duration threshold correspondence table according to that time, acquiring from the table a first vibration duration threshold matched with the time at which the user is currently using the beauty instrument as the basic vibration duration of the user, and multiplying the vibration duration parameter matching value matched with the target skin data by the basic vibration duration of the user to obtain the vibration duration of the beauty instrument corresponding to the target part;
and the beauty operation control module is used for controlling the beauty instrument to perform the beauty operation at each part of the face of the user by respectively adopting the beauty instrument operation parameters corresponding to each part.
2. The control device of the beauty instrument of claim 1, wherein the beauty instrument operating parameter configuration module is further configured to:
acquiring the user's current care product application amount value;
and calculating the ratio between the user's current care product application amount value and a standard application amount, and adjusting the first vibration duration threshold according to this ratio to obtain a second vibration duration threshold matched with the user's current care product application amount value as the basic vibration duration of the user.
3. The control device of the beauty instrument of claim 1, wherein if the beauty instrument operation parameter is the vibration frequency, the beauty instrument operating parameter configuration module is further configured to:
acquiring component data of the care product currently used by the user;
traversing a care product component-vibration frequency threshold correspondence table according to the component data of the care product currently used by the user, and acquiring from the table a vibration frequency threshold matched with that component data as the basic vibration frequency of the user;
and multiplying the vibration frequency parameter matching value matched with the target skin data by the basic vibration frequency of the user to obtain the vibration frequency of the beauty instrument corresponding to the target part.
4. The control device of the beauty instrument of claim 1, wherein the beauty instrument operation parameters further comprise illumination parameters, and the beauty instrument operating parameter configuration module is further configured to:
and determining the illumination parameters of the beauty instrument when the beauty instrument performs the beauty operation on the target part according to the target skin data, wherein the illumination parameters comprise an illumination wavelength and an illumination time length.
5. The control device of the beauty instrument of claim 1, wherein the beauty operation control module is further configured to:
identifying the position of each part on the face of a user based on the image of the face of the user, and labeling the beauty instrument operation parameters corresponding to each part into the image of the face of the user based on the position to generate an operation electronic book, so that the user performs beauty operation by adopting the beauty instrument operation parameters corresponding to each part on the face of the user respectively according to the operation electronic book in the process of using the beauty instrument; and/or
Sequencing the beauty instrument operation parameters corresponding to each part according to the part, generating a beauty instrument part use sequence corresponding to the sequence of the beauty instrument operation parameters corresponding to each part, and controlling the beauty instrument to adjust the beauty instrument operation parameters according to the sequence of the beauty instrument operation parameters corresponding to each part and execute beauty operation in the process that the user uses the beauty instrument according to the beauty instrument part use sequence; and/or
The method comprises the steps of acquiring a position image representing the current position of the beauty instrument in real time to determine the current position of the beauty instrument, controlling the beauty instrument to adjust the current operating parameters of the beauty instrument to the operating parameters of the beauty instrument corresponding to the current position according to the current position of the beauty instrument, and executing beauty operation.
6. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the functions of the control device of the beauty instrument according to any one of claims 1 to 5 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed by a processor, implements the functions of the control device of the beauty instrument according to any one of claims 1 to 5.
CN202110507390.1A 2021-05-10 2021-05-10 Control method, device and equipment of beauty instrument and storage medium Active CN113397480B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110507390.1A CN113397480B (en) 2021-05-10 2021-05-10 Control method, device and equipment of beauty instrument and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110507390.1A CN113397480B (en) 2021-05-10 2021-05-10 Control method, device and equipment of beauty instrument and storage medium

Publications (2)

Publication Number Publication Date
CN113397480A CN113397480A (en) 2021-09-17
CN113397480B (en) 2023-02-10

Family

ID=77678145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110507390.1A Active CN113397480B (en) 2021-05-10 2021-05-10 Control method, device and equipment of beauty instrument and storage medium

Country Status (1)

Country Link
CN (1) CN113397480B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114305334A (en) * 2021-12-09 2022-04-12 深圳贵之族生科技有限公司 Intelligent beauty method, device, equipment and storage medium
CN114333036B (en) * 2022-01-20 2023-04-07 深圳市宝璐美容科技有限公司 Intelligent beauty control method, device, equipment and storage medium
CN114816567A (en) * 2022-04-12 2022-07-29 林镇清 Beauty parameter adjusting method and device, beauty instrument and storage medium
CN117651261A (en) * 2022-09-29 2024-03-05 广州星际悦动股份有限公司 Output control method, device, equipment and storage medium of beauty equipment
CN116747431A (en) * 2023-05-18 2023-09-15 深圳市宗匠科技有限公司 Method and device for detecting action position of beauty instrument and energy output and beauty instrument

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3847270B2 (en) * 2000-11-09 2006-11-22 審美インターナショナル株式会社 Facial equipment
JP2008029507A (en) * 2006-07-27 2008-02-14 Kao Corp Cosmetic operation assisting tool and cosmetic operation method using the same
JP2009028268A (en) * 2007-07-26 2009-02-12 Panasonic Electric Works Co Ltd Light irradiating beauty instrument
JP2016081441A (en) * 2014-10-22 2016-05-16 パナソニックIpマネジメント株式会社 Beauty care assist device, beauty care assist system, and beauty care assist method
CN106334268B (en) * 2015-07-13 2018-08-31 深圳斯坦普光生物科技有限公司 A kind of intelligent facial beauty instrument based on internet high in the clouds big data
CN106178255A (en) * 2016-07-04 2016-12-07 邓琳 A kind of intelligence facial beauty instrument
CN107754093A (en) * 2016-08-21 2018-03-06 深圳斯坦普光生物科技有限公司 A kind of beauty method and beauty appliance based on LED light biology skin makeup
JP6763753B2 (en) * 2016-11-16 2020-09-30 マクセルホールディングス株式会社 Beauty equipment
CN106975158A (en) * 2017-04-12 2017-07-25 仓勇 A kind of beauty instrument and the method that beautifying skin is carried out using the beauty instrument
CN107680128B (en) * 2017-10-31 2020-03-27 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107798652A (en) * 2017-10-31 2018-03-13 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and electronic equipment
JP7207857B2 (en) * 2018-03-15 2023-01-18 サンスター株式会社 How to provide a facial beauty treatment
CN109603011A (en) * 2018-11-14 2019-04-12 深圳美丽策光生物科技有限公司 A kind of photon surveys skin skin beautifying apparatus and its processing method
JP6704586B1 (en) * 2019-03-28 2020-06-03 ピクシーダストテクノロジーズ株式会社 Ultrasonic beauty device, information processing device, method, program
CN111814520A (en) * 2019-04-12 2020-10-23 虹软科技股份有限公司 Skin type detection method, skin type grade classification method, and skin type detection device
CN110287809B (en) * 2019-06-03 2021-08-24 Oppo广东移动通信有限公司 Image processing method and related product
CN112057735B (en) * 2019-06-10 2023-12-22 创锐思智能有限公司 Beauty method, beauty instrument, terminal and beauty system
CN111513678A (en) * 2020-04-28 2020-08-11 深圳市三七智联科技有限公司 Skin management method and device based on beauty instrument and computer readable storage medium
CN112641436A (en) * 2020-11-05 2021-04-13 西安拾玖岁信息科技有限公司 Cosmetic method and device
CN112546438A (en) * 2020-11-10 2021-03-26 添可智能科技有限公司 Beauty instrument control and working method and equipment

Also Published As

Publication number Publication date
CN113397480A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN113397480B (en) Control method, device and equipment of beauty instrument and storage medium
CN110443747B (en) Image processing method, device, terminal and computer readable storage medium
CN110111245B (en) Image processing method, device, terminal and computer readable storage medium
CN108447017B (en) Face virtual face-lifting method and device
US10643087B2 (en) Systems and methods of biometric analysis to determine a live subject
CN104586364B (en) A kind of skin quality detection system and method
CN107945135B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN113411507B (en) Skin measurement image acquisition method, device, equipment and storage medium
CN107730446B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN107862653B (en) Image display method, image display device, storage medium and electronic equipment
CN107862657A (en) Image processing method, device, computer equipment and computer-readable recording medium
US20120016231A1 (en) System and method for three dimensional cosmetology imaging with structured light
CN107369133B (en) Face image beautifying method and device
CN107862659A (en) Image processing method, device, computer equipment and computer-readable recording medium
CN113610844A (en) Intelligent skin care method, device, equipment and storage medium
CN110008878A (en) A kind of anti-false method of Face datection and the face identification device for having anti-false function
CN103714225A (en) Information system with automatic make-up function and make-up method of information system
CN114305334A (en) Intelligent beauty method, device, equipment and storage medium
CN112686800B (en) Image processing method, device, electronic equipment and storage medium
CN108391356A (en) A kind of Intelligent House Light control system
CN113255802A (en) Intelligent skin tendering system based on infrared laser
CN113610723B (en) Image processing method and related device
CN114972014A (en) Image processing method and device and electronic equipment
CN113377020A (en) Device control method, device and storage medium
JP2005128600A (en) Image processing method and object photographing system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant