CN112546438A - Beauty instrument control and working method and equipment - Google Patents

Beauty instrument control and working method and equipment

Info

Publication number
CN112546438A
Authority
CN
China
Prior art keywords
face
area
information
target
beauty instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011247170.1A
Other languages
Chinese (zh)
Inventor
张敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tineco Intelligent Technology Co Ltd
Original Assignee
Tineco Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tineco Intelligent Technology Co Ltd filed Critical Tineco Intelligent Technology Co Ltd
Priority to CN202011247170.1A
Publication of CN112546438A
Current legal status: Withdrawn

Classifications

    • All classifications fall under section A (Human Necessities), class A61 (Medical or Veterinary Science; Hygiene), subclass A61N (Electrotherapy; Magnetotherapy; Radiation Therapy; Ultrasound Therapy):
    • A61N 1/36: Applying electric currents by contact electrodes; alternating or intermittent currents for stimulation
    • A61N 1/328: Applying electric currents by contact electrodes; alternating or intermittent currents for improving the appearance of the skin, e.g. facial toning or wrinkle treatment
    • A61N 5/0616: Radiation therapy using light; apparatus adapted for a specific treatment; skin treatment other than tanning
    • A61N 5/0624: Radiation therapy using light; apparatus adapted for a specific treatment; eliminating microbes, germs, bacteria on or in the body
    • A61N 7/00: Ultrasound therapy
    • A61N 2005/0651: Light sources for radiation therapy using light; diodes
    • A61N 2005/0662: Radiation therapy using light characterised by the wavelength of light used; visible light
    • A61N 2007/0034: Applications of ultrasound therapy; skin treatment

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Plastic & Reconstructive Surgery (AREA)
  • Biophysics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a beauty instrument control method, a beauty instrument working method and related equipment. In the embodiments, different types of face areas are determined in advance, and correspondences between these face area types and the working parameters of the beauty instrument are established. During the operation of the beauty instrument, the face area where the instrument is currently located is acquired, and when that face area belongs to a preset face area type, the beauty instrument is controlled to maintain it with the working parameters corresponding to that type. The embodiments can thus control the beauty instrument to apply different working parameters to different types of face areas, which helps improve the flexibility of use of the beauty instrument, makes the maintenance effect more satisfactory, improves the user experience and increases user stickiness.

Description

Beauty instrument control and working method and equipment
Technical Field
The application relates to the technical field of electronic equipment, in particular to a beauty instrument control and working method and equipment.
Background
With the development of society, people's living standards keep improving, and more and more people pay attention to skin care. The beauty instrument, being simple to operate, easy to use and clearly effective, has been widely adopted. The beauty functions of existing beauty instruments are increasingly powerful: a user can select a usage mode, such as a red-light skin-tendering mode or a cleansing and skin-renewal mode, through physical keys on the beauty instrument, and then move the instrument back and forth over the face to achieve beauty effects such as skin tendering and cleansing. However, existing beauty instruments are still imperfect and suffer from problems such as unsatisfactory beauty effects and a poor user experience.
Disclosure of Invention
Aspects of the present application provide a beauty instrument control method, a beauty instrument working method and related equipment, so as to improve the flexibility of use of the beauty instrument, make the maintenance effect more satisfactory, improve the user experience and increase user stickiness.
The embodiment of the application provides a cosmetic instrument control method, which comprises the following steps: acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters; determining type information of a target marking area as target type information when the target marking area of the face map is matched with the face area; and controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
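For illustration only, the following Python sketch summarizes these three steps on the terminal side. All names and the data layout are assumptions that do not appear in the application, and the matching rule is passed in as a callable; one concrete overlap-based rule is sketched later in the detailed description.

```python
# Hypothetical sketch of the control method; names and data layout are assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional, Tuple

Box = Tuple[float, float, float, float]          # (x, y, width, height) position range


@dataclass
class MarkedArea:
    type_info: str                               # e.g. "deep_wrinkle", "facial_organ"
    position: Box                                # position range within the face map


@dataclass
class FaceMap:
    marked_areas: List[MarkedArea]


def choose_working_parameters(face_area: Box,
                              face_map: FaceMap,
                              params_by_type: Dict[str, dict],
                              matches: Callable[[Box, Box], bool]) -> Optional[dict]:
    """Step 1 is assumed done (face_area and face_map are given). Step 2: find the
    target marked area whose position matches the current face area and take its
    type information as the target type. Step 3: return the working parameters
    bound to that type, with which the beauty instrument would then be controlled."""
    for area in face_map.marked_areas:
        if matches(face_area, area.position):
            return params_by_type.get(area.type_info)
    return params_by_type.get("unmarked")        # assumed fallback for unmarked areas
```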
The embodiment of the application also provides a cosmetic instrument working method, which is suitable for a cosmetic instrument, and the method comprises the following steps: receiving the corresponding relation between the marking information and the working parameters sent by the server, wherein different marking information corresponds to different types of marking areas in the face map; receiving target marking information, wherein the target marking information is marking information corresponding to a target marking area matched with a face area where the beauty instrument is located currently; determining working parameters corresponding to the target mark information according to the corresponding relation; and maintaining the current face area according to the working parameters corresponding to the target mark information.
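A corresponding instrument-side sketch, again with hypothetical names and message formats, might keep the received correspondence in memory and look up the parameters whenever target marking information arrives:

```python
# Hypothetical instrument-side sketch; message formats and names are assumptions.
from typing import Dict, Optional


class BeautyInstrumentSide:
    def __init__(self) -> None:
        self.params_by_mark: Dict[str, dict] = {}

    def on_correspondence(self, correspondence: Dict[str, dict]) -> None:
        # Received from the server: marking information -> working parameters.
        self.params_by_mark = dict(correspondence)

    def on_target_mark(self, target_mark: str) -> Optional[dict]:
        # Received during operation: marking information of the matched target area.
        params = self.params_by_mark.get(target_mark)
        if params is not None:
            self.apply(params)
        return params

    def apply(self, params: dict) -> None:
        # Placeholder for driving the care modules (micro-current, ultrasound, LED).
        print(f"maintaining current face area with working parameters: {params}")
```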
An embodiment of the present application further provides a terminal device, including: a memory and a processor; a memory for storing a computer program; a processor coupled with the memory for executing the computer program for: acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters; determining type information of a target marking area as target type information when the target marking area of the face map is matched with the face area; and controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
An embodiment of the present application further provides a server, including: a memory and a processor; a memory for storing a computer program; a processor coupled with the memory for executing the computer program for: acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters; in the face image, the area where at least one type of facial features are located is marked by using different marking information and is used as at least one type of marking area, so that a face map is obtained.
An embodiment of the present application further provides a beauty instrument, including a body and a beauty head; a main control board is arranged on the body, and a main control module and a storage module are arranged on the main control board; the storage module is used for storing a computer program; the main control module is used for executing the computer program so as to: receive the correspondence between marking information and working parameters sent by the server, wherein different marking information corresponds to different types of marking areas in the face map; receive target marking information, the target marking information being the marking information corresponding to the target marking area matched with the face area where the beauty instrument is currently located; determine the working parameters corresponding to the target marking information according to the correspondence; and control the beauty head to maintain the current face area according to the working parameters corresponding to the target marking information.
In the embodiments of the present application, different types of face areas are determined in advance, and correspondences between these face area types and the working parameters of the beauty instrument are established. During the operation of the beauty instrument, the face area where the instrument is currently located is acquired, and when that face area belongs to a preset face area type, the beauty instrument is controlled to maintain it with the working parameters corresponding to that type. The embodiments of the present application can therefore control the beauty instrument to apply different working parameters to different types of face areas, which helps improve the flexibility of use of the beauty instrument, makes the maintenance effect more satisfactory, improves the user experience and increases user stickiness.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a is a schematic structural diagram of a control system of a beauty instrument according to an exemplary embodiment of the present application;
fig. 1b is a schematic diagram of a face map generation method according to an exemplary embodiment of the present application;
fig. 1c is a schematic illustration of different types of facial features provided by an exemplary embodiment of the present application;
fig. 2 is a schematic structural view of another cosmetic instrument control system provided in an exemplary embodiment of the present application;
fig. 3a is a schematic flow chart of a cosmetic instrument control method according to an exemplary embodiment of the present application;
fig. 3b is a schematic flow chart of a working method of the beauty instrument according to an exemplary embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application;
fig. 5 is a schematic structural diagram of a server according to an exemplary embodiment of the present application;
fig. 6 is a schematic structural view of a beauty instrument according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1a is a schematic structural diagram of a cosmetic instrument control system according to an exemplary embodiment of the present application. As shown in fig. 1a, the beauty instrument control system 100 includes: a beauty instrument 101 and a terminal device 102.
The beauty instrument 101 is any electronic device having a beauty and/or skin care function, and the configuration of the beauty instrument 101 is not limited in this embodiment. As shown in fig. 1a, one implementation of the cosmetic apparatus 101 may include: a body 101a and a cosmetic head 101 b. The main body 101a is provided with a starting button, a gear/mode adjusting button and a circuit board, and the circuit board is provided with a main control module, a power supply module and other hardware modules; the main control module is used as the brain of the cosmetic apparatus 101 to implement a control logic of the cosmetic apparatus 101, for example, the main control module can control the cosmetic apparatus 101 to be turned on or turned off according to a power-on or power-off instruction sent by a user through a power-on key, and can also control the cosmetic apparatus 101 to operate in different gears or modes according to a gear/mode adjustment instruction sent by the user through a gear/mode adjustment key when the cosmetic apparatus 101 is in a power-on state.
The beauty head 101b is provided with at least one care module having a beauty or skin care function, and may include, for example, at least one of a micro-current module, an ultrasonic module and an LED light module. When the micro-current module is arranged on the beauty head 101b, it can output a low-frequency micro-current that promotes skin metabolism and blood circulation. When the ultrasonic module is arranged on the beauty head 101b, it can generate ultrasonic waves at a certain frequency, and the strong introduction capability of the ultrasonic waves is then used to improve local blood and lymph circulation in the skin and enhance cell permeability. When the LED light module is arranged on the beauty head 101b, it can radiate red, green and/or purple light wave signals outwards, where the red light wave signal can accelerate skin metabolism, the green light wave signal can promote skin photosynthesis and whitening, and the purple light wave signal can help eliminate skin inflammation. Further, a wireless transmission module is mounted on the circuit board of the beauty instrument 101 and is used to establish a wireless connection with other equipment such as a terminal device. The wireless transmission module may be a Wi-Fi module, a Bluetooth module, an infrared module or a Radio Frequency Identification (RFID) module. It should be noted that the above description of the structure and function of the beauty instrument 101 is only exemplary and not limiting.
The terminal device 102 is a terminal device capable of controlling and guiding the use of the beauty instrument 101, and in the present embodiment, the implementation form of the terminal device 102 is not limited, and for example, the terminal device 102 may be, but is not limited to: desktop computers, notebook computers, smart phones, or the like. When the user uses the beauty instrument 101, the terminal device 102 can guide the use of the beauty instrument 101, guide the user to use the beauty instrument 101 more appropriately, and improve the beauty or care effect. In addition, the terminal device 102 can establish a wireless connection with the cosmetic apparatus 101 based on the wireless transmission module of the cosmetic apparatus 101. Further, an Application program (APP) corresponding to the cosmetic instrument 101 is installed on the terminal device 102, and a binding relationship between the cosmetic instrument 101 and the APP is established, so that the user can perform various controls on the cosmetic instrument 101 through the APP, for example, remotely controlling the cosmetic instrument to start, shut down, adjust gears/modes, and the like through the APP.
In this embodiment, the terminal device 102 may obtain a face map. The face map is an indication map that embodies the facial features of a user and includes at least one type of marked area, a marked area being an area marked on the face map; there may be one or more marked areas of each type. Different types of marked areas correspond to different working parameters of the beauty instrument; in other words, when the beauty instrument maintains different types of marked areas, it needs to use different working parameters. In this embodiment, the manner of classifying the marked areas is not limited; for example, regions defined by facial features, such as a forehead region, a cheek region and a chin region, may be marked as different types of marked areas. In an optional embodiment of the present application, the facial features of the human face may first be classified, the face may then be divided into different types of face regions according to the types of facial features, and these face regions may be marked on the face map accordingly to obtain the marked areas in the face map. For example, assuming that facial features are divided into facial organ features, bone features, facial concave-convex features and wrinkle features, a face map can be obtained by image recognition and labeling according to these feature types, and the marked areas may include, but are not limited to: a region containing a designated facial organ, a region containing an obvious skeletal feature, a region containing a facial concave-convex feature, a region containing wrinkles, and the like. This means that when the beauty instrument maintains areas with different types of facial features, it can use different working parameters, which makes the maintenance more targeted and improves the maintenance effect.
In the present embodiment, the types of facial features are not limited and can be set flexibly as required. For example, facial features may be classified into facial organ features, bone features, facial concave-convex features and wrinkle features. As another example, the types of facial features may be divided according to the depth of facial wrinkles and whether a facial organ is included: facial features with a wrinkle depth greater than a set threshold (deep wrinkle features for short) are classified as A1-type facial features, facial features with a wrinkle depth less than the set threshold (shallow wrinkle features for short) as A2-type facial features, facial organ features (such as eyes, nose and mouth) as A3-type facial features, and facial features containing neither wrinkles nor facial organs as A4-type facial features. As yet another example, as shown in fig. 1c, the types of facial features may be classified according to the positions of wrinkles: wrinkles around the forehead, such as forehead lines and glabellar lines, may be classified as B1-type facial features; wrinkles around the eyes, such as crow's feet, tear trough lines and transverse nasal bridge lines, as B2-type facial features; and wrinkles around the mouth, such as nasolabial folds, lip lines and perioral lines, as B3-type facial features. Of course, the types of facial features may also be distinguished according to the texture of wrinkles, the skin condition of the face, and so on. Different classifications of facial features lead to different types of marked areas. For example, the face may be divided into three types of face regions, marked in the face map, according to whether a facial organ is included and whether wrinkles are included: a region containing a facial organ is marked as an M1-type marked area, a face region containing wrinkles as an M2-type marked area, and a region containing neither wrinkles nor organs as an M3-type marked area. As another example, the face may be divided into four types of face regions, marked in the face map, according to the depth of facial wrinkles and whether a facial organ is included: a deep wrinkle region is marked as an N1-type marked area, a shallow wrinkle region as an N2-type marked area, a region containing a facial organ as an N3-type marked area, and a region with neither wrinkles nor facial organs as an N4-type marked area.
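As one possible concretization of the N1-N4 scheme above, the region type could be derived as sketched below; the threshold value, the return labels and the tie-breaking rule are assumptions, not values from the application.

```python
# Illustrative classifier for the N1-N4 marked-area types; all values are assumed.
from typing import Optional

WRINKLE_DEPTH_THRESHOLD = 0.3   # assumed threshold, in whatever unit the analysis reports


def classify_face_region(has_facial_organ: bool,
                         wrinkle_depth: Optional[float]) -> str:
    """Map the detected features of a face region to one of the four region types.
    Regions with both wrinkles and an organ are treated as wrinkle regions here,
    which is a simplification the application does not specify."""
    if wrinkle_depth is not None and wrinkle_depth > WRINKLE_DEPTH_THRESHOLD:
        return "N1"   # deep wrinkle region
    if wrinkle_depth is not None:
        return "N2"   # shallow wrinkle region
    if has_facial_organ:
        return "N3"   # region containing a facial organ (eyes, nose, mouth)
    return "N4"       # region with neither wrinkles nor facial organs
```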
In this embodiment, the embodiment of the terminal device 102 acquiring the face map is not limited, and based on the above, the acquisition manners of the face map include, but are not limited to, the following three types:
Mode A1: the beauty instrument 101 is additionally provided with a camera that can collect feature information of the face area during the operation of the beauty instrument 101. The camera on the beauty instrument 101 may be, but is not limited to, a monocular camera, a binocular camera, an RGB-D camera, or the like. The camera of the beauty instrument 101 may be mounted on the beauty head 101b or on the body 101a, the mounting position being chosen so that feature information of the face area can be collected while the beauty instrument 101 is working. On this basis, after the beauty instrument 101 and the terminal device 102 are connected, the terminal device 102 may prompt the user to traverse the whole face with the beauty instrument 101; during the traversal, the beauty instrument 101 collects feature information of the whole face with its camera and sends it to the terminal device 102. Alternatively, the terminal device 102 may use its own camera to capture a face image of the user, the face image containing the feature information of the whole face. After obtaining the feature information of the whole face, the terminal device 102 may identify at least one type of facial feature from it and mark the area where each type of facial feature is located, obtaining at least one type of marked area and thereby generating a face map. The feature information of the whole face includes, but is not limited to: bone topography, and the position, distance and angle information of the eyes, nose, mouth or wrinkles. In Mode A1, the beauty instrument 101 or the terminal device 102 collects the feature information of the face, and the terminal device 102 generates the face map.
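A minimal sketch of the face-map generation step in Mode A1 is shown below, assuming the feature detection itself is delegated to whatever face-analysis backend the implementer prefers; the `detect_features` callable and its return format are assumptions.

```python
# Hypothetical sketch of face-map generation; the detector is supplied by the caller.
from typing import Callable, Dict, List, Tuple

Box = Tuple[int, int, int, int]                  # (x, y, width, height)


def build_face_map(face_feature_info,
                   detect_features: Callable[..., List[Tuple[str, Box]]]
                   ) -> Dict[str, List[Box]]:
    """detect_features(face_feature_info) is assumed to return a list of
    (type_info, bounding_box) pairs; the areas of each type become that type's
    marked areas in the face map."""
    face_map: Dict[str, List[Box]] = {}
    for type_info, box in detect_features(face_feature_info):
        face_map.setdefault(type_info, []).append(box)   # one type may have several areas
    return face_map
```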
Mode A2: similar to Mode A1, except that the system further comprises a server 103. As shown in fig. 1a, after collecting the feature information of the whole face, the beauty instrument 101 uploads it to the server 103; the server 103 identifies at least one type of facial feature from the feature information of the whole face and marks the area where each type of facial feature is located, obtaining at least one type of marked area and thereby generating a face map. As shown in fig. 1a, the face map is then provided to the terminal device 102. In Mode A2, the beauty instrument 101 transmits the feature information of the face to the server 103, and the terminal device 102 receives the face map delivered by the server 103.
Mode A3: as shown in fig. 1b, the system further comprises a server 103. One way of acquiring the face map is as follows: the camera on the terminal device 102 captures a face image of the user and uploads it to the server 103; the server 103 receives the face image of the user, identifies at least one type of facial feature contained in the face image, and marks the area where each type of facial feature is located, obtaining at least one type of marked area and thereby generating a face map. After obtaining the face map, the server 103 stores it and provides it to the terminal device 102. In Mode A3, the terminal device 102 uploads the face image to the server 103 and receives the face map delivered by the server 103.
In this embodiment, the beauty instrument has its own working parameters, which include, but are not limited to, at least one of the working current, the working frequency or the vibration intensity of the beauty instrument. Moreover, with different values of these working parameters, the beauty or care effect and the user experience may differ. For example, some parts are better maintained with a higher vibration intensity, giving a more satisfactory maintenance effect and a better user experience, while other parts are better maintained with a smaller working current. On this basis, regardless of whether the face map is generated by the server 103 or by the terminal device 102, after the face map is generated, different beauty instrument working parameters can be set for the different types of marked areas in it; that is, a correspondence between the type information of the marked areas in the face map and the beauty instrument working parameters is established, so that for different types of marked areas the beauty instrument can apply different working parameters and thus maintain different parts of the user's face differently.
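Purely as an illustration, the working parameters and the correspondence described here could be represented as follows; the concrete numbers are placeholders, not values from the application.

```python
# Illustrative data model; the numbers are placeholders, not from the application.
from dataclasses import dataclass


@dataclass
class WorkingParameters:
    current_ma: float       # working current (milliamperes assumed)
    frequency_hz: float     # working frequency (hertz assumed)
    vibration_level: int    # vibration intensity on an assumed discrete scale


# Correspondence between marked-area type information and beauty instrument
# working parameters (type labels follow the N1-N4 example above).
PARAMS_BY_TYPE = {
    "N1": WorkingParameters(current_ma=2.0, frequency_hz=100.0, vibration_level=3),
    "N2": WorkingParameters(current_ma=1.5, frequency_hz=100.0, vibration_level=2),
    "N3": WorkingParameters(current_ma=0.8, frequency_hz=80.0, vibration_level=1),
    "N4": WorkingParameters(current_ma=1.0, frequency_hz=80.0, vibration_level=2),
}
```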
In this embodiment, the timing of obtaining the face map is not limited. The terminal device 102 may obtain and store the face map locally in advance and, when the beauty instrument needs to be used for maintenance, directly fetch the locally stored face map. Alternatively, as shown in fig. 1a, when the beauty instrument needs to be used for maintenance, the terminal device 102 may send an acquisition request to the server 103 and receive the face map returned by the server 103 in response. Alternatively, the terminal device 102 may acquire a face image each time the beauty instrument is used for maintenance and generate, in real time, a face map containing at least one type of marked area according to the facial features contained in the face image.
When the user needs to perform maintenance using the beauty instrument, the user moves the beauty instrument 101 on the face and performs maintenance on the skin of the face using the maintenance module on the beauty head 101b of the beauty instrument 101. During the operation of the beauty instrument 101, the terminal device 102 may acquire a face area where the beauty instrument 101 is currently located, where the face area where the beauty instrument 101 is currently located refers to a face area currently covered by the beauty head 101b of the beauty instrument 101. In the present embodiment, the embodiment in which the terminal device 102 acquires the face area where the beauty instrument 101 is currently located is not limited, and the following description will be made by way of example.
In an alternative embodiment, as shown in fig. 1a, the terminal device 102 has a camera; during the operation of the beauty instrument 101, the camera on the terminal device 102 can be directed at the user's face and used to capture a face image containing the working state of the beauty instrument 101. Using image recognition, the working coordinates of the beauty instrument 101 in the face image can be calculated; from the pre-generated face map and these working coordinates, the face area where the beauty instrument is currently located can be determined, i.e. the position range of that face area in the face map is obtained. The camera on the terminal device 102 may be, but is not limited to, a monocular camera, a binocular camera, an RGB-D (Red Green Blue Depth) camera, or the like.
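Once the instrument's working coordinates have been found in the camera image (by whatever detection method is used), mapping them into the face map's coordinate system can be as simple as the normalization sketched below; the bounding-box conventions and names are assumptions.

```python
# Hypothetical coordinate mapping from the camera image into the face map.
from typing import Tuple

Box = Tuple[float, float, float, float]          # (x, y, width, height)


def to_face_map_coords(instrument_box: Box, face_box: Box,
                       map_size: Tuple[float, float]) -> Box:
    """Convert the instrument's bounding box in the face image into the face map's
    coordinate system, given the face's bounding box in the same image and the
    width/height of the face map."""
    fx, fy, fw, fh = face_box
    ix, iy, iw, ih = instrument_box
    map_w, map_h = map_size
    sx, sy = map_w / fw, map_h / fh              # scale factors, image -> face map
    return ((ix - fx) * sx, (iy - fy) * sy, iw * sx, ih * sy)


# Example with made-up numbers: instrument head inside a 200x200 face map.
print(to_face_map_coords((130, 220, 40, 40), (100, 150, 200, 260), (200.0, 200.0)))
```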
In another optional embodiment, during the operation of the beauty instrument 101, the beauty instrument 101 acquires feature information of the current face area by using a camera on the beauty head, and transmits the acquired feature information of the current face area to the terminal device 102; the terminal device 102 receives the feature information of the current face area transmitted by the beauty instrument 101, and identifies the face area where the beauty instrument 101 is currently located, based on the feature information of the current face area. Optionally, the terminal device 102 may compare the feature information of the current face area with the feature information in the face map to determine the position range of the current face area in combination with the face map. For example, if the current face region contains the corners of the mouth, the current face region may be determined to be a region around the corners of the mouth, which may be, for example, a region where the left or right cheek is close to the mouth.
In any of the above embodiments, after the face area where the beauty instrument 101 is currently located is acquired, as shown in fig. 1a, the terminal device 102 may match the face area where the beauty instrument is currently located with a mark area in a face map, call a mark area in the face map that is matched with the face area where the beauty instrument is currently located as a target mark area, and determine type information to which the target mark area belongs as target type information.
In this embodiment, the way in which the current face area is matched against the marked areas in the face map to obtain the target marked area is not limited. One specific embodiment of determining the target marked area includes: judging, according to the position range of the face area where the beauty instrument is located in the face map and the position range of at least one marked area in the face map, whether there is a marked area whose position range overlaps the face area; if so, taking the marked area whose position range overlaps the face area as the target marked area matching the face area.
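This overlap test can be written directly as an axis-aligned rectangle intersection check, assuming position ranges are stored as (x, y, width, height) boxes; it is the concrete matching rule referred to in the earlier sketch.

```python
# Overlap-based matching of the current face area to a marked area (boxes assumed
# to be (x, y, width, height) in face-map coordinates).
from typing import List, Optional, Tuple

Box = Tuple[float, float, float, float]


def boxes_overlap(a: Box, b: Box) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def find_target_marked_area(face_area: Box,
                            marked_areas: List[Tuple[str, Box]]) -> Optional[str]:
    """Return the type information of the first marked area whose position range
    overlaps the face area, or None if no marked area matches."""
    for type_info, position in marked_areas:
        if boxes_overlap(face_area, position):
            return type_info
    return None
```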
In this embodiment, after the target type information is determined, as shown in fig. 1a, the terminal device 102 may control the beauty instrument to maintain the face area where it is currently located according to the working parameters corresponding to the target type information. The embodiments of the present application can thus control the beauty instrument to apply different working parameters to different types of face areas, which helps improve the flexibility of use of the beauty instrument, makes the maintenance effect more satisfactory, improves the user experience and increases user stickiness.
In some optional embodiments of the present application, the face map, whether generated by the server 103 or the terminal device 102, may be marked with different marking information in different types of marked areas in the face map. Accordingly, the process of generating the face map includes: acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters; in the face image, the area where at least one type of facial features are located is marked by using different marking information and is used as at least one type of marking area, so that a face map is obtained.
The marking information can serve as attribute information of the marked area that is not visible to the user; alternatively, some visual information can be displayed to the user along with the face map so that the user can understand the marked areas in the face map more intuitively and conveniently. When the marking information is implemented as visual information, it may be textual information, for example the names of the respective areas. The marking information may also be lines with different visual attributes, where the visual attributes include at least one of color, line type and line width. On this basis, in an optional embodiment, marking the areas where the at least one type of facial feature is located with different marking information in the face image, as the at least one type of marked area, includes: circling the areas where the at least one type of facial feature is located with lines having different visual attributes, as the at least one type of marked area. In another optional embodiment, the marking information may be implemented by filling with ground colors of different colors; on this basis, in the face image, the areas where the at least one type of facial feature is located may also be filled with different colors as the at least one type of marked area.
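As a small illustration of the "lines with different visual attributes" option, the sketch below circles each marked area on a copy of the face image with a colored rectangle; OpenCV is used only as one convenient drawing backend, and the color assignment simply mirrors the red/blue/orange example given in the scenario later in this description.

```python
# Illustrative drawing of marked areas with differently colored outlines (OpenCV).
import cv2
import numpy as np

# BGR colors per region type; the assignment follows the scenario example below.
COLOR_BY_TYPE = {
    "deep_wrinkle": (0, 0, 255),      # red line
    "shallow_wrinkle": (255, 0, 0),   # blue line
    "facial_organ": (0, 165, 255),    # orange line
}


def draw_face_map(face_image: np.ndarray, marked_areas) -> np.ndarray:
    """marked_areas: list of (type_info, (x, y, w, h)); returns an annotated copy."""
    canvas = face_image.copy()
    for type_info, (x, y, w, h) in marked_areas:
        color = COLOR_BY_TYPE.get(type_info, (255, 255, 255))
        cv2.rectangle(canvas, (x, y), (x + w, y + h), color, thickness=2)
    return canvas
```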
Further, in the case that different types of mark areas are marked by using different mark information, the correspondence between the type information of the mark areas in the face map and the working parameters of the beauty instrument can be specifically realized as the correspondence between the mark information and the working parameters of the beauty instrument, and for different mark information, the beauty instrument can use different working parameters to perform different maintenance on different parts of the face of the user corresponding to different mark information.
Further, the type information of the mark area may be embodied by mark information of the mark area, and based on this, the target type information of the target mark area is determined, which is actually a process of determining the mark information corresponding to the target mark area. For convenience of description and distinction, the mark information corresponding to the target mark region is referred to as target mark information. Further, the beauty treatment apparatus 101 may be controlled to maintain the face area currently located according to the operation parameter corresponding to the target mark information based on the correspondence between the mark information of the mark area and the beauty treatment apparatus operation parameter.
Of course, when maintaining the association relationship between the operating parameters of the beauty instrument and the label information of the label area in the face map, the terminal device 102 may also set the operating parameters of the beauty instrument for the non-label area in the face map, so that when the face area where the beauty instrument is currently located does not match the label area in the face map, the terminal device may control the beauty instrument to maintain the face area where the beauty instrument is currently located according to the operating parameters corresponding to the non-label area.
In the present embodiment, the following examples are given without limiting the embodiment in which the terminal device 102 controls the beauty instrument 101 to maintain the face area where the beauty instrument is currently located according to the operation parameters corresponding to the target type information.
In an optional embodiment, the terminal device 102 generates a face map, where the face map includes at least one type of mark area, the mark areas of different types are marked with different mark information, and establishes a correspondence between the mark information of the mark area and an operating parameter of the cosmetic instrument, on one hand, the correspondence is stored locally, and on the other hand, the correspondence may be provided to the cosmetic instrument 101, and the cosmetic instrument receives and stores the correspondence between the mark information and the operating parameter. When a user uses the beauty instrument to maintain a face area, the terminal device 102 determines a target marking area based on the fact that the face area where the beauty instrument is located is matched with a marking area in a face map, and further obtains target marking information corresponding to the target marking area, and sends the target marking information to the beauty instrument 101, or the terminal device 102 sends the target marking information to the server 103, and the server 103 sends the target marking information to the beauty instrument 101; the beauty instrument 101 receives the target marking information, determines a working parameter corresponding to the target marking information based on the correspondence between the marking information and the working parameter, and maintains the current face area according to the working parameter.
In another optional embodiment, the terminal device 102 generates a face map, the face map includes at least one type of mark area, the different types of mark areas are marked by using different mark information, and a corresponding relationship between the mark information of the mark areas and the working parameters of the beauty instrument is established. When a user uses the beauty instrument to maintain a face area, the terminal device 102 determines a target mark area based on the fact that the face area where the beauty instrument is located is matched with a mark area in a face map, and further obtains target mark information of the target mark area, after the terminal device 102 obtains the target mark information, working parameters corresponding to the target mark information are determined based on the corresponding relation, and the working parameters are directly provided for the beauty instrument 101 or provided for the beauty instrument 101 through the server 103. The beauty instrument 101 receives the operating parameters and maintains the face area according to the operating parameters.
In yet another optional embodiment, the server 103 generates a face map, where the face map includes at least one type of mark area, the mark areas of different types are marked with different mark information, a correspondence between the mark information of the mark area and working parameters of the cosmetic instrument is established, and the face map and the correspondence are sent to the terminal device 102, and the terminal device 102 receives the face map sent by the server 103 and the correspondence between the mark information and the working parameters. When a user uses the beauty instrument to maintain a face area, the terminal device 102 determines a target marking area based on the fact that the face area where the beauty instrument is located is matched with a marking area in a face map, and then obtains target marking information of the target marking area; based on the correspondence, the operation parameter corresponding to the target mark information is determined, and the operation parameter is supplied to the cosmetic apparatus 101 directly or to the cosmetic apparatus 101 via the server 103. The beauty instrument 101 receives the operating parameters and maintains the face area according to the operating parameters.
In this embodiment, during the operation of the beauty instrument, the marked areas may use the corresponding beauty instrument working parameters by default, or the user may manually set the working parameters corresponding to a marked area. In an optional embodiment, while the beauty instrument is maintaining the face area where it is currently located, if the user is not satisfied with the beauty effect or the sensation produced under the instrument's current working parameters, the working parameters can be adjusted manually, for example by increasing or decreasing the working current or intensity of the beauty instrument through its physical keys. The beauty instrument then performs the beauty work according to the manually adjusted working parameters, and may also send those parameters to the terminal device 102. The terminal device 102 receives the manually adjusted working parameters actually used in the face area, as reported by the beauty instrument 101, updates the working parameters corresponding to the target marked area to the manually adjusted working parameters, and updates the marking information corresponding to the target marked area to the marking information corresponding to the manually adjusted working parameters. In this way, the beauty instrument working parameters corresponding to different marked areas are updated according to the user's actual needs or usage habits, so that subsequent beauty work performed with the updated parameters better matches those needs or habits, which helps further improve the user experience.
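One simple way to fold a manual adjustment back into the stored correspondence, as described above, is sketched below; the re-labeling rule (re-use an existing mark whose parameters equal the adjusted ones, as in the blue-to-red example later in the scenario) is an assumption about one reasonable implementation.

```python
# Hypothetical update of the mark/parameter bindings after a manual adjustment.
from typing import Dict, Optional


def apply_manual_adjustment(params_by_mark: Dict[str, dict],
                            mark_of_area: Dict[str, str],
                            target_area_id: str,
                            adjusted_params: dict) -> None:
    """Bind the user's manually adjusted parameters to the target marked area:
    if some existing mark already corresponds to the adjusted parameters, re-label
    the area with that mark; otherwise rebind the area's current mark."""
    existing_mark: Optional[str] = next(
        (m for m, p in params_by_mark.items() if p == adjusted_params), None)
    current_mark = mark_of_area[target_area_id]
    if existing_mark is not None:
        mark_of_area[target_area_id] = existing_mark      # e.g. blue mark -> red mark
    else:
        params_by_mark[current_mark] = adjusted_params    # update the mark's parameters
```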
In the embodiments of the present application, different types of face areas are determined in advance, and correspondences between these face area types and the working parameters of the beauty instrument are established. During the operation of the beauty instrument, the face area where the instrument is currently located is acquired, and when that face area belongs to a preset face area type, the beauty instrument is controlled to maintain it with the working parameters corresponding to that type. The embodiments of the present application can therefore control the beauty instrument to apply different working parameters to different types of face areas, which helps improve the flexibility of use of the beauty instrument, makes the maintenance effect more satisfactory, improves the user experience and increases user stickiness.
Fig. 2 is a schematic structural view of another cosmetic instrument control system provided in an exemplary embodiment of the present application; as shown in fig. 2, the beauty instrument control system 200 includes: beauty instrument 201, terminal equipment 202 and server 203.
This embodiment differs from the embodiment shown in fig. 1a in that, in fig. 1a, it is mainly the terminal device that controls the beauty instrument to maintain the face area where it is currently located with different working parameters, whereas in this embodiment this control is mainly performed by the server 203. For details of the beauty instrument and the terminal device, reference may be made to the foregoing embodiments, which are not repeated here.
In this embodiment, the server 203 may generate a face map of the user, where the face map includes at least one type of marked area and different types of marked areas correspond to different working parameters of the beauty instrument. As shown in fig. 2, the first process is the server obtaining the face feature information through the beauty instrument, and the second process is the server generating the face map; for the implementation of the face map, reference may be made to the foregoing Modes A2-A3, which are not repeated here.
In this embodiment, the server 203 may also obtain the face area where the beauty instrument 201 is currently located, and the way in which it does so is not limited. In an alternative embodiment, as shown in the third step in fig. 2, the terminal device 202 has a camera; during the operation of the beauty instrument 201, the camera on the terminal device 202 can be directed at the user's face and used to capture a face image containing the working state of the beauty instrument 201. As shown in the fourth step in fig. 2, the face image is provided to the server 203; the server 203 receives the face image and, as shown in the fifth step in fig. 2, calculates the working coordinates of the beauty instrument 201 in the face image using image recognition. From the pre-generated face map and the working coordinates of the beauty instrument in the face image, the face area where the beauty instrument is currently located can be determined, i.e. the position range of that face area in the face map is obtained.
In another optional embodiment, in the working process of the beauty instrument 201, the beauty instrument 201 acquires feature information of the current face area by using a camera on the beauty head, and sends the acquired feature information of the current face area to the server 203; the server 203 receives the feature information of the current face area transmitted by the beauty instrument 201, and identifies the face area where the beauty instrument 201 is currently located, based on the feature information of the current face area. Alternatively, the server 203 may compare the feature information of the current face area with the feature information in the face map to determine the position range of the current face area in combination with the face map. For example, if the current face region contains the corners of the mouth, the current face region may be determined to be a region around the corners of the mouth, which may be, for example, a region where the left or right cheek is close to the mouth.
In any of the above embodiments, after the face area where the beauty instrument 201 is currently located is acquired, as shown in fig. 2, the server 203 may match that face area against the marked areas in the face map, refer to the marked area in the face map that matches it as the target marked area, and determine the type information to which the target marked area belongs as the target type information. After determining the target type information, as shown in fig. 2, the server 203 may control the beauty instrument to maintain the face area where it is currently located according to the working parameters corresponding to the target type information. The embodiments of the present application can thus control the beauty instrument to apply different working parameters to different types of face areas, which helps improve the flexibility of use of the beauty instrument, makes the maintenance effect more satisfactory, improves the user experience and increases user stickiness.
In some optional embodiments of the present application, in the face map, different types of marked areas may be marked with different marking information. For an implementation of using the mark information to mark the mark area in the face map, reference may be made to the foregoing embodiment, which is not described herein again.
Further, in the case where different types of mark areas are marked with different mark information, the correspondence between the type information of the mark areas in the face map and the working parameters of the cosmetic apparatus can be specifically realized as the correspondence between the mark information and the working parameters of the cosmetic apparatus.
Further, the type information of a marked area may be embodied by its marking information; on this basis, determining the target type information of the target marked area is in fact a process of determining the target marking information corresponding to the target marked area. Further, the beauty instrument 201 may be controlled to maintain the face area where it is currently located according to the working parameters corresponding to the target marking information, based on the correspondence between the marking information of the marked areas and the beauty instrument working parameters.
In the present embodiment, the following examples are given without limiting the embodiment in which the server 203 controls the beauty instrument 201 to maintain the face area currently located according to the operation parameters corresponding to the target type information.
In an optional embodiment, the server 203 generates a face map that includes at least one type of marked area, with different types of marked areas marked by different marking information, and establishes a correspondence between the marking information of the marked areas and the beauty instrument working parameters; the correspondence is stored locally on the one hand, and may be provided to the beauty instrument on the other hand, which receives and stores it. When the user uses the beauty instrument to maintain a face area, the server 203 can determine the face area where the beauty instrument is currently located based on a face image provided by the terminal device or the beauty instrument, match it against the marked areas in the face map to determine the target marked area, acquire the target marking information corresponding to the target marked area, and send the target marking information to the beauty instrument 201. The beauty instrument 201 receives the target marking information, determines the working parameters corresponding to it based on the correspondence between the marking information and the working parameters, and maintains the face area where it is currently located according to those working parameters.
In another alternative embodiment, the server 203 generates a face map, the face map includes at least one type of mark area, the different types of mark areas are marked by using different mark information, and a corresponding relationship between the mark information of the mark areas and the working parameters of the beauty instrument is established. When a user uses the beauty instrument to maintain a face area, the server 203 can determine the face area where the beauty instrument is currently located based on a face image provided by the terminal device or the beauty instrument; matching a mark area in the face map based on the face area where the beauty instrument is located, determining a target mark area, further acquiring target mark information corresponding to the target mark area, determining a working parameter corresponding to the target mark information based on the corresponding relation, and providing the working parameter to the beauty instrument 201; the beauty instrument 201 receives the working parameters and maintains the face area according to the working parameters.
In this embodiment, during the operation of the beauty instrument, the marked areas may use the corresponding beauty instrument working parameters by default, or the user may manually set the working parameters corresponding to a marked area. In an optional embodiment, while the beauty instrument is maintaining the face area where it is currently located, if the user is not satisfied with the beauty effect or the sensation produced under the instrument's current working parameters, the working parameters can be adjusted manually, for example by increasing or decreasing the working current or intensity of the beauty instrument through its physical keys. The beauty instrument performs the beauty work according to the manually adjusted working parameters and may also send them to the server 203. The server 203 receives the manually adjusted working parameters actually used in the face area, as reported by the beauty instrument 201, updates the working parameters corresponding to the target marked area to the manually adjusted working parameters, and updates the marking information corresponding to the target marked area to the marking information corresponding to the manually adjusted working parameters. In this way, the beauty instrument working parameters corresponding to different marked areas are updated according to the user's actual needs or usage habits, so that subsequent beauty work performed with the updated parameters better matches those needs or habits, which helps further improve the user experience.
Scenario implementation example:
the user buys a beauty instrument. Before use, an initialization configuration operation is required:
The beauty instrument is turned on, and a connection is established between the beauty instrument and the APP in the terminal device. The camera function of the APP is started, the camera is aimed at the user's face area, and a face image of the user is collected and transmitted to the server. The server receives the user's face image, identifies at least one type of facial feature contained in it, and, in the face image, marks the areas where the at least one type of facial feature is located with different marking information, as at least one type of marked area, thereby obtaining a face map. The face map uses three kinds of marking information, red, blue and orange: deep wrinkle areas are circled with a red line, areas with shallow wrinkles are circled with a blue line, and the positions of the facial organs are circled with an orange line. After obtaining the face map, the server stores it and also provides it to the terminal device. The server further maintains the correspondence between the marking information and the beauty instrument working parameters and sends this correspondence to the beauty instrument, which receives and stores it. An exemplary correspondence between marking information and working parameters is: red mark <-> current X1; blue mark <-> current X2; orange mark <-> current X3; no mark <-> current X4; where the current values satisfy X1 > X2 > X3 > X4.
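Expressed as data, the exemplary correspondence above might look like the following; the numeric current values are placeholders chosen only to satisfy X1 > X2 > X3 > X4 and are not taken from the application.

```python
# Placeholder values for the scenario's correspondence table (units assumed to be mA).
X1, X2, X3, X4 = 2.0, 1.5, 1.0, 0.5            # X1 > X2 > X3 > X4

CURRENT_BY_MARK = {
    "red": X1,      # deep wrinkle areas
    "blue": X2,     # shallow wrinkle areas
    "orange": X3,   # facial organ areas
    None: X4,       # unmarked areas
}


def current_for(mark):
    """Working current the instrument should use for a given mark (None = no mark)."""
    return CURRENT_BY_MARK.get(mark, CURRENT_BY_MARK[None])
```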
The beauty instrument is used:
While using the beauty instrument, the user uses the terminal device to capture the face area where the beauty instrument is currently located. The terminal device matches this face area against the mark areas in the face map and, for the matched mark area, acquires the corresponding marking information; suppose it is the blue mark. The blue mark is provided to the beauty instrument through the server. The beauty instrument receives the blue mark, determines from the pre-stored correspondence between marking information and working parameters that its working parameter is current X2, and maintains the face area where it is currently located with current X2.
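The message flow of this scenario (terminal matches the face area, the server forwards the marking information, the instrument looks up the working current from its pre-stored correspondence) can be sketched roughly as follows; the function names, labels and current values are assumptions for illustration only.

```python
# Minimal sketch of the terminal -> server -> instrument flow described above.
PRESTORED_CORRESPONDENCE = {"red": 1.0, "blue": 0.8, "orange": 0.5, "none": 0.3}  # label -> current

def terminal_match(face_area_id, face_map):
    """Terminal side: match the captured face area against the face map and return the
    marking information of the matched mark area (faked here with a simple lookup)."""
    return face_map.get(face_area_id, "none")

def server_forward(label):
    """Server side: forward the marking information to the beauty instrument."""
    return label

def instrument_maintain(label):
    """Instrument side: determine the working current from the pre-stored correspondence
    and maintain the current face area with it."""
    return PRESTORED_CORRESPONDENCE.get(label, PRESTORED_CORRESPONDENCE["none"])

face_map = {"left_cheek": "blue", "forehead": "red"}            # illustrative match results
assert instrument_maintain(server_forward(terminal_match("left_cheek", face_map))) == 0.8  # current X2
```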
If, while maintaining this face area with current X2, the user feels the current is too small, the user can adjust it to current X1 with a key on the beauty instrument, i.e. maintain the current face area with current X1. The beauty instrument reports the actually used current X1 for this face area to the server; the server receives the value X1, changes the current of the mark area corresponding to this face area from current X2 to current X1, and at the same time changes the marking information of that mark area from the blue mark to the red mark.
Fig. 3a is a schematic flowchart of a cosmetic instrument control method according to an exemplary embodiment of the present application, and as shown in fig. 3a, the method includes:
301a, acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters;
302a, when a target mark area of a face map is matched with a face area, determining type information of the target mark area as target type information;
303a, controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
In an alternative embodiment, the operating parameters of the cosmetic device include: at least one of the working current, the working frequency or the vibration intensity of the beauty instrument.
In an optional embodiment, determining the target mark area that matches the face area includes: judging, according to the position range of the face area in the face map and the position range of the at least one mark area in the face map, whether there is a mark area whose position range overlaps the face area; and if so, taking the mark area whose position range overlaps the face area as the target mark area matching the face area.
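Assuming, purely for illustration, that position ranges are represented as axis-aligned rectangles in face-map coordinates, the overlap judgment can be sketched as follows.

```python
def ranges_overlap(a, b):
    """Axis-aligned overlap test for two position ranges given as (x1, y1, x2, y2)
    in face-map coordinates. The rectangle representation is an assumption."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def find_target_mark_area(face_area, mark_areas):
    """Return the first mark area whose position range overlaps the face area, or None."""
    return next((m for m in mark_areas if ranges_overlap(face_area, m["range"])), None)

mark_areas = [{"label": "blue", "range": (300, 180, 360, 220)},
              {"label": "red",  "range": (120, 200, 180, 260)}]
print(find_target_mark_area((310, 190, 340, 210), mark_areas))  # -> the "blue" mark area
```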
In an optional embodiment, in the face map, different types of marking areas are marked by using different marking information; then, determining type information of the target mark area as target type information includes: target mark information used by the target mark area is determined, the target mark information representing target type information.
In an alternative embodiment, obtaining a face map includes: acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters; in the face image, the area where at least one type of facial features are located is marked by using different marking information and is used as at least one type of marking area, so that a face map is obtained.
In an alternative embodiment, the at least one type of facial feature comprises: at least one of facial organ characteristics, bone characteristics, facial unevenness characteristics, and wrinkles.
In an alternative embodiment, in the face image, a region in which at least one type of facial feature is located is marked by using different marking information as at least one type of marking region, including: utilizing lines with different visualization attributes to circle out a region where at least one type of facial features are located, and using the region as at least one type of marking region; wherein the visual attribute of the line comprises at least one of color, line type and line width.
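A rough sketch of circling regions with lines of different visualization attributes is given below; it assumes OpenCV for drawing, and the colours and line widths are illustrative choices rather than requirements of this embodiment.

```python
import numpy as np
import cv2  # OpenCV is assumed here for illustration; the embodiment does not name a drawing library

# Visualization attributes per mark type: colour (BGR) and line width.
STYLE = {
    "deep_wrinkle":    {"color": (0, 0, 255),   "thickness": 3},  # red line
    "shallow_wrinkle": {"color": (255, 0, 0),   "thickness": 2},  # blue line
    "facial_organ":    {"color": (0, 165, 255), "thickness": 2},  # orange line
}

def draw_face_map(face_image, regions):
    """Circle each region on a copy of the face image with the line style (colour/width)
    that encodes its marking information, producing the face map."""
    face_map = face_image.copy()
    for kind, outline in regions:
        pts = np.asarray(outline, dtype=np.int32).reshape(-1, 1, 2)
        style = STYLE[kind]
        cv2.polylines(face_map, [pts], isClosed=True,
                      color=style["color"], thickness=style["thickness"])
    return face_map

# Example with a blank image and one illustrative shallow-wrinkle outline.
blank = np.zeros((480, 640, 3), dtype=np.uint8)
demo = draw_face_map(blank, [("shallow_wrinkle", [(300, 180), (360, 180), (360, 220), (300, 220)])])
```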
In an optional embodiment, the method provided in this embodiment further includes: generating a face map, wherein the face map includes at least one type of mark area and different types of mark areas are marked with different marking information; establishing a correspondence between the marking information of the mark areas and the working parameters of the beauty instrument; and sending the correspondence between the marking information and the working parameters to the beauty instrument. Correspondingly, controlling the beauty instrument to maintain the face area where it is currently located according to the working parameters corresponding to the target type information includes: sending the target marking information to the beauty instrument, so that the beauty instrument determines the working parameters corresponding to the target marking information based on the correspondence and maintains the face area where it is located according to the determined working parameters.
In an optional embodiment, the method provided in this embodiment further includes: generating a face map, wherein the face map includes at least one type of mark area and different types of mark areas are marked with different marking information; and establishing a correspondence between the marking information of the mark areas and the working parameters of the beauty instrument. In this case, controlling the beauty instrument to maintain the face area where it is currently located according to the working parameters corresponding to the target type information includes: determining the working parameters corresponding to the target marking information; and sending the working parameters to the beauty instrument, so that the beauty instrument maintains the current face area according to the working parameters.
In an optional embodiment, the server generates the face map, where the face map includes at least one type of mark area and different types of mark areas are marked with different marking information, and the server establishes the correspondence between the marking information of the mark areas and the working parameters of the beauty instrument. In this case, determining, by the terminal device, the working parameters corresponding to the target marking information includes: receiving, by the terminal device, the correspondence between the marking information and the working parameters issued by the server; and determining the working parameters corresponding to the target marking information according to the correspondence.
In an optional embodiment, the method provided in this embodiment further includes: receiving the manually adjusted working parameters, reported by the beauty instrument, that are actually used in the current face area; updating the working parameters corresponding to the target mark area to the manually adjusted working parameters; and updating the marking information corresponding to the target mark area to the marking information corresponding to the manually adjusted working parameters.
Fig. 3b is a schematic flowchart of a working method of a beauty instrument according to an exemplary embodiment of the present application; the method is applied to the beauty instrument. As shown in fig. 3b, the method includes:
301b, receiving the corresponding relation between the mark information and the working parameters sent by the server, wherein different mark information corresponds to different types of mark areas in the face map;
302b, receiving target marking information, wherein the target marking information is marking information corresponding to a target marking area matched with a face area where the beauty instrument is located currently;
303b, determining working parameters corresponding to the target mark information according to the corresponding relation;
304b, maintaining the current face area according to the working parameters corresponding to the target mark information.
In the embodiments of the present application, different types of face areas are determined in advance and correspondences between these face area types and the working parameters of the beauty instrument are established. During operation of the beauty instrument, the face area where it is currently located is acquired, and when that face area belongs to a preset face area type, the beauty instrument is controlled to maintain the face area with the working parameters corresponding to that type. The embodiments of the present application can thus control the beauty instrument to treat different types of face areas with different working parameters, which improves the flexibility of the beauty instrument, yields a more satisfactory maintenance effect, improves the user experience and increases user retention.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 301a to 303a may be device a; for another example, the execution main bodies of steps 301a and 302a may be device a, and the execution main body of step 303a may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 301a, 302a, etc., are merely used for distinguishing different operations, and the sequence numbers themselves do not represent any execution order. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Fig. 4 is a schematic structural diagram of a terminal device according to an exemplary embodiment of the present application. As shown in fig. 4, the terminal device includes: a memory 44 and a processor 45.
The memory 44 is used for storing computer programs and may be configured to store other various data to support operations on the terminal device. Examples of such data include instructions for any application or method operating on the terminal device.
The memory 44 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 45, coupled to the memory 44, for executing computer programs in the memory 44 for: acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters; determining type information of a target marking area as target type information when the target marking area of the face map is matched with the face area; and controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
In an alternative embodiment, processor 45 is further configured to: judging whether a mark region with the position range overlapped with the face region exists or not according to the position range of the face region in the face map and the position range of at least one mark region in the face map; if yes, the mark area with the position range overlapping with the face area is used as the target mark area in face area matching.
In an optional embodiment, in the face map, different types of marking areas are marked by using different marking information; the processor 45, when determining the type information of the target mark area as the target type information, is specifically configured to: target mark information used by the target mark area is determined, the target mark information representing target type information.
In an optional embodiment, when the processor 45 acquires the face map, it is specifically configured to: acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters; in the face image, the area where at least one type of facial features are located is marked by using different marking information and is used as at least one type of marking area, so that a face map is obtained.
In an alternative embodiment, the processor 45 marks, in the face image, an area where at least one type of facial feature is located by using different marking information, and when the area is used as the at least one type of marked area, the processor is specifically configured to: utilizing lines with different visualization attributes to circle out a region where at least one type of facial features are located, and using the region as at least one type of marking region; wherein the visual attribute of the line comprises at least one of color, line type and line width.
In an alternative embodiment, processor 45 is further configured to: sending the corresponding relation between the marking information and the working parameters to the beauty instrument; correspondingly, when the processor 45 controls the beauty instrument to maintain the current face area according to the working parameters corresponding to the target type information, the processor is specifically configured to: and sending the target mark information to the beauty instrument so that the beauty instrument determines working parameters corresponding to the target mark information based on the corresponding relation, and maintaining the face area where the beauty instrument is located according to the determined working parameters.
In an alternative embodiment, the at least one type of facial feature comprises: at least one of facial organ characteristics, facial irregularity characteristics, and wrinkles.
In an optional embodiment, when controlling the beauty instrument to maintain the current face area according to the working parameters corresponding to the target type information, the processor 45 is specifically configured to: determining working parameters corresponding to the target mark information; and sending the working parameters to the beauty instrument so that the beauty instrument maintains the current face area according to the working parameters.
In an optional embodiment, when determining the working parameter corresponding to the target mark information, the processor 45 is specifically configured to: receiving the corresponding relation between the marking information and the working parameters issued by the server; and determining working parameters corresponding to the target mark information according to the corresponding relation.
In an alternative embodiment, the processor 45 is further configured to: receive the manually adjusted working parameters, reported by the beauty instrument, that are actually used in the current face area; update the working parameters corresponding to the target mark area to the manually adjusted working parameters; and update the marking information corresponding to the target mark area to the marking information corresponding to the manually adjusted working parameters.
In an alternative embodiment, the operating parameters of the cosmetic device include: at least one of the working current, the working frequency or the vibration intensity of the beauty instrument.
The terminal device provided by the embodiments of the present application determines different types of face areas in advance and establishes correspondences between these face area types and the working parameters of the beauty instrument. During operation of the beauty instrument, it acquires the face area where the beauty instrument is currently located and, when that face area belongs to a preset face area type, controls the beauty instrument to maintain the face area with the working parameters corresponding to that type. The terminal device can thus control the beauty instrument to treat different types of face areas with different working parameters, which improves the flexibility of the beauty instrument, yields a more satisfactory maintenance effect, improves the user experience and increases user retention.
Further, as shown in fig. 4, the terminal device further includes: communication components 46, a display 47, power components 48, audio components 49, and the like. Only some components are schematically shown in fig. 4, which does not mean that the terminal device includes only the components shown in fig. 4. It should be noted that the components within the dashed-line frame in fig. 4 are optional components rather than necessary components, and may be determined according to the product form of the terminal device. The terminal device may be implemented as a desktop computer, a notebook computer, a smartphone, or the like.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program, which when executed, can implement the steps that can be executed by the terminal device in the cosmetic instrument control method embodiment.
Fig. 5 is a schematic structural diagram of a server according to an exemplary embodiment of the present application. As shown in fig. 5, the server includes: a memory 54 and a processor 55.
The memory 54 is used for storing computer programs and may be configured to store other various data to support operations on the server. Examples of such data include instructions for any application or method operating on the server.
The memory 54 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 55 coupled to the memory 54 for executing computer programs in the memory 54 for: acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters; in the face image, the area where at least one type of facial features are located is marked by using different marking information and is used as at least one type of marking area, so that a face map is obtained.
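A schematic of this server-side face-map generation is sketched below; the detector and marker callables are stand-ins (stubbed here so the sketch runs), since the embodiment does not prescribe any specific feature-recognition algorithm or drawing routine.

```python
import numpy as np
from typing import Callable, Dict, List, Tuple

Outline = List[Tuple[int, int]]  # polygon outline in image coordinates

def build_face_map(face_image: np.ndarray,
                   detect_features: Callable[[np.ndarray], Dict[str, List[Outline]]],
                   mark_region: Callable[[np.ndarray, str, Outline], None]) -> np.ndarray:
    """Run a feature detector over the face image and mark every detected region with the
    marking information of its feature type, yielding the face map."""
    face_map = face_image.copy()
    for feature_type, outlines in detect_features(face_image).items():
        for outline in outlines:
            mark_region(face_map, feature_type, outline)
    return face_map

# Stub detector and marker so the sketch runs; a real system would plug in an actual
# wrinkle/organ detector and a line-drawing routine such as the one sketched earlier.
def detect_features_stub(image: np.ndarray) -> Dict[str, List[Outline]]:
    return {"shallow_wrinkle": [[(300, 180), (360, 180), (360, 220), (300, 220)]]}

def mark_region_stub(image: np.ndarray, feature_type: str, outline: Outline) -> None:
    pass  # e.g. draw a coloured polyline encoding the feature type

face_map = build_face_map(np.zeros((480, 640, 3), np.uint8), detect_features_stub, mark_region_stub)
```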
In an optional embodiment, the processor 55 is further configured to: acquire the face area where the beauty instrument is currently located; when a target mark area of the face map matches the face area, determine the type information of the target mark area as target type information; and control the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
In an alternative embodiment, processor 55 is further configured to: judging whether a mark region with the position range overlapped with the face region exists or not according to the position range of the face region in the face map and the position range of at least one mark region in the face map; if yes, the mark area with the position range overlapping with the face area is used as the target mark area in face area matching.
In an optional embodiment, in the face map, different types of marking areas are marked by using different marking information; the processor 55, when determining the type information of the target mark area as the target type information, is specifically configured to: target mark information used by the target mark area is determined, the target mark information representing target type information.
In an alternative embodiment, the processor 55 is specifically configured to, when marking out, in the face image, an area where at least one type of facial feature is located by using different marking information, as at least one type of marking area: utilizing lines with different visualization attributes to circle out a region where at least one type of facial features are located, and using the region as at least one type of marking region; wherein the visual attribute of the line comprises at least one of color, line type and line width.
In an alternative embodiment, processor 55 is further configured to: sending the corresponding relation between the marking information and the working parameters to the beauty instrument; correspondingly, when the processor 55 controls the beauty instrument to maintain the current face area according to the working parameters corresponding to the target type information, the processor is specifically configured to: and sending the target mark information to the beauty instrument so that the beauty instrument determines working parameters corresponding to the target mark information based on the corresponding relation, and maintaining the face area where the beauty instrument is located according to the determined working parameters.
In an alternative embodiment, the at least one type of facial feature comprises: at least one of facial organ characteristics, facial irregularity characteristics, and wrinkles.
In an optional embodiment, when controlling the beauty instrument to maintain the current face area according to the working parameters corresponding to the target type information, the processor 55 is specifically configured to: determining working parameters corresponding to the target mark information; and sending the working parameters to the beauty instrument so that the beauty instrument maintains the current face area according to the working parameters.
In an alternative embodiment, the processor 55 is further configured to: receive the manually adjusted working parameters, reported by the beauty instrument, that are actually used in the current face area; update the working parameters corresponding to the target mark area to the manually adjusted working parameters; and update the marking information corresponding to the target mark area to the marking information corresponding to the manually adjusted working parameters.
In an alternative embodiment, the operating parameters of the cosmetic device include: at least one of the working current, the working frequency or the vibration intensity of the beauty instrument.
The server provided by the embodiments of the present application determines different types of face areas in advance and establishes correspondences between these face area types and the working parameters of the beauty instrument. During operation of the beauty instrument, it acquires the face area where the beauty instrument is currently located and, when that face area belongs to a preset face area type, controls the beauty instrument to maintain the face area with the working parameters corresponding to that type. The server can thus control the beauty instrument to treat different types of face areas with different working parameters, which improves the flexibility of the beauty instrument, yields a more satisfactory maintenance effect, improves the user experience and increases user retention.
Further, as shown in fig. 5, the server further includes: communication components 56 and power components 58, among other components. Only some of the components are schematically shown in fig. 5, and it is not meant that the server includes only the components shown in fig. 5. It should be noted that the components within the dashed line box in fig. 5 are optional components, not necessary components, and may be determined according to the product form of the server. The server may be implemented as a conventional server, a cloud server, or an array of servers, among others.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program, which when executed, can implement the steps executable by the server in the cosmetic instrument control method embodiment.
Fig. 6 is a schematic structural view of a beauty instrument according to an exemplary embodiment of the present application, and as shown in fig. 6, the beauty instrument 600 includes: a body 601 and a cosmetic head 602; the main body 601 is provided with a main control board 603, and the main control board 603 is provided with a main control module 604 and a storage module 605.
The storage module 605 is used to store computer programs and may be configured to store other various data to support operations on the beauty instrument. Examples of such data include instructions for any application or method operating on the beauty instrument.
The storage module 605 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read Only Memory (EEPROM), Erasable Programmable Read Only Memory (EPROM), Programmable Read Only Memory (PROM), Read Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The main control module 604 is used for executing computer programs to: receive the correspondence between the marking information and the working parameters sent by the server, where different marking information corresponds to different types of mark areas in the face map; receive target marking information, where the target marking information is the marking information corresponding to the target mark area matching the face area where the beauty instrument is currently located; determine the working parameters corresponding to the target marking information according to the correspondence; and control the beauty head to maintain the current face area according to the working parameters corresponding to the target marking information.
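A minimal sketch of this main-control-module logic is given below; the class, the method names and the current values are illustrative assumptions, and in practice such logic would sit on top of the communication component 606 and the micro-current module 610.

```python
class BeautyHeadController:
    """Illustrative sketch of the main-control-module behaviour described above."""

    def __init__(self, apply_current):
        self._correspondence = {}        # marking information -> working current
        self._apply_current = apply_current

    def on_correspondence_from_server(self, correspondence):
        # Store the mapping between marking information and working parameters.
        self._correspondence = dict(correspondence)

    def on_target_mark_information(self, label, default_current=0.3):
        # Look up the working parameter for the matched mark area and drive the beauty head.
        current = self._correspondence.get(label, default_current)
        self._apply_current(current)
        return current

# Example: after receiving the correspondence, a "blue" target mark yields current X2.
ctrl = BeautyHeadController(apply_current=lambda milliamps: None)   # stand-in for the current driver
ctrl.on_correspondence_from_server({"red": 1.0, "blue": 0.8, "orange": 0.5})
assert ctrl.on_target_mark_information("blue") == 0.8
```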
Further, as shown in fig. 6, the main control board 603 of the beauty instrument 600 further includes: other components such as a communication component 606, a display 607, an audio component 608, and a power component 609; the cosmetic head 602 further includes: a micro-current module 610, an ultrasonic module 611, an LED light module 612, and the like. Only some of the components are shown schematically in fig. 6, and the beauty instrument is not meant to include only the components shown in fig. 6. It should be noted that the components shown in the dashed line in fig. 6 are optional components, not necessary components, and may be determined according to the product form of the cosmetic apparatus.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program, which when executed, can implement the steps that can be executed by the cosmetic instrument in the above-mentioned cosmetic instrument operation method embodiments.
The communication components of fig. 4-6 described above are configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device where the communication component is located can access a wireless network based on a communication standard, such as WiFi, a 2G, 3G, 4G/LTE or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The displays in fig. 4 and 6 described above include screens, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply components of fig. 4-6 described above provide power to the various components of the device in which the power supply components are located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio components of fig. 4 and 6 described above may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both non-transitory and non-transitory, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A cosmetic instrument control method, characterized in that the method comprises:
acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters;
when the face area is matched with a target marking area of the face map, determining the type information of the target marking area as target type information;
and controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
2. The method of claim 1, further comprising:
judging whether a mark region with a position range overlapped with the face region exists or not according to the position range of the face region in the face map and the position range of the at least one mark region in the face map;
and if so, taking the mark region with the position range overlapping with the face region as a target mark region in the face region matching.
3. The method according to claim 1, characterized in that in the face map, different types of marking areas are marked with different marking information;
then, determining the type information of the target mark area as target type information includes: and determining target mark information used by the target mark area, wherein the target mark information represents the target type information.
4. The method of claim 3, wherein obtaining a face map comprises:
acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters;
and marking the area where the at least one type of facial features are located in the face image by using different marking information to serve as at least one type of marking area so as to obtain a face map.
5. The method according to claim 4, wherein the marking out the area where the at least one type of facial feature is located in the facial image by using different marking information as the at least one type of marked area comprises:
using lines with different visualization attributes to circle out the region where the at least one type of facial features are located to serve as at least one type of marking region; wherein the visualization attribute of the line comprises at least one of color, line type and line width.
6. The method of claim 4, further comprising: sending the corresponding relation between the marking information and the working parameters to the beauty instrument;
correspondingly, controlling the beauty instrument to maintain the face area where it is currently located according to the working parameters corresponding to the target type information includes:
and sending the target mark information to a beauty instrument so that the beauty instrument determines working parameters corresponding to the target mark information based on the corresponding relation, and maintaining the face area where the beauty instrument is located according to the determined working parameters.
7. The method of claim 3, wherein controlling the beauty treatment apparatus to maintain the currently located face area according to the working parameters corresponding to the target type information comprises:
determining working parameters corresponding to the target mark information;
and sending the working parameters to a beauty instrument so that the beauty instrument maintains the current face area according to the working parameters.
8. The method of claim 7, wherein determining the operating parameters corresponding to the target mark information comprises:
receiving the corresponding relation between the marking information and the working parameters issued by the server;
and determining working parameters corresponding to the target mark information according to the corresponding relation.
9. The method of claim 3, further comprising:
receiving working parameters which are reported by the beauty instrument and are actually used in the current face area and are manually adjusted by a user;
updating the working parameters corresponding to the target marking area into working parameters manually adjusted by a user; and
updating the marking information corresponding to the target marking area to the marking information corresponding to the working parameters manually adjusted by the user.
10. A cosmetic instrument working method is suitable for a cosmetic instrument and is characterized by comprising the following steps:
receiving the corresponding relation between the marking information and the working parameters sent by the server, wherein different marking information corresponds to different types of marking areas in the face map;
receiving target marking information, wherein the target marking information is marking information corresponding to a target marking area matched with a face area where the beauty instrument is located currently;
determining working parameters corresponding to the target mark information according to the corresponding relation;
and maintaining the current face area according to the working parameters corresponding to the target mark information.
11. A terminal device, comprising: a memory and a processor;
the memory for storing a computer program;
the processor, coupled with the memory, to execute the computer program to:
acquiring a face area where the beauty instrument is located currently and a face map, wherein the face map comprises at least one type of mark area, and the mark areas of different types correspond to different working parameters;
when the face area is matched with a target marking area of the face map, determining type information of the target marking area as target type information;
and controlling the beauty instrument to maintain the face area according to the working parameters corresponding to the target type information.
12. A server, comprising: a memory and a processor;
the memory for storing a computer program;
the processor, coupled with the memory, to execute the computer program to:
acquiring a face image of a user, and identifying at least one type of face features contained in the face image, wherein the different types of face features correspond to different working parameters;
and marking the area where the at least one type of facial features are located in the face image by using different marking information to serve as at least one type of marking area so as to obtain a face map.
13. A cosmetic instrument, comprising: a body and a cosmetic head; the main control board is arranged on the machine body, and a main control module and a storage module are arranged on the main control board;
the storage module is used for storing a computer program; the main control module is configured to execute the computer program, so as to:
receiving the corresponding relation between the marking information and the working parameters sent by the server, wherein different marking information corresponds to different types of marking areas in the face map;
receiving target marking information, wherein the target marking information is marking information corresponding to a target marking area matched with a face area where the beauty instrument is located currently;
determining working parameters corresponding to the target mark information according to the corresponding relation;
and controlling the beauty head to maintain the current face area according to the working parameters corresponding to the target mark information.
CN202011247170.1A 2020-11-10 2020-11-10 Beauty instrument control and working method and equipment Withdrawn CN112546438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011247170.1A CN112546438A (en) 2020-11-10 2020-11-10 Beauty instrument control and working method and equipment

Publications (1)

Publication Number Publication Date
CN112546438A true CN112546438A (en) 2021-03-26

Family

ID=75041894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011247170.1A Withdrawn CN112546438A (en) 2020-11-10 2020-11-10 Beauty instrument control and working method and equipment

Country Status (1)

Country Link
CN (1) CN112546438A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113397480A (en) * 2021-05-10 2021-09-17 深圳数联天下智能科技有限公司 Control method, device and equipment of beauty instrument and storage medium
CN114305334A (en) * 2021-12-09 2022-04-12 深圳贵之族生科技有限公司 Intelligent beauty method, device, equipment and storage medium
CN114333036A (en) * 2022-01-20 2022-04-12 深圳市宝璐美容科技有限公司 Intelligent beauty control method, device, equipment and storage medium
CN114816567A (en) * 2022-04-12 2022-07-29 林镇清 Beauty parameter adjusting method and device, beauty instrument and storage medium
CN116747431A (en) * 2023-05-18 2023-09-15 深圳市宗匠科技有限公司 Method and device for detecting action position of beauty instrument and energy output and beauty instrument

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008178495A (en) * 2007-01-24 2008-08-07 Matsushita Electric Works Ltd Beauty treatment apparatus
CN105163695A (en) * 2012-10-12 2015-12-16 伊卢米内奇有限公司 A method and system for cosmetic skin procedures for home use
US20170215962A1 (en) * 2016-02-01 2017-08-03 S & Y Enterprises Llc Automatic aesthetic treatment device and method
US20180033205A1 (en) * 2016-08-01 2018-02-01 Lg Electronics Inc. Mobile terminal and operating method thereof
CN109865195A (en) * 2017-12-01 2019-06-11 蔡朝辉 Intelligent cosmetic apparatus based on treatment region identification
CN110193140A (en) * 2019-07-02 2019-09-03 厦门美图之家科技有限公司 Pulse beautifying instrument and cosmetic system
CN111714084A (en) * 2019-03-20 2020-09-29 株式会社爱茉莉太平洋 Skin beauty instrument and control method thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210326)