CN110674710A - Dressing auxiliary control method, dressing auxiliary control system and storage medium - Google Patents
Dressing auxiliary control method, dressing auxiliary control system and storage medium
- Publication number
- CN110674710A (application CN201910849775.9A)
- Authority
- CN
- China
- Prior art keywords
- user
- control center
- image
- face
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification
- G10L17/22—Interactive procedures; Man-machine interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Abstract
The invention relates to a dressing auxiliary control method, a dressing auxiliary control system and a storage medium. The method comprises the following steps: storing a reference image in advance; shooting a face image of the user; comparing the shot image with the reference image; prompting the user to wash when the face is judged not to be clean; and, when the skin darkness at the user's lower eyelids is judged to be greater than or equal to a preset value, recommending suitable makeup to conceal it. On this basis, the dressing auxiliary control method of the invention can recommend makeup and prompt the user when the face is unwashed, and is therefore more convenient for dressing than existing dressing tables, which can only provide lighting assistance.
Description
Technical Field
The invention relates to the technical field of smart home, in particular to a dressing auxiliary control method, a dressing auxiliary control system and a storage medium.
Background
A dressing table is a piece of furniture that is practically indispensable when making up at home; here it refers to the fixture that provides assistance to the person applying makeup, i.e., a furniture dressing table.
Conventional dressing tables generally include a table top for holding cosmetics, auxiliary tools and the like, and a mirror for the user to observe the condition of the face. Such a dressing table offers very little functionality and meets only the user's most basic needs; it provides no help with adjusting the lighting, examining facial details, and so on.
Existing smart home systems have improved the conventional dressing table to make it more intelligent. An existing household vanity typically includes an instruction acquisition unit, an LED light strip, a reflector, a light adjustment unit, a human-machine interaction unit and a cosmetic mirror. The instruction acquisition unit acquires the user's operation and generates a corresponding human-machine interaction instruction; the reflector reflects the light emitted by the LED strip; the light adjustment unit adjusts the angle of the reflector accordingly; and the human-machine interaction unit executes the interaction instructions and controls the LED strip, the reflector, the light adjustment unit and the cosmetic mirror.
Such a dressing table in a smart home system solves the problems of the traditional dressing table to some extent, but it still only provides lighting assistance and gives no help with the dressing process itself.
Therefore, the prior art has yet to be improved.
Disclosure of Invention
Therefore, to solve the problem that the dressing table in existing smart home systems can only provide lighting assistance and offers no help with making up itself, it is necessary to provide a dressing auxiliary control method, a dressing auxiliary control system and a storage medium that actively assist the dressing process.
The technical scheme of the invention is as follows:
a dressing auxiliary control method, comprising:
receiving and storing, in advance, a face image uploaded by the user or shot beforehand as a reference image;
the control center receives a user operation instruction and controls the camera to shoot a face image of the user;
the control center compares the image shot by the camera with the reference image, and judges whether the user's face has been cleaned and whether the skin darkness at the user's lower eyelids is greater than or equal to a preset value;
when it is judged that the user's face has not been cleaned, the control center controls the mirror to display the uncleaned part in magnified form and prompts the user to clean it; it then receives a user operation instruction and controls the camera to shoot again, until the user's face is clean;
when it is judged that the skin darkness at the user's lower eyelids is greater than or equal to the preset value, the control center controls the mirror to display makeup that can conceal the lower-eyelid skin; it then receives a user operation instruction and controls the camera to shoot again, until the skin darkness at the user's lower eyelids in the shot image is less than the preset value.
In a further preferred embodiment, the step of the control center receiving a user operation instruction and controlling the camera to shoot the user's face image further comprises:
the microphone receives the user's voice;
the control center determines the user's identity from the voice received by the microphone, and retrieves the face image corresponding to that identity from the database as the reference image.
In a further preferred scheme, the step of receiving and storing, in advance, a face image uploaded by the user or shot beforehand as the reference image further comprises:
the control center receives a user operation instruction and controls the camera to photograph the user's clothing and accessories; or the control center receives and stores clothing and accessory photos uploaded by the user;
the control center receives the user's operations and stitches the clothing and accessory photos into clothing suit photos.
In a further preferred embodiment, the step of the control center receiving a user operation instruction and controlling the camera to shoot the user's face image further comprises:
the control center controls the mirror to display a plurality of clothing suit photos in turn.
In a further preferred embodiment, the step of the control center controlling the mirror to display a plurality of clothing suit photos in turn further comprises:
the control center receives the user's selection of a clothing suit photo and controls the hanger pop-up mechanism to pop out the hanger holding the selected clothing suit.
In a further preferred embodiment, the step of the control center receiving a user operation instruction and controlling the camera to shoot the user's face image further comprises:
the control center obtains today's weather forecast and controls the mirror to display at least one clothing suit photo matching the obtained forecast.
In a further preferred embodiment, the step of the control center receiving a user operation instruction and controlling the camera to shoot the user's face image further comprises:
the microphone receives the user's voice;
the control center determines, from the voice received by the microphone, the makeup style and/or clothing style selected by the user, and accordingly controls the mirror to display a makeup look matching the makeup style and/or clothing suit photos matching the clothing style.
In a further preferred scheme, the step in which, when it is judged that the skin darkness at the user's lower eyelids is greater than or equal to the preset value, the control center controls the mirror to display makeup that can conceal the lower-eyelid skin, and receives a user operation instruction and controls the camera to shoot again until the skin darkness at the user's lower eyelids in the shot image is less than the preset value, further comprises:
the control center controls the mirror to display an evaluation questionnaire, receives the user's selections through the microphone or the mirror to complete the questionnaire, and displays or plays lifestyle suggestions according to the result.
A dressing auxiliary control system comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for performing the dressing auxiliary control method described above.
A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the dressing auxiliary control method described above.
Compared with the prior art, the dressing auxiliary control method provided by the invention comprises: storing a reference image in advance, shooting a face image of the user, comparing the shot image with the reference image, prompting the user to wash when the face is judged not to be clean, and recommending suitable makeup to conceal dark circles when the skin darkness at the lower eyelids is judged to be greater than or equal to a preset value. On this basis, the method can recommend makeup and prompt the user when the face is unwashed, and is more convenient for dressing than existing dressing tables, which can only provide lighting assistance.
Drawings
FIG. 1 is a flow chart of a dressing auxiliary control method according to a preferred embodiment of the present invention.
Fig. 2 is a functional block diagram of a dressing auxiliary control system in a preferred embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in FIG. 1, the dressing auxiliary control method provided by the present invention comprises the following steps:
S100: receive and store, in advance, a face image uploaded by the user or shot beforehand as the reference image. In a specific implementation, the reference image may be a bare-faced photo taken by the user, a made-up photo, or a beautified bare-faced photo. Its most basic function is to serve as the basis for judging whether the user's face is clean and whether dark circles are present (i.e., whether the skin darkness at the lower eyelids is greater than or equal to a preset value); any photo serving this function may be used, and the invention places no specific limitation on it.
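As an illustration, the per-user storage of reference images in step S100 can be sketched as a minimal in-memory store. The class and method names here are hypothetical, and a real control center would persist the images in a database rather than a dictionary:

```python
class ReferenceStore:
    """Minimal in-memory reference-image store keyed by user id (illustrative only)."""

    def __init__(self):
        self._images = {}

    def save(self, user_id, image_bytes):
        # Overwrite any earlier reference for this user.
        self._images[user_id] = image_bytes

    def load(self, user_id):
        if user_id not in self._images:
            raise KeyError(f"no reference image for {user_id}")
        return self._images[user_id]

store = ReferenceStore()
store.save("user-1", b"\x89PNG fake bytes")  # an uploaded or freshly shot photo
print(store.load("user-1") == b"\x89PNG fake bytes")  # True
```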
S200, the control center receives a user operation instruction and controls the camera to shoot the face image of the user.
S300: the control center compares the image shot by the camera with the reference image, and judges whether the user's face has been cleaned and whether the skin darkness at the user's lower eyelids is greater than or equal to a preset value.
Judging whether the user's face is clean mainly means judging whether foam (left by a cleansing product such as soap or facial cleanser) remains on the face, or whether any part has not been washed. Judging whether the skin darkness at the lower eyelids is greater than or equal to the preset value means judging whether the user has dark circles.
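The dark-circle judgment in step S300 can be sketched as a brightness comparison between the same under-eye region of the shot image and the reference image. This is a simplified illustration, not the patent's actual algorithm: the region coordinates, the grey-level threshold and the function names are all assumptions.

```python
import numpy as np

def under_eye_darkness(image, eye_region):
    """Mean brightness (0-255) of the region below the eye.
    eye_region is an assumed (top, bottom, left, right) pixel box."""
    top, bottom, left, right = eye_region
    return float(image[top:bottom, left:right].mean())

def has_dark_circles(shot, reference, eye_region, threshold=30.0):
    """Flag dark circles when the under-eye region in the new shot is at
    least `threshold` grey levels darker than in the reference image."""
    return (under_eye_darkness(reference, eye_region)
            - under_eye_darkness(shot, eye_region)) >= threshold

# Toy greyscale "images": the reference is uniformly bright; the new shot
# has a darkened under-eye band.
reference = np.full((100, 100), 200, dtype=np.uint8)
shot = reference.copy()
shot[60:80, 20:80] = 120
region = (60, 80, 20, 80)
print(has_dark_circles(shot, reference, region))  # True
```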
S400: when it is judged that the user's face has not been cleaned, the control center controls the mirror to display the uncleaned part in magnified form and prompts the user to clean it; it then receives a user operation instruction and controls the camera to shoot again, until the user's face is clean.
There are many ways for the mirror to display images. For example, the mirror may be assembled from two parts, one a display and the other an ordinary mirror; or a small projection device may project onto part of the mirror surface. The invention has many possible embodiments that cannot be listed one by one, but any technical embodiment meeting the requirements of the invention falls within its protection scope.
S500: when it is judged that the skin darkness at the user's lower eyelids is greater than or equal to the preset value, the control center controls the mirror to display makeup that can conceal the lower-eyelid skin; it then receives a user operation instruction and controls the camera to shoot again, until the skin darkness at the user's lower eyelids in the shot image is less than the preset value.
A makeup recommendation model can be trained on sample data, or an existing trained model suited to the user's skin tone can be used, so as to recommend makeup capable of concealing dark circles; alternatively, several makeup styles can be recommended for the user to choose from.
As a preferred embodiment of the present invention, step S200 further includes:
the microphone receives the user's voice. The microphone may be activated actively by the user (for example, by pressing a switch on the dressing table or touching the mirror), or automatically (for example, by a distance sensor detecting that the user has approached, or a weight sensor detecting that the user has sat on the stool associated with the dressing table).
The control center determines the user's identity from the voice received by the microphone, and retrieves the face image corresponding to that identity from the database as the reference image. Since multiple users may use the dressing auxiliary control system, the user's identity can be confirmed quickly through voiceprint recognition, after which the reference image, clothing suit photos and so on corresponding to that user are activated.
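Voiceprint-based identity lookup can be illustrated with a nearest-neighbour match over stored speaker embeddings. The embeddings below are toy vectors; a real system would compute them with a trained speaker-embedding model, which the patent does not specify.

```python
import numpy as np

def identify_user(query, enrolled):
    """Return the enrolled user whose (hypothetical) voiceprint embedding
    is most similar to the query, by cosine similarity."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(enrolled, key=lambda name: cos(query, enrolled[name]))

# Toy 3-dimensional "voiceprints" for two household members.
enrolled = {
    "alice": np.array([0.9, 0.1, 0.0]),
    "bob":   np.array([0.1, 0.9, 0.2]),
}
query = np.array([0.8, 0.2, 0.1])   # sounds most like alice
print(identify_user(query, enrolled))  # alice
```

After the match, the control center would load the reference image stored for the returned user id.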
According to another aspect of the present invention, step S100 further includes:
the control center receives a user operation instruction and controls the camera to photograph the user's clothing and accessories; or the control center receives and stores clothing and accessory photos uploaded by the user;
the control center receives the user's operations and stitches the clothing and accessory photos into clothing suit photos.
A user usually has a personal style for matching clothing (skirts, shirts, trousers and so on) and for matching clothing with accessories (bracelets, necklaces, earrings and the like), and this style does not change easily. The matching combinations can therefore be stored in advance as suits, which the user can browse or select when dressing.
Therefore, in a further embodiment of the present application, the step of the control center receiving a user operation instruction and controlling the camera to shoot the user's face image further comprises: the control center controls the mirror to display a plurality of clothing suit photos in turn.
In addition, the step of the control center controlling the mirror to display a plurality of clothing suit photos in turn further comprises: the control center receives the user's selection of a clothing suit photo and controls the hanger pop-up mechanism to pop out the hanger holding the selected clothing suit.
The hanger pop-up mechanism may be a three-axis linkage: the Z axis drives a gripper that grips and lifts the hanger, the X axis moves the Z axis left and right, and the Y axis moves it back and forth, so as to carry the hanger out.
In addition, the control center can move the hanger along a preset track to a specified position (for example, near the dressing table, after which it can return along the same path) for the user's convenience.
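The pop-up-and-return motion described above can be sketched as waypoint playback: the mechanism moves out along a preset track and then retraces it. All coordinates and names below are illustrative assumptions, not part of the patent:

```python
def plan_popup(waypoints):
    """Expand (x, y, z) waypoints into move commands: out along the
    track, then back along the same path to the home position."""
    commands = [("move",) + wp for wp in waypoints]
    commands += [("move",) + wp for wp in reversed(waypoints[:-1])]
    return commands

# Lift the hanger on Z, then travel toward the dressing table on X/Y.
route = [(0, 0, 0), (0, 0, 50), (120, 30, 50)]
cmds = plan_popup(route)
print(cmds[0], cmds[-1])  # starts and ends at the home position
```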
Preferably, step S200 may further include: the control center obtains today's weather forecast and controls the mirror to display at least one clothing suit photo matching it, so that the user can choose appropriate makeup, clothing and accessories for the weather.
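Matching suit photos to the weather forecast can be sketched as a simple rule table; the fields, file names and thresholds below are hypothetical, since the patent leaves the matching logic open:

```python
# Each stored suit lists the weather conditions it suits and a
# temperature ceiling above which it should not be suggested.
SUITS = [
    {"photo": "suit_raincoat.jpg", "conditions": {"rain"},            "max_temp": 25},
    {"photo": "suit_summer.jpg",   "conditions": {"sunny"},           "max_temp": 40},
    {"photo": "suit_winter.jpg",   "conditions": {"sunny", "cloudy"}, "max_temp": 10},
]

def suits_for_weather(condition, temp_c):
    """Suit photos whose conditions include today's weather and whose
    temperature ceiling is not exceeded."""
    return [s["photo"] for s in SUITS
            if condition in s["conditions"] and temp_c <= s["max_temp"]]

print(suits_for_weather("rain", 18))   # ['suit_raincoat.jpg']
print(suits_for_weather("sunny", 30))  # ['suit_summer.jpg']
```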
According to another aspect of the present invention, step S500 may further include: the control center controls the mirror to display an evaluation questionnaire, receives the user's selections through the microphone or the mirror to complete the questionnaire, and displays or plays lifestyle suggestions according to the result.
After determining that the user has dark circles, the control center can use the evaluation questionnaire to estimate their likely cause. If poor lifestyle habits are the cause, the user is reminded by voice or on the display to change those habits and stay healthy; if a disease is the likely cause, the user is reminded to go to a hospital as early as possible for the corresponding examination, to prevent the condition from worsening.
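The questionnaire-based estimate of the cause of dark circles can be sketched as weighted yes/no scoring. The questions, weights and messages are invented for illustration; the patent does not define the questionnaire's content.

```python
# Each question contributes its weight to one candidate cause when answered "yes".
QUESTIONS = {
    "slept_under_6h":      ("lifestyle", 2),
    "long_screen_time":    ("lifestyle", 1),
    "persistent_swelling": ("medical",   2),
    "family_history":      ("medical",   1),
}

def suggest(answers):
    """Score the answered questions and pick the suggestion for the
    higher-scoring cause (lifestyle wins ties)."""
    scores = {"lifestyle": 0, "medical": 0}
    for question, yes in answers.items():
        cause, weight = QUESTIONS[question]
        if yes:
            scores[cause] += weight
    if scores["medical"] > scores["lifestyle"]:
        return "Please visit a hospital for an examination as soon as possible."
    return "Consider improving sleep and rest habits."

print(suggest({"slept_under_6h": True, "long_screen_time": True,
               "persistent_swelling": False, "family_history": False}))
```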
As shown in FIG. 2, the present invention also provides a dressing auxiliary control system comprising a memory 10 and one or more programs, wherein the one or more programs are stored in the memory 10 and configured to be executed by one or more processors 20, the one or more programs including instructions for executing the dressing auxiliary control method described above.
The invention also provides a storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the dressing auxiliary control method described above.
It will be understood by those skilled in the art that all or part of the processes of the methods described above can be implemented by a computer program instructing the relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, a database or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations are described; however, any combination of these features that involves no contradiction should be considered within the scope of this specification.
The above embodiments express only several embodiments of the present invention, and although they are described in relatively specific detail, they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A dressing auxiliary control method, characterized by comprising the following steps:
receiving and storing, in advance, a face image uploaded by the user or shot beforehand as a reference image;
the control center receives a user operation instruction and controls the camera to shoot a face image of the user;
the control center compares the image shot by the camera with the reference image, and judges whether the user's face has been cleaned and whether the skin darkness at the user's lower eyelids is greater than or equal to a preset value;
when it is judged that the user's face has not been cleaned, the control center controls the mirror to display the uncleaned part in magnified form and prompts the user to clean it; receiving a user operation instruction, and controlling the camera to shoot again until the user's face is clean;
when it is judged that the skin darkness at the user's lower eyelids is greater than or equal to the preset value, the control center controls the mirror to display makeup that can conceal the lower-eyelid skin; and receiving a user operation instruction, and controlling the camera to shoot again until the skin darkness at the user's lower eyelids in the shot image is less than the preset value.
2. The dressing auxiliary control method according to claim 1, wherein the step of the control center receiving the user operation instruction and controlling the camera to shoot the user's face image further comprises:
the microphone receives the user's voice;
the control center determines the user's identity from the voice received by the microphone, and retrieves the face image corresponding to that identity from the database as the reference image.
3. The dressing auxiliary control method according to claim 1, wherein the step of receiving and storing, in advance, a face image uploaded by the user or shot beforehand as the reference image further comprises:
the control center receives a user operation instruction and controls the camera to photograph the user's clothing and accessories; or the control center receives and stores clothing and accessory photos uploaded by the user;
the control center receives the user's operations and stitches the clothing and accessory photos into clothing suit photos.
4. The dressing auxiliary control method according to claim 3, wherein the step of the control center receiving the user operation instruction and controlling the camera to shoot the user's face image further comprises:
the control center controls the mirror to display a plurality of clothing suit photos in turn.
5. The dressing auxiliary control method according to claim 4, wherein the step of the control center controlling the mirror to display a plurality of clothing suit photos in turn further comprises:
the control center receives the user's selection of a clothing suit photo and controls the hanger pop-up mechanism to pop out the hanger holding the selected clothing suit.
6. The dressing auxiliary control method according to claim 3, wherein the step of the control center receiving the user operation instruction and controlling the camera to shoot the user's face image further comprises:
the control center obtains today's weather forecast and controls the mirror to display at least one clothing suit photo matching the obtained forecast.
7. The dressing auxiliary control method according to claim 3, wherein the step of the control center receiving the user operation instruction and controlling the camera to shoot the user's face image further comprises:
the microphone receives the user's voice;
the control center determines, from the voice received by the microphone, the makeup style and/or clothing style selected by the user, and accordingly controls the mirror to display a makeup look matching the makeup style and/or clothing suit photos matching the clothing style.
8. The dressing auxiliary control method according to claim 1, wherein the step in which, when it is judged that the skin darkness at the user's lower eyelids is greater than or equal to the preset value, the control center controls the mirror to display makeup that can conceal the lower-eyelid skin, and receives a user operation instruction and controls the camera to shoot again until the skin darkness at the user's lower eyelids in the shot image is less than the preset value, further comprises:
the control center controls the mirror to display an evaluation questionnaire, receives the user's selections through the microphone or the mirror to complete the questionnaire, and displays or plays lifestyle suggestions according to the result.
9. A dressing auxiliary control system comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for executing the dressing auxiliary control method according to any one of claims 1 to 8.
10. A storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the dressing auxiliary control method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910849775.9A CN110674710A (en) | 2019-09-09 | 2019-09-09 | Dressing auxiliary control method, dressing auxiliary control system and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910849775.9A CN110674710A (en) | 2019-09-09 | 2019-09-09 | Dressing auxiliary control method, dressing auxiliary control system and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110674710A true CN110674710A (en) | 2020-01-10 |
Family
ID=69076745
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910849775.9A Pending CN110674710A (en) | 2019-09-09 | 2019-09-09 | Dressing auxiliary control method, dressing auxiliary control system and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110674710A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105656734A (en) * | 2015-07-14 | 2016-06-08 | 宇龙计算机通信科技(深圳)有限公司 | Method, device and system for controlling intelligent housing system |
CN105867157A (en) * | 2016-05-20 | 2016-08-17 | 天津工业大学 | Intelligent wardrobe and control method thereof |
CN107115053A (en) * | 2017-07-16 | 2017-09-01 | 吴静 | A kind of intelligent prompt method and its equipment |
CN107730483A (en) * | 2016-08-10 | 2018-02-23 | 阿里巴巴集团控股有限公司 | The methods, devices and systems of mobile device, processing face biological characteristic |
CN108765268A (en) * | 2018-05-28 | 2018-11-06 | 京东方科技集团股份有限公司 | A kind of auxiliary cosmetic method, device and smart mirror |
CN108920509A (en) * | 2018-05-30 | 2018-11-30 | 深圳市赛亿科技开发有限公司 | A kind of intelligent cosmetic mirror |
CN109766456A (en) * | 2018-11-29 | 2019-05-17 | 深圳市赛亿科技开发有限公司 | A kind of Intelligent mirror management-control method and system |
- 2019-09-09: CN application CN201910849775.9A, patent CN110674710A, status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6250390B2 (en) | Display and lighting device for fitting room | |
US10413042B2 (en) | Makeup support device, makeup support method, and makeup support program | |
US20190146128A1 (en) | Smart mirror | |
JP5090870B2 (en) | Makeup unit | |
JP6292639B2 (en) | Hair smart mirror system using virtual reality | |
JP5879562B2 (en) | Mirror device with camera, fixture with mirror | |
WO2014125831A1 (en) | Electronic mirror device | |
KR101967814B1 (en) | Apparatus and method for matching personal color | |
US20230014469A1 (en) | Sensor privacy setting control | |
US20200410960A1 (en) | Information processing device, information processing method, and recording medium | |
CN107347135A (en) | Take pictures processing method, device and terminal device | |
KR102065480B1 (en) | Body information analysis apparatus and lip-makeup analysis method thereof | |
CN107317974B (en) | Beautiful makeup photographing method and device | |
JP2017220158A (en) | Virtual makeup apparatus, virtual makeup method, and virtual makeup program | |
CN106880156A (en) | Method and its system are recommended in a kind of makeups on dressing glass | |
US11776187B2 (en) | Digital makeup artist | |
US20220202168A1 (en) | Digital makeup palette | |
TW201942714A (en) | Information output method, apparatus and system | |
CN111306803B (en) | Water source supply control system and method based on big data | |
KR20190093040A (en) | Personal Color Analysis System with complex perception of the auditory and visual and fashion persona color matching method using the same | |
JP2017511929A (en) | Makeup base matching system | |
CN110674710A (en) | Dressing auxiliary control method, dressing auxiliary control system and storage medium | |
JP4826139B2 (en) | Dressing system | |
JP4063583B2 (en) | Digital camera | |
CN105278848A (en) | Wallpaper updating method for terminal and wallpaper updating system for terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information |
Address after: Room 2306, East Block, Skyworth Semiconductor Design Building, 18 Gaoxin South 4th Road, Gaoxin Community, Yuehai Street, Nanshan District, Shenzhen, Guangdong 518052
Applicant after: Shenzhen Kukai Network Technology Co.,Ltd.
Address before: 518052 Room 601, Block C, Skyworth Building, 008 Gaoxin South 1st Road, Nanshan District, Shenzhen City, Guangdong Province
Applicant before: Shenzhen Coocaa Network Technology Co.,Ltd.
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200110 |