CN114867383A - Makeup assisting device, makeup assisting method, makeup assisting program, and makeup assisting system - Google Patents


Info

Publication number
CN114867383A
CN114867383A (application CN202080087543.5A)
Authority
CN
China
Prior art keywords
application
area
cosmetic
information
makeup
Prior art date
Legal status
Pending
Application number
CN202080087543.5A
Other languages
Chinese (zh)
Inventor
德重麻吕
青木基治
上和野大辅
Current Assignee
Shiseido Co Ltd
Original Assignee
Shiseido Co Ltd
Priority date
Filing date
Publication date
Application filed by Shiseido Co Ltd
Publication of CN114867383A

Classifications

    • A: HUMAN NECESSITIES
    • A45: HAND OR TRAVELLING ARTICLES
    • A45D: HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00: Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/60: Editing figures and text; Combining figures or text

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Cosmetics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A makeup assisting device for guiding application of a cosmetic, the makeup assisting device comprising: an area detection unit that detects an area to which a cosmetic is applied; and a display unit that displays information associated with the area, the information being information indicating a state of application of the cosmetic in the area.

Description

Makeup assisting device, makeup assisting method, makeup assisting program, and makeup assisting system
Technical Field
The present disclosure relates to a makeup assisting device, a makeup assisting method, a makeup assisting program, and a makeup assisting system.
Background
As a makeup assisting technique, there is a technique of guiding application of a cosmetic. For example, a technique has been disclosed (see Patent Document 1) that includes a portable terminal device carried by a user and a stationary measurement device connected for data communication with the portable terminal device, and that creates information for assisting the user's makeup based on three-dimensional shape information created by the measurement device and image information of the user acquired by the portable terminal device while makeup is applied.
Documents of the prior art
Patent Document 1: WO 2016/158729
Disclosure of Invention
Technical problem to be solved by the invention
The present disclosure addresses the problem of providing a makeup assisting device that guides the appropriate application of a cosmetic product.
Means for solving the problems
In order to solve the above problem, one aspect of the present disclosure provides a makeup assisting device for guiding application of a cosmetic, the makeup assisting device including: an area detection unit that detects an area to which a cosmetic is applied; and a display unit that displays information associated with the area, the information being information indicating a state of application of the cosmetic in the area.
Advantageous Effects of Invention
According to one aspect of the present disclosure, a makeup assisting device that guides appropriate application of a cosmetic product can be provided.
Drawings
Fig. 1 is a diagram showing an outline of a makeup assisting system according to an embodiment.
Fig. 2 is a diagram showing an application device constituting a part of the makeup assisting system according to the embodiment.
Fig. 3 is a view of the application device of fig. 2 as viewed from the application surface side.
Fig. 4 is a view showing a state where the sheet and the holding tool are removed from the application device of fig. 3.
Fig. 5 is a block diagram showing an internal configuration of the application device of fig. 2.
Fig. 6 is a diagram showing a makeup assisting device constituting a part of the makeup assisting system according to the present embodiment.
Fig. 7 is a flowchart for implementing the makeup assisting device according to embodiment 1.
Fig. 8 is a diagram showing an example of area-related information displayed in embodiment 1.
Fig. 9 is a diagram showing an example of area-related information displayed in embodiment 1.
Fig. 10 is a flowchart for implementing the makeup assisting device according to embodiment 2.
Fig. 11 is a diagram showing an example of area-related information displayed in embodiment 2.
Fig. 12 is a diagram showing an example of area-related information displayed in embodiment 2.
Fig. 13 is a diagram showing an example of area-related information displayed in embodiment 2.
Fig. 14 is a flowchart for implementing the makeup assisting device according to embodiment 3.
Fig. 15 is a diagram showing an example of area-related information displayed in embodiment 3.
Fig. 16 is a diagram showing an example of area-related information displayed in embodiment 3.
Detailed Description
Embodiments of the present invention will be described with reference to the accompanying drawings. In addition, the same reference numerals are given to the common portions in the drawings, and the description thereof may be omitted. In addition, the scale of each member in each drawing may be different from the actual scale.
<Makeup Assisting System>
Fig. 1 is a diagram showing an outline of a makeup assisting system according to an embodiment of the present disclosure. Fig. 2 is a diagram showing an application device constituting a part of the makeup assisting system according to the present embodiment. Fig. 3 is a view of the application device of fig. 2 as viewed from the application surface side, and fig. 4 is a view showing a state where the sheet and the holding tool are removed from the application device of fig. 3. Fig. 5 is a block diagram showing an internal configuration of the application device of fig. 2. Fig. 6 is a diagram showing a makeup assisting device constituting a part of the makeup assisting system according to the present embodiment.
The makeup assistant system 100 according to the present embodiment is a makeup assistant system that guides application of a cosmetic product, and includes an application device that applies a cosmetic product and a makeup assistant device that is communicably connected to the application device, the makeup assistant device including an area detection unit that detects an area to which the cosmetic product is applied and a display unit that displays information associated with the area, the information being information indicating a state of application of the cosmetic product in the area.
Specifically, the makeup assisting system 100 of the present embodiment has the application device 10 and the portable terminal 30. The application device 10 is communicatively connected to the portable terminal 30. The application device 10 is an example of an application device constituting a part of the makeup assisting system according to the present embodiment. The portable terminal 30 is an example of a makeup assisting device constituting a part of the makeup assisting system according to the present embodiment. The portable terminal 30 is also an example of the makeup assisting device according to the present embodiment.
The application device 10 constitutes an application tool used by the user P in an application operation of the cosmetic (see fig. 1 and 2). The application device 10 includes a main body 11, a handle 12, an application surface 13, and a holding tool 14 (see fig. 2 to 4). The main body 11 has an internal configuration described later (see fig. 5).
Here, the cosmetic is not particularly limited; examples include skin-care cosmetics (also referred to as base cosmetics) such as lotions, serums, emulsions, and creams, and cosmetics for applying makeup (also referred to as makeup cosmetics) such as foundations, blushers, lipsticks, lip glosses, eyebrow pencils, eyeliners, eye shadows, mascaras, nail polishes, and perfumes. In the present embodiment, the cosmetic is preferably a skin-care cosmetic.
In addition, application means spreading (or applying) a cosmetic. The area to which the cosmetic is applied (hereinafter also referred to as an application area) is the area (or range) of the surface of the person's skin to which the cosmetic is applied. Detecting a region means detecting the coordinates of the region and determining the position (or part) of the region from the detected coordinates.
The handle 12 constitutes a grip for the user P to hold (or grasp) the application device 10 to perform an application operation of the cosmetic. A push-type switch (button) 12A that switches the application device 10 on and off is provided on the handle 12. The application surface 13 constitutes a surface that touches the skin of the user P when applying the cosmetic. The sheet S impregnated with the cosmetic is placed on the application surface 13 in a state fixed by the holding tool 14 (see fig. 2 to 4).
The internal configuration of the main body 11 is arbitrary, but includes, for example, a position sensor 15, a pressure sensor 16, an acceleration sensor 17, a cosmetic sensor 18, a vibration motor 19, a computer 20, a communication module 26, and a power supply 27 (see fig. 5).
The position sensor 15 detects position information within the image of the user P acquired by the portable terminal 30. The pressure sensor 16 detects the pressure (or pressing force) generated in the application device 10 during the application operation. The acceleration sensor 17 detects the acceleration generated in the application device 10 during the application operation.
The cosmetic sensor 18 has electrodes 18A and 18B exposed on the application surface 13, and detects the moisture content of the cosmetic impregnated in the sheet S placed on the application surface 13 through the electrodes 18A and 18B.
In the present embodiment, a method of detecting the moisture content is adopted as the detection method of the cosmetic sensor, but the present invention is not limited to this detection method. For example, a method of detecting the cosmetic using an optical sensor such as a photoreflector may be adopted instead.
The vibration motor 19 notifies the user P, by vibration or vibration sound, of operation feedback from the application device 10 and of information described later. The computer 20 has an internal configuration described later (see fig. 5). The communication module 26 transmits information obtained in the application device 10 to the portable terminal 30, and receives information transmitted from the portable terminal 30. The power supply 27 supplies power to the elements 15 to 20 and 26 of the application device 10.
The computer 20 has an arbitrary internal configuration, and includes, for example, a central processing unit (CPU) 21, a memory 23, an information input unit 24, and an information output unit 25. The CPU 21 includes a control unit 22 and can control the application device 10. Specifically, the CPU 21 controls the position sensor 15, the pressure sensor 16, the acceleration sensor 17, the cosmetic sensor 18, the vibration motor 19, the communication module 26, and the power supply 27, and performs various processes such as information calculation, control, storage, transfer, input, output, transmission, and reception. The control unit 22 may be configured separately from the CPU 21.
The memory 23 constitutes a temporary storage device that stores information to be calculated when the CPU 21 executes processing. The application device 10 may also include auxiliary devices, not shown, such as a hard disk drive (HDD) and a module socket.
The information input unit 24 inputs information transmitted from the portable terminal 30 and received by the communication module 26 to the application device 10. The information output unit 25 outputs information such as detection results obtained by the application device 10. The information output from the information output unit 25 is transmitted to the portable terminal 30 via the communication module 26.
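As a non-limiting illustration of the kind of data the communication module 26 might carry from the application device 10 to the portable terminal 30, the following sketch bundles the sensor readings described above into a serializable record. The field names, units, and JSON encoding are assumptions for illustration only; the disclosure does not specify a payload format.

```python
# Hypothetical sketch of the sensor readings the applicator could send to
# the terminal via its communication module. All field and function names
# are assumptions; the patent does not specify a payload format.
import json
from dataclasses import dataclass, asdict

@dataclass
class ApplicatorReading:
    position: tuple[float, float]              # position sensor 15
    pressure_kpa: float                        # pressure sensor 16
    acceleration: tuple[float, float, float]   # acceleration sensor 17
    moisture_pct: float                        # cosmetic sensor 18 (sheet S moisture)
    timestamp_ms: int

def encode_reading(reading: ApplicatorReading) -> bytes:
    """Serialize one reading for transmission to the portable terminal 30."""
    return json.dumps(asdict(reading)).encode("utf-8")

# Example: a single reading as it might be transmitted
packet = encode_reading(ApplicatorReading(
    position=(0.42, 0.31), pressure_kpa=12.5,
    acceleration=(0.0, 0.1, 9.8), moisture_pct=63.0, timestamp_ms=1000))
```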
<Makeup Assisting Device>
The portable terminal 30, which is an example of the makeup assisting device according to the present embodiment, is communicably connected to the application device 10. The portable terminal 30 is a portable communication terminal (for example, a smartphone) having a computer function. The makeup assisting device according to the present embodiment is not limited to a portable terminal capable of wireless communication with the application device 10, and may be a personal computer or the like connected by wire. Furthermore, the makeup assisting device according to the present embodiment is not limited to a direct wired or wireless connection, and may be communicably connected to the application device 10 via a network line.
The portable terminal 30 includes a main body 31, a display 32, and a camera 33 (see fig. 6). The main body 31 incorporates a computer and a communication module, not shown, and performs various processes such as information calculation, control, storage, transfer, input, output, transmission, and reception. The main body 31 incorporates an auxiliary storage device (not shown) in which a program for implementing the makeup auxiliary device according to the present embodiment is stored so as to be readable by a computer.
The display 32 visually displays information such as characters and images output from a computer (not shown) in the main body 31. The camera 33 is a camera (or a front camera) having a lens 33A provided on the display 32 side of the portable terminal 30. The camera 33 is controlled by a computer in the main body 31, and takes an image of the user P.
The captured image of the user P is displayed on the display 32 as a user image 40. The user image 40 includes an area 41 formed of an image of the face F of the user P to which the cosmetic is applied. In addition, a part of the region 41 includes application regions 41A and 41B to which cosmetics are locally applied. A display field 80 (see fig. 6) for displaying a message to be described later is displayed on the lower portion of the display 32.
Fig. 7 is a flowchart for implementing the makeup assisting device according to embodiment 1. The makeup assisting device according to embodiment 1 is realized by the portable terminal 30, and guides application of the cosmetic. The portable terminal 30 has an area detection unit that detects the area to which the cosmetic is applied. Specifically, step S11 of detecting the area 41 (including the application areas 41A, 41B) to which the cosmetic is applied with the camera 33 of the portable terminal 30 is executed. The camera 33 of the portable terminal 30 is an example of the area detection unit constituting the makeup assisting device according to the present embodiment.
Specifically, in step S11 of fig. 7, the camera 33 of the portable terminal 30 captures an image of the user P, the coordinates of the area 41 included in the captured user image 40 are detected, and the position (or part) of the area 41 is specified based on the detected coordinates. The area 41 includes the application areas 41A and 41B, and the positions (or parts) of the application areas 41A and 41B are also determined (see fig. 6). In step S12 of fig. 7, the user image 40 of the user P is displayed on the display 32, and the area 41 including the application areas 41A and 41B is displayed on the display 32 (see fig. 6 and 7).
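The disclosure does not specify a detection algorithm for step S11. As one illustrative sketch, the following code uses OpenCV's stock Haar cascade face detector and derives left and right cheek boxes geometrically; the cheek proportions are assumptions.

```python
# A minimal sketch of step S11: detect the face region in a camera frame
# and derive left/right cheek boxes geometrically. OpenCV's Haar cascade
# is used here as a stand-in detector; the cheek proportions below are
# illustrative assumptions, not values from the disclosure.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_application_areas(frame):
    """Return (x, y, w, h) boxes for the face and both cheek areas."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                     # area 41: the whole face F
    cheek_w, cheek_h = w // 4, h // 5
    cheek_y = y + h // 2
    left_cheek = (x + w // 8, cheek_y, cheek_w, cheek_h)       # area 41A
    right_cheek = (x + 5 * w // 8, cheek_y, cheek_w, cheek_h)  # area 41B
    return {"face": (x, y, w, h), "41A": left_cheek, "41B": right_cheek}
```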
The target of the area 41 (application areas 41A, 41B) is arbitrary. In the makeup assisting device (portable terminal 30) according to the present embodiment, the surface F of the face to which the cosmetic is applied is the target of the area 41 (application areas 41A and 41B). Here, the surface of the face refers to the face centered on the chin, cheeks, and forehead (see fig. 6). In the present embodiment, the application area 41A corresponds to the left cheek of the user P, and the application area 41B corresponds to the right cheek of the user P (see fig. 6).
The makeup assisting device according to embodiment 1 further includes a display unit that displays information related to the area. Specifically, step S13 of displaying the information 50 related to the area 41 (application areas 41A, 41B) on the display 32 of the portable terminal 30 is executed (see fig. 7 and 8). The display 32 of the portable terminal 30 is an example of the display unit constituting the makeup assisting device according to the present embodiment.
Here, the information associated with the area (hereinafter referred to as area-related information) is information linked (or associated) with a predetermined area. Displaying information means presenting it in a way that allows it to be perceived or identified.
In embodiment 1, the information (area-related information) 50 related to the area 41 (application areas 41A, 41B) is information indicating the application state of the cosmetic in the area 41 (application areas 41A, 41B). Here, the information indicating the application state of the cosmetic (hereinafter referred to as guidance information) is information indicating whether or not the application operation of the cosmetic is appropriate and/or information teaching (or instructing) an appropriate application operation of the cosmetic. For example, the area-related information 50 is guidance information 51 and 52, arrows displayed in the application areas 41A and 41B that indicate appropriate application directions of the cosmetic (see fig. 8 and 9).
The display form of the guidance information 51 and 52 displayed as the area-related information 50 is not limited. In the makeup assisting device (portable terminal 30) according to the present embodiment, the area-related information 50 may be configured by information that can be recognized visually, audibly, tactilely, or the like. For example, as shown in fig. 8 and 9, visually recognizable information may be a sign such as an arrow indicating the application direction. Audibly recognizable information may be a sound, such as a dial sound or a vibration sound, indicating the application speed. Tactilely recognizable information may indicate the application pressure by vibration or the like.
In the present embodiment, it is preferable that the display unit (step S13) for displaying the area-related information 50 displays the area-related information 50 within the area 41 (application areas 41A, 41B) displayed in step S12 (see fig. 7). Here, displaying the area-related information in the area means that at least a part of the area-related information is displayed inside the displayed area.
In the present embodiment, for example, an arrow in the shape of an inverted S is displayed in the application area 41A (the application area corresponding to the left cheek of the user P) as the guidance information 51 indicating the application direction (see fig. 8). In addition, an S-shaped arrow is displayed in the application area 41B (the application area corresponding to the right cheek of the user P) as the guidance information 52 indicating the application direction (see fig. 9).
In the present embodiment, the guidance information 51, 52 displayed as the area-related information 50 is preferably information based on a predetermined application condition. Here, a predetermined application condition means an application condition set according to the state (for example, the part, the shape, and the like) of the region (application area).
The content of the application condition is arbitrary, but the application condition preferably includes at least one of the application range, the application direction, the application pressure, the application speed, and the application amount of the cosmetic in the area 41 (application areas 41A, 41B).
Here, the application range indicates a range covering at least a part of the area to which the cosmetic is applied. The application direction means the direction (or orientation) in which the cosmetic is applied. The application pressure means the pressure (or pressing force) applied to the application area when applying the cosmetic. The application speed means the speed at which the cosmetic is applied. The application amount indicates the amount of cosmetic required for application.
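As an illustrative sketch only, the application conditions listed above could be represented per region as follows; the field names and numeric ranges are assumptions, since the disclosure only names the quantities a condition may include.

```python
# A sketch of how the predetermined application conditions could be
# represented per region. Field names and numeric ranges are illustrative
# assumptions.
from dataclasses import dataclass

@dataclass
class ApplicationCondition:
    region_id: str                            # e.g. "41A" (left cheek)
    direction: str                            # e.g. "inverted-S" stroke pattern
    pressure_range_kpa: tuple[float, float]   # allowable application pressure
    speed_range_cm_s: tuple[float, float]     # allowable application speed
    amount_per_cm2_ml: float                  # unit application amount (embodiment 3)

CONDITIONS = {
    "41A": ApplicationCondition("41A", "inverted-S", (8.0, 15.0), (2.0, 6.0), 0.02),
    "41B": ApplicationCondition("41B", "S", (8.0, 15.0), (2.0, 6.0), 0.02),
}
```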
In the makeup assisting device (portable terminal 30) according to the present embodiment, it is preferable that the display of the information differs between the case where the application condition is not satisfied and the case where the application condition is satisfied. Here, the application of the cosmetic satisfying the application condition means that the application operation performed on the application area falls within the limits allowed by the guidance information displayed as the area-related information. A difference in display means that the two display modes can be distinguished.
The manner of obtaining the application operation for the area 41 (application areas 41A and 41B) is not particularly limited. For example, as described above, by actually applying the cosmetic to the area 41 (application areas 41A, 41B) using a device having a sensor function, a communication function, and the like, such as the application device 10, information obtained by measuring the application range, application direction, application pressure, application speed, application amount, and the like can be handled as the application operation.
In the present embodiment, in step S14 of fig. 7, the application operation obtained as such information is compared with the above-described predetermined application condition, and it is determined whether or not the application operation satisfies the application condition (see fig. 7). If the application operation satisfies the application condition, the area-related information 50 (guidance information 51, 52) displayed in the application area 41A is deleted (step S15 in fig. 7). In the present embodiment, deletion of the display of the area-related information 50 (guidance information 51, 52) indicates that the application operation performed on the area 41 (application areas 41A, 41B) was appropriate.
On the other hand, when the application operation does not satisfy the application condition, the display form of the area-related information 50 (guidance information 51, 52) displayed in the application area 41A is changed (step S16 in fig. 7). For example, the arrow is thickened when the application pressure is insufficient, and thinned when the application pressure is excessive. The application speed is indicated by a blinking arrow: if the application speed is too slow, the blinking speed of the arrow is increased, and if the application speed is too fast, the blinking speed is decreased.
If the application operation does not satisfy the application condition in step S16 of fig. 7 (if the display form of the area-related information 50 is changed), the process returns to step S14 of fig. 7, and it is determined again whether or not the application operation satisfies the application condition. Steps S14 and S16 are then repeated until the application operation satisfies the application condition.
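The following sketch illustrates the step S14 to S16 loop described above: the measured application operation is compared against the condition, and the arrow styling mirrors the rules in the text (thicker arrow for insufficient pressure, faster blinking for a slow stroke). All thresholds are assumptions.

```python
# A sketch of the S14-S16 loop: compare the measured application operation
# against the condition and adjust how the guidance arrow (51, 52) is drawn.
# Uses the ApplicationCondition fields sketched earlier; thresholds assumed.
def update_guidance(reading, cond):
    """Return None when the condition is met (S15), else an arrow style (S16)."""
    lo_p, hi_p = cond.pressure_range_kpa
    lo_v, hi_v = cond.speed_range_cm_s
    if lo_p <= reading["pressure"] <= hi_p and lo_v <= reading["speed"] <= hi_v:
        return None                          # S15: delete the guidance display
    style = {"thickness": 2, "blink_hz": 1.0}  # default arrow style
    if reading["pressure"] < lo_p:
        style["thickness"] = 4               # thicken arrow: press harder
    elif reading["pressure"] > hi_p:
        style["thickness"] = 1               # thin arrow: press more gently
    if reading["speed"] < lo_v:
        style["blink_hz"] = 3.0              # blink faster: speed up
    elif reading["speed"] > hi_v:
        style["blink_hz"] = 0.5              # blink slower: slow down
    return style                             # S16: change the display form
```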
In step S15 of fig. 7, when the application operation satisfies the application condition (when the display of the area-related information 50 is deleted), the process proceeds to step S17, and it is determined whether or not another application area is present. If there is another application area (for example, the application area 41B remaining after the application area 41A), steps S13 to S17 are repeated for that area (see step S18 in fig. 7 and fig. 9). On the other hand, if it is determined in step S17 that there is no other application area, the implementation of the makeup assisting device according to embodiment 1 is terminated (see fig. 6 and 7).
In the makeup assisting device (portable terminal 30) according to embodiment 1, when the application of the cosmetic does not satisfy the application condition, the area-related information 50 (guidance information 51, 52) is displayed in the area 41 (application areas 41A, 41B) (steps S14 to S16).
In the present embodiment, for example, the area-related information 50 (guidance information 51, 52) is displayed in the area 41 (application areas 41A, 41B) when no application operation has yet been performed on the area 41 (application areas 41A, 41B), or remains displayed as it is when an inappropriate application operation has been performed on the area 41 (application areas 41A, 41B) (see fig. 8).
In the makeup assisting device (portable terminal 30) according to the present embodiment, as described above, the area 41 (application areas 41A and 41B) is detected, the information (area-related information) 50 related to the area 41 (application areas 41A and 41B) is displayed, and the user P can perform the application operation (or makeup operation) in accordance with the displayed area-related information 50. Further, by displaying information (guidance information) indicating the application state of the cosmetic as the area-related information 50, the user P can apply the cosmetic appropriately to the area 41 (application areas 41A, 41B) by performing the application operation in accordance with the area-related information 50 (see fig. 8 and 9).
As described above, by displaying the area-related information 50 in the area 41 (application areas 41A and 41B), the user P can easily identify the area 41 (application areas 41A and 41B) where the application operation is to be performed, in accordance with the guidance information 51 and 52 (see fig. 8 and 9).
The user P can synchronize the application operation performed on the area 41 (application areas 41A and 41B) with the guidance information 51 and 52 displayed in the area 41 (application areas 41A and 41B). Thus, in the present embodiment, appropriate application of the cosmetic corresponding to the displayed area 41 (application areas 41A, 41B) can be performed.
In the present embodiment, as described above, by displaying information based on predetermined application conditions as the area-related information 50 (guidance information 51, 52), the user P can apply the cosmetic appropriately according to the state of the area 41 (application areas 41A, 41B) (see fig. 8 and 9). Thus, in the present embodiment, appropriate cosmetic application can be performed for each of the areas 41 (application areas 41A and 41B).
As described above, since the predetermined application condition includes at least one of the application range, the application direction, the application pressure, the application speed, and the application amount, the objectivity of the guidance information 51 and 52 displayed as the area-related information 50 can be ensured. Thus, the user P can accurately apply the appropriate cosmetic to the area 41 (application areas 41A, 41B) by performing the application operation in accordance with the area-related information 50 (guidance information 51, 52).
In the present embodiment, as described above, by making the display of the area-related information 50 (guidance information 51, 52) differ between the case where the application of the cosmetic does not satisfy the application condition and the case where it does, the user P can recognize whether the application operation on the area 41 (application areas 41A, 41B) is appropriate or inappropriate. Thereby, the user P can learn (or grasp) an appropriate application operation for the area 41 (application areas 41A, 41B).
In the present embodiment, as described above, when the application of the cosmetic does not satisfy the application condition, the area-related information 50 (guidance information 51, 52) is displayed in the area 41 (application areas 41A, 41B), so that the user P can recognize whether the application operation on the area 41 (application areas 41A, 41B) is appropriate or inappropriate.
In the present embodiment, as described above, by deleting the display of the area-related information 50 (guidance information 51, 52) when the application of the cosmetic satisfies the application condition, the user P can recognize that the application operation on the area 41 (application areas 41A, 41B) was appropriate. In addition, since the user P is motivated to have the display of the area-related information 50 (guidance information 51, 52) cancelled, appropriate application of the cosmetic to the area 41 (application areas 41A, 41B) can be taught (or guided) with high accuracy. This enables the user P to accurately learn (or grasp) an appropriate application operation for the area 41 (application areas 41A and 41B).
Further, in the present embodiment, by setting the target of the area 41 (application areas 41A and 41B) to the face surface F, the user P can perform the application operation on the face surface F in accordance with the displayed area-related information 50. Further, the user P can apply the appropriate cosmetic to the face surface F by performing the application operation in accordance with the area-related information 50.
In the present embodiment, by using a skin-care cosmetic as the cosmetic applied to the area 41 (application areas 41A, 41B), the user P can perform skin care in accordance with the displayed area-related information 50. Further, the user P can perform appropriate skin care on the area 41 (application areas 41A, 41B) by performing the application operation in accordance with the area-related information 50.
Fig. 10 is a flowchart for implementing the makeup assisting device according to embodiment 2. In embodiment 2, the same or corresponding reference numerals are used for portions common to embodiment 1, and the description thereof may be omitted.
The makeup assisting device (portable terminal 30) according to embodiment 2 displays an overlay 60 in the area 41 as the area-related information (guidance information). Specifically, the portable terminal 30 is configured to display the overlay 60 in the area 41 (application areas 41A and 41B) (step S22 in fig. 10). The overlay 60 shown in embodiment 2 is an example of the area-related information 50 (guidance information 51, 52).
Here, an overlay is a translucent image that covers at least a part of the displayed area. In the present embodiment, the overlay 60 is displayed in a grid pattern so as to cover the area 41 (including the application areas 41A and 41B) (see fig. 11). The area in which the overlay is displayed is arbitrary; for example, the overlay may be displayed in an area where the application operation has not been performed or an area where the application operation was not performed appropriately. Alternatively, the overlay may be displayed in an area where the application operation was performed appropriately.
In embodiment 2, it is preferable to delete the overlay 60 when the application of the cosmetic in the area 41 where the overlay 60 is displayed satisfies the application condition. Specifically, when the application operation in the application area 41A satisfies the application condition, the overlay 60 in the application area 41A is deleted (see steps S24 and S25 in fig. 10, fig. 12, and fig. 13). In fig. 12 and 13, the application areas 41A and 41B within the area 41 from which the overlay 60 has been removed represent areas where the application operation was appropriate.
On the other hand, when the application operation in the application area 41A does not satisfy the application condition, the cover layer 60 in the application area 41A is displayed as it is (see step S26 and fig. 11 of fig. 10). In fig. 11, the application regions 41A and 41B in the region 41, in which the cover layer 60 is displayed, are regions where the application operation is not performed or regions where the application operation is not performed appropriately.
In addition, when the application operation in the application area 41A does not satisfy the application condition, a cover layer different from the cover layer 60 may be displayed. For example, when the color of the cover layer 60 is transparent purple, the color is displayed by decreasing the purple when the application pressure is insufficient in the application operation of the application area 41A, or by changing the purple to light blue. When the application pressure is excessive, the purple color of the coating layer 60 is increased or displayed in black instead of the purple color.
If the application operation does not satisfy the application condition in step S26 of fig. 10 (if the blanket 60 of the application area 41A is displayed as it is), the process returns to step S24, and it is determined again whether or not the application operation satisfies the application condition. Then, step S24 and step S26 are repeated until the smearing operation satisfies the smearing condition.
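As an illustrative sketch of the overlay display, the following code alpha-blends a translucent tint over each application area that has not yet satisfied the application condition, so that removing a box from the pending set corresponds to deleting the overlay. The color and alpha value are assumptions.

```python
# A sketch of the embodiment-2 overlay: tint every application area still
# awaiting a proper stroke, and leave completed areas untinted. The BGR
# purple and alpha are illustrative assumptions.
import cv2

def draw_overlays(frame, pending_areas, color=(128, 0, 128), alpha=0.35):
    """Tint every (x, y, w, h) box in pending_areas; deleted boxes show through."""
    out = frame.copy()
    for (x, y, w, h) in pending_areas:
        region = out[y:y + h, x:x + w]
        tint = region.copy()
        tint[:] = color                      # translucent purple, per the text
        out[y:y + h, x:x + w] = cv2.addWeighted(region, 1 - alpha, tint, alpha, 0)
    return out
```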
On the other hand, when the application operation satisfies the application condition in step S25 of fig. 10 (when the overlay 60 in the application area 41A is deleted), the process proceeds to step S27 of fig. 10 to determine whether or not another application area is present. If another application area is present (for example, the application area 41B remaining after the application area 41A), steps S23 to S27 are repeated for that area (see step S28 in fig. 10 and fig. 13). On the other hand, if it is determined in step S27 of fig. 10 that there is no other application area, the implementation of the makeup assisting device according to embodiment 2 is terminated (see fig. 6 and 10).
In addition, when the overlay 60 is displayed in embodiment 2, the application direction (arrow) and the like constituting the area-related information 50 (guidance information 51, 52) in embodiment 1 may be displayed at the same time.
In embodiment 2, as described above, by displaying the overlay 60 in the area 41 (application areas 41A and 41B), the user P can recognize that, in the area 41 (application areas 41A and 41B) where the overlay 60 is displayed, the application operation has not been performed, or was inappropriate (see fig. 11).
In embodiment 2, by deleting the overlay 60 when the application of the cosmetic satisfies the application condition in the area 41 (application areas 41A and 41B) where the overlay 60 is displayed, the user P can recognize that the application operation was appropriate in the area from which the overlay 60 has been deleted (the application area 41A shown in fig. 12, and the application areas 41A and 41B shown in fig. 13). In addition, the user P is motivated to have the displayed overlay 60 cancelled, so that appropriate application of the cosmetic to the area 41 (application areas 41A, 41B) can be taught (or guided) with high accuracy.
Further, in embodiment 2, by displaying within the area 41 both the application area 41B in which the overlay 60 (overlay 61) remains and the application area 41A from which a part of the overlay 60 has been removed, the user P can distinguish the application area 41A, where the application operation was appropriate, from the application area 41B, where the application operation was inappropriate or has not yet been performed (see fig. 12). Thereby, the user P can accurately learn (or grasp) an appropriate application operation for the area 41 (application areas 41A, 41B).
Fig. 14 is a flowchart for implementing the makeup assisting device according to embodiment 3. In embodiment 3, the same or corresponding reference numerals are given to portions common to embodiment 1, and the description thereof may be omitted.
In the makeup assisting device (portable terminal 30) according to embodiment 3, when the application condition includes the application amount, the application amount is preferably calculated in accordance with the area of the region. In the makeup assisting device (portable terminal 30) according to embodiment 3, the area of the detected region 41 (application areas 41A and 41B) is calculated (steps S31 and S32). Then, the application amount corresponding to the calculated area of the region 41 (application areas 41A, 41B) (hereinafter sometimes referred to as the area application amount) is calculated (step S33 in fig. 14).
Here, calculating the application amount in accordance with the area of the region means calculating the application amount based on the area of the region. For example, the application amount per unit area (unit application amount) is determined in advance, and the total application amount is calculated by multiplying the unit application amount by the area of the region 41 (application areas 41A and 41B) to which the cosmetic is applied.
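The unit-amount calculation described above can be sketched as follows; the pixel-to-centimeter conversion factor is an assumption that would in practice come from camera calibration.

```python
# A sketch of step S33: multiply the unit application amount by the
# measured area of each region to get the area application amount.
# The px_per_cm factor is an assumed placeholder.
def area_application_amount(region_px, unit_amount_ml_per_cm2, px_per_cm=40.0):
    """region_px: (x, y, w, h) in pixels; returns the required amount in ml."""
    _, _, w, h = region_px
    area_cm2 = (w / px_per_cm) * (h / px_per_cm)
    return area_cm2 * unit_amount_ml_per_cm2

# Example: a 160x120 px cheek box (4 cm x 3 cm) at 0.02 ml/cm2 needs 0.24 ml
needed = area_application_amount((0, 0, 160, 120), 0.02)
```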
Further, in embodiment 3, it is determined whether or not the usage amount of the cosmetic satisfies the application amount (area application amount) corresponding to the area 70 (areas 71, 72) of the region 41 (application areas 41A, 41B) (step S34). If the usage amount does not satisfy the area application amount, a message 90 (messages 91 and 92) is displayed (see step S35 in fig. 14, fig. 15, and fig. 16).
Here, the usage amount indicates the amount of the cosmetic that can be used for application to the area (application area) (hereinafter referred to as the usable amount) or the amount of the cosmetic actually used for application (hereinafter referred to as the used amount). Such an amount can be converted, for example, from the moisture content of the cosmetic impregnated in the sheet S placed on the application surface 13 of the application device 10. The moisture content of the cosmetic impregnated in the sheet S can be detected by the cosmetic sensor 18 of the application device 10.
A message is information content expressed in language and symbols. In the present embodiment, for example, the moisture content of the cosmetic impregnated in the sheet S placed on the application surface 13 of the application device 10 before use or after use is detected as the usable amount and compared with the area application amount. If it is determined that the detected usable amount (the moisture content of the cosmetic before or after use) does not satisfy the area application amount, the characters "Cosmetic shortage." are displayed as a message 91 in the display field 80 shown on the lower portion of the display 32 (see fig. 15).
In addition, the amount obtained by subtracting the moisture content of the cosmetic after use from the moisture content of the cosmetic before use is accumulated as the used amount and compared with the area application amount. If it is determined, based on the application amount corresponding to the area of the region where the application operation has not yet been performed, that the cosmetic will run short (the usable amount will not satisfy the area application amount in the future), the characters "Cosmetic will soon become insufficient." are displayed as a message 92 in the display field 80 (see fig. 16).
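The following sketch illustrates steps S34 and S35: the used amount is accumulated from the moisture readings before and after each stroke and compared with the area application amount to choose which message 90 to show. The conversion factor from moisture percentage to milliliters is an assumed placeholder.

```python
# A sketch of steps S34-S35: accumulate the cosmetic consumed (moisture
# before minus moisture after each stroke), compare it with the area
# application amount, and pick the message 90 for the display field 80.
# The moisture-to-ml conversion factor is an assumed placeholder.
class UsageTracker:
    def __init__(self, ml_per_moisture_pct=0.05):
        self.used_ml = 0.0
        self.k = ml_per_moisture_pct         # assumed conversion factor

    def record_stroke(self, moisture_before_pct, moisture_after_pct):
        self.used_ml += (moisture_before_pct - moisture_after_pct) * self.k

    def message(self, usable_ml, required_ml, remaining_required_ml):
        if usable_ml < required_ml:
            return "Cosmetic shortage."                       # message 91
        if usable_ml - self.used_ml < remaining_required_ml:
            return "Cosmetic will soon become insufficient."  # message 92
        return None                          # S36: no message, or delete it
```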
In addition to the messages 91 and 92, or instead of them, the characters "Please replenish the cosmetic." may be displayed in the display field 80 as a message (not shown). By confirming these messages (messages 91, 92, and the like), the user P can replenish the sheet S with the cosmetic.
Further, the used amount is compared with the area application amount (the application amount corresponding to the area of the region where the application operation has been performed). When the used amount is smaller than the area application amount, the characters "Application amount is insufficient." are displayed in the display field 80 as a message (not shown). In addition to this message, or instead of it, the characters "Please continue applying." may be displayed in the display field 80 as a message (not shown). By confirming these messages, the user P can continue applying the cosmetic.
If the message 90 is displayed in step S35 of fig. 14 (if the usage amount of the cosmetic does not satisfy the area application amount), the process returns to step S34, and it is determined again whether or not the usage amount satisfies the area application amount. Steps S34 and S35 are then repeated until the usage amount satisfies the area application amount.
On the other hand, when the usage amount satisfies the area application amount in step S34 of fig. 14, the message 90 is not displayed in step S36 of fig. 14. If the message 90 has already been displayed, it is deleted in step S36 of fig. 14.
If the message 90 is not displayed, or has been deleted, in step S36 of fig. 14 (if the usage amount satisfies the area application amount), the process proceeds to step S37 of fig. 14, and it is determined whether or not an unprocessed application area (an application area on which the application operation has not yet been performed) is present. If an unprocessed application area is present, the process returns to step S34, and it is determined again whether or not the usage amount satisfies the area application amount. Steps S34 to S37 are then repeated until the usage amount satisfies the area application amount. On the other hand, if it is determined in step S37 of fig. 14 that there is no unprocessed application area, the implementation of the makeup assisting device according to embodiment 3 is terminated (see fig. 6 and 14).
In addition, when the area application amount is calculated (or a message is displayed) in embodiment 3, the application direction (arrow) and the like constituting the area-related information 50 (guidance information 51, 52) in embodiment 1 may be displayed at the same time. The overlay 60 in embodiment 2 may also be displayed, together with or instead of the application direction (arrow) and the like constituting the area-related information 50 (guidance information 51, 52) in embodiment 1.
In embodiment 3, as described above, by calculating the application amount (area application amount) corresponding to the area 70 (areas 71 and 72) of the region 41 (application areas 41A and 41B), the objectivity of the application amount included in the application condition on which the area-related information 50 (guidance information 51, 52) is based can be ensured. Thus, the user P can accurately grasp the appropriate amount of cosmetic to apply to the area 41 (application areas 41A and 41B) when performing the application operation in accordance with the area-related information 50 (guidance information 51, 52).
Further, in embodiment 3, by displaying the message 90 (messages 91, 92) when the usage amount of the cosmetic does not satisfy the area application amount, the user P can recognize that the amount of cosmetic required for the area application amount is insufficient or will soon become insufficient. The user P is thereby motivated to prepare or replenish the cosmetic so that the usage amount satisfies the area application amount, so that the appropriate application amount of the cosmetic for the area 41 (application areas 41A, 41B) can be taught (or guided) with high accuracy. In addition, the user P can accurately learn (or grasp) the appropriate application amount of the cosmetic for the area 41 (application areas 41A, 41B).
The makeup assisting system 100 according to the present embodiment, as a makeup assisting system including the application device (application device 10) and the makeup assisting device (portable terminal 30), can provide the same effects as those obtained when the makeup assisting device is implemented.
Specifically, as described above, by detecting the area 41 (application areas 41A and 41B) and displaying the information (area-related information) 50 related to the area 41 (application areas 41A and 41B), the user P can perform the application operation (or makeup operation) in accordance with the displayed area-related information 50 (see fig. 8 and 9). By displaying information (guidance information) indicating the application state of the cosmetic as the area-related information 50, the user P can apply the cosmetic appropriately to the area 41 (application areas 41A and 41B) by performing the application operation in accordance with the area-related information 50 (see fig. 8 and 9).
<Makeup Assisting Method>
The makeup assisting method according to the present embodiment is a makeup assisting method for guiding application of a cosmetic, and includes a step of detecting an area to which the cosmetic is applied and a step of displaying information associated with the area, the information being information indicating the application state of the cosmetic in the area. Specifically, the makeup assisting method according to the present embodiment is executed by implementing the makeup assisting device described above (see steps S11 to S18 in fig. 7).
According to the makeup assisting method of the present embodiment, the same effects as those obtained when the makeup assisting device is implemented can be obtained. Specifically, as described above, by detecting the area 41 (application areas 41A and 41B) and displaying the information (area-related information) 50 related to the area 41 (application areas 41A and 41B), the user P can perform the application operation (or makeup operation) in accordance with the displayed area-related information 50 (see fig. 8 and 9). By displaying information indicating the application state of the cosmetic (the area-related information 50 such as the guidance information 51 and 52), the user P can apply the cosmetic appropriately to the area 41 (application areas 41A and 41B) by performing the application operation in accordance with the area-related information 50 (see fig. 8 and 9).
<Makeup Assisting Program>
The makeup assisting program according to the present embodiment is a makeup assisting program that guides application of a cosmetic, and causes a computer to detect an area to which the cosmetic is applied and to display information associated with the area, the information being information indicating the application state of the cosmetic in the area. Specifically, the makeup assisting program according to the present embodiment is a program for implementing the makeup assisting device described above (see steps S11 to S18 in fig. 7).
The makeup assisting program according to the present embodiment is installed in an auxiliary storage device (not shown) built into the main body 31 of the portable terminal 30. The manner of installing the program is not limited. For example, a distributed storage medium (not shown) can be set in a drive device (not shown), and the program recorded in the storage medium can be read and installed. Alternatively, a program downloaded over a network line may be installed via an interface device (not shown).
According to the makeup assisting program of the present embodiment, the same effects as those obtained when the makeup assisting device is implemented can be obtained. Specifically, as described above, by detecting the area 41 (application areas 41A and 41B) and displaying the information (area-related information) 50 related to the area 41 (application areas 41A and 41B), the user P can perform the application operation (or makeup operation) in accordance with the displayed area-related information 50. By displaying information indicating the application state of the cosmetic (the area-related information 50 such as the guidance information 51 and 52), the user P can apply the cosmetic appropriately to the area 41 (application areas 41A and 41B) by performing the application operation in accordance with the area-related information 50 (see fig. 8 and 9).
While the embodiments of the present invention have been described above, the present invention is not limited to the specific embodiments, and various modifications and changes can be made within the scope of the invention described in the claims.
The present application claims priority based on Japanese Patent Application No. 2019-236608, filed on December 26, 2019, the entire contents of which are incorporated herein by reference.
Description of the reference symbols
100 makeup assisting system
10 application device
11 main body
12 handle
12A push-type switch (button)
13 application surface
14 holding tool
S sheet
15 position sensor
16 pressure sensor
17 acceleration sensor
18 cosmetic sensor
19 vibration motor
20 computer
21 central processing unit (CPU)
22 control unit
23 memory
24 information input unit
25 information output unit
26 communication module
27 power supply
30 portable terminal (makeup assisting device)
31 main body
32 display (display unit)
33 camera (area detection unit)
40 user image
41 region
41A, 41B application areas
50 area-related information
51, 52 guidance information
60, 61, 62 overlay
70, 71, 72 area
80 display field
90, 91, 92 message
P user
F face surface

Claims (16)

1. A makeup assisting device for guiding application of a cosmetic, comprising:
an area detection unit that detects an area to which the cosmetic is applied; and
a display unit that displays information associated with the area,
wherein the information is information indicating the application state of the cosmetic in the area.
2. The makeup assisting device according to claim 1,
wherein the area is displayed on the display unit, and the information is displayed in the area.
3. The makeup assisting device according to claim 1 or 2,
wherein the information is information based on a predetermined application condition.
4. The makeup assisting device according to claim 3,
wherein the application condition includes at least one of an application range, an application direction, an application pressure, an application speed, and an application amount of the cosmetic in the area.
5. The makeup assisting device according to claim 3 or 4,
wherein the display of the information differs between a case where the application of the cosmetic does not satisfy the application condition and a case where the application of the cosmetic satisfies the application condition.
6. The makeup assisting device according to any one of claims 3 to 5,
wherein the information is displayed within the area when the application of the cosmetic does not satisfy the application condition.
7. The makeup assisting device according to claim 6,
wherein, in the area where the information is displayed, the display of the information is deleted when the application of the cosmetic satisfies the application condition.
8. The makeup assisting device according to any one of claims 1 to 7,
wherein the information includes an overlay displayed in the area.
9. The makeup assisting device according to any one of claims 3 to 7,
wherein the information includes an overlay displayed in the area, and
in the area where the overlay is displayed, the overlay is deleted when the application of the cosmetic satisfies the application condition.
10. The makeup assisting device according to claim 4,
wherein the application amount is calculated in accordance with the area of the region.
11. The makeup assisting device according to claim 4 or 10,
wherein a message is displayed when a usage amount of the cosmetic does not satisfy the application amount.
12. The makeup assisting device according to any one of claims 1 to 11,
wherein the area is a surface of a face to which the cosmetic is applied.
13. The makeup assisting device according to any one of claims 1 to 12,
wherein the cosmetic is a skin-care cosmetic.
14. A makeup assisting method for guiding application of a cosmetic, comprising:
a step of detecting an area to which the cosmetic is applied; and
a step of displaying information associated with the area,
wherein the information is information indicating the application state of the cosmetic in the area.
15. A makeup assisting program for guiding application of a cosmetic, the program causing a computer to execute:
detecting an area to which the cosmetic is applied; and
displaying information associated with the area,
wherein the information is information indicating the application state of the cosmetic in the area.
16. A makeup assisting system for guiding application of a cosmetic, comprising:
an application device that applies the cosmetic; and
a makeup assisting device communicably connected with the application device,
wherein the makeup assisting device comprises:
an area detection unit that detects an area to which the cosmetic is applied; and
a display unit that displays information associated with the area,
wherein the information is information indicating the application state of the cosmetic in the area.
CN202080087543.5A 2019-12-26 2020-12-14 Makeup assisting device, makeup assisting method, makeup assisting program, and makeup assisting system Pending CN114867383A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019-236608 2019-12-26
JP2019236608 2019-12-26
PCT/JP2020/046541 WO2021131852A1 (en) 2019-12-26 2020-12-14 Makeup assistance device, makeup assistance method, makeup assistance program, and makeup assistance system

Publications (1)

Publication Number Publication Date
CN114867383A 2022-08-05

Family

ID=76574427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080087543.5A Pending CN114867383A (en) 2019-12-26 2020-12-14 Makeup assisting device, makeup assisting method, makeup assisting program, and makeup assisting system

Country Status (3)

Country Link
JP (1) JPWO2021131852A1 (en)
CN (1) CN114867383A (en)
WO (1) WO2021131852A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7426362B2 (en) 2021-09-10 2024-02-01 花王株式会社 Makeup motion support system
WO2023038136A1 (en) * 2021-09-13 2023-03-16 ヤーマン株式会社 Information processing device, program, and information processing method
WO2024034537A1 (en) * 2022-08-10 2024-02-15 株式会社 資生堂 Information processing device, information processing method, and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104203042A (en) * 2013-02-01 2014-12-10 松下电器产业株式会社 Makeup application assistance device, makeup application assistance method, and makeup application assistance program
JP2014166218A (en) * 2013-02-28 2014-09-11 Panasonic Corp Makeup supporting device, makeup supporting method, and makeup supporting program
CN104822292A (en) * 2013-08-30 2015-08-05 松下知识产权经营株式会社 Makeup assistance device, makeup assistance system, makeup assistance method, and makeup assistance program
WO2016158729A1 (en) * 2015-03-27 2016-10-06 株式会社メガチップス Makeup assistance system, measurement device, portable terminal device, and program
US20180033205A1 (en) * 2016-08-01 2018-02-01 Lg Electronics Inc. Mobile terminal and operating method thereof
WO2018117020A1 (en) * 2016-12-20 2018-06-28 株式会社資生堂 Application control device, application control method, program, and recording medium

Also Published As

Publication number Publication date
WO2021131852A1 (en) 2021-07-01
JPWO2021131852A1 (en) 2021-07-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination