CN110025116B - Notification device, notification method, and recording medium - Google Patents

Notification device, notification method, and recording medium

Info

Publication number
CN110025116B
CN110025116B (application CN201910025846.3A)
Authority
CN
China
Prior art keywords
notification
unit
information
user
makeup
Prior art date
Legal status
Active
Application number
CN201910025846.3A
Other languages
Chinese (zh)
Other versions
CN110025116A (en)
Inventor
滨冈奈都美
Current Assignee
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date
Filing date
Publication date
Priority claimed from JP2018231743A (JP7225749B2)
Application filed by Casio Computer Co Ltd
Publication of CN110025116A
Application granted
Publication of CN110025116B

Classifications

    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D42/00 Hand, pocket, or shaving mirrors
    • A45D42/08 Shaving mirrors
    • A45D42/10 Shaving mirrors illuminated
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005 Other cosmetic or toiletry articles for selecting or displaying personal cosmetic colours or hairstyle
    • A45D44/02 Furniture or other equipment specially adapted for hairdressers' rooms and not covered elsewhere
    • A45D44/04 Special adaptations of portable frames or racks
    • A45D2044/007 Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • G06F9/542 Event management; Broadcasting; Multicasting; Notifications
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/162 Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders

Abstract

The invention provides a notification device, a notification method, and a recording medium that enable a user to more appropriately recognize changes in his or her own state. The notification device (1) includes a physical condition information acquisition unit (111) or a makeup information acquisition unit (115), a light emitting unit (17) and a display unit (19) serving as notification units, a control unit (114), and a determination unit (112). The physical condition information acquisition unit (111) or the makeup information acquisition unit (115) acquires face information. The light emitting unit (17) and the display unit (19), serving as notification units, perform notification based on the face information. The determination unit (112) determines a change between the face information before notification and the face information after notification. The control unit (114) controls the notification by the light emitting unit (17) and the display unit (19), serving as notification units, based on the determination result of the determination unit (112).

Description

Notification device, notification method, and recording medium
Technical Field
The invention relates to a notification device, a notification method and a recording medium.
Background
In recent years, techniques have emerged that acquire information indicating a user's state (for example, a state related to physical condition or a state related to makeup) and make various uses of the acquired information.
For example, patent document 1 discloses a configuration in which information on a user's face is acquired and a notification based on the acquired face information is presented to the user.
Documents of the prior art
Patent literature
Patent document 1: JP 2014-166218 publication
However, conventional techniques such as that disclosed in patent document 1 merely notify the user of his or her state at a single point in time. It may therefore be difficult for the user to recognize changes in that state, for example whether the expression has been improved by massage or the like, or whether makeup has been applied appropriately.
Disclosure of Invention
The present invention has been made in view of such circumstances, and an object of the invention is to enable a user to more appropriately recognize changes in his or her own state.
In order to achieve the above object, a notification device according to an aspect of the present invention includes: a face information acquisition unit that acquires face information; a notification unit that performs notification based on the face information; a determination unit that determines a change between the face information before notification and the face information after notification; and a control unit that controls the notification by the notification unit based on a determination result by the determination unit.
ADVANTAGEOUS EFFECTS OF INVENTION
According to the present invention, the user can be made to recognize the change in the state of the user more appropriately.
Drawings
Fig. 1 is a configuration diagram showing an external configuration of a front surface of a notification device according to an embodiment of the present invention.
Fig. 2 is a structural diagram showing an external configuration of a side surface of the notification device.
Fig. 3 is a block diagram showing a hardware configuration of the notification device.
Fig. 4 is a functional block diagram showing, among the functional configurations of the notification device, the functional configuration for executing the physical condition notification process.
Fig. 5 is a flowchart illustrating the flow of the physical condition notification process executed by the notification device.
Fig. 6 is a pictorial diagram illustrating transitions of the mirror unit and the notification units while the notification device executes the physical condition notification process.
Fig. 7 is a functional block diagram showing a functional configuration for executing the makeup notifying process among the functional configurations of the notifying device.
Fig. 8 is a flowchart illustrating a flow of the makeup notifying process executed by the notifying device.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings.
The notification device 1 according to one embodiment of the present invention is configured as a self-standing mirror that the user can carry. The notification device 1 acquires physical condition information of the user facing the mirror, and controls its notification units to present support information for improving that physical condition information. The notification device 1 further determines whether the physical condition information has improved after the support information was presented.
From the support information, the user can grasp a method for improving his or her physical condition. After determining whether the physical condition of the user has actually improved, the notification device 1 can perform arbitrary further processing. That is, the notification device 1 enables the user to recognize his or her own physical condition information more appropriately.
[ Appearance structure ]
Fig. 1 is a configuration diagram showing the external configuration of the front surface of the notification device 1 according to an embodiment of the present invention. Fig. 2 is a configuration diagram showing the external configuration of a side surface of the notification device 1. The front surface of the notification device 1 is formed, for example, in the A4 size specified in ISO (International Organization for Standardization) 216.
As shown in fig. 1 and 2, the notification device 1 includes a mirror portion 30 and a leg portion 31. The mirror portion 30 is a mirror having a reflection surface.
The leg portion 31 and the hinge portion 32 form a mechanism that allows the notification device 1 to stand on its own. The leg portion 31 is engaged with the mirror portion 30 via the hinge portion 32 so that it can pivot back and forth.
As shown in fig. 2(A), when the user carries the notification device 1, the side surface of the leg portion 31 can be folded flush with the side surface of the mirror portion 30, giving a compact shape convenient for transport. On the other hand, as shown in fig. 2(B), when the user places the notification device 1 on a desk or the like, the user can make it stand on its own by pivoting the leg portion 31 about the hinge portion 32. The hinge portion 32 holds the leg portion 31 at a predetermined angle so that the notification device 1 can stand by itself.
As shown in fig. 1, the notification device 1 further includes an imaging unit 16, a light emitting unit 17, an input unit 18, and a display unit 19.
The imaging unit 16 captures, as a subject, the user facing the mirror unit 30 while the notification device 1 is in use. The imaging unit 16 is disposed at a position where it can capture a face image of the user facing the mirror unit 30.
The light emitting unit 17 emits light to illuminate the user facing the mirror unit 30. Illuminated by the light emitting unit 17, the notification device 1 functions as an illuminated mirror (a so-called actress mirror). The light emitting unit 17 also serves as a notification unit in the physical condition notification process described later.
Light emitting units 17 are disposed at a plurality of positions along both ends of the mirror unit 30. For legibility, only one light emitting unit 17 is labeled with a reference numeral in fig. 1; the others are left unlabeled.
The input unit 18 is a part that receives an operation input from a user. In fig. 1, the input unit 18a, the input unit 18b, the input unit 18c, and the input unit 18d are illustrated as the input unit 18 corresponding to the operation content. Here, the input unit 18a receives an operation for switching to the makeup mode. The input unit 18b receives an operation for switching to the care mode. Further, the input unit 18c receives an operation for switching to a setting mode in which various settings are performed. Further, the input unit 18d receives an operation for switching on/off of the power supply of the notification device 1.
The display unit 19 serves as a notification unit in the physical condition notification process described later. The display unit 19 displays messages made up of characters, images, and the like. The display unit 19 may also display the face image of the user facing the mirror unit 30 captured by the imaging unit 16. In the notification device 1, the reflection surface of the mirror portion 30 and the display surface of the display unit 19 are superimposed in the viewing direction of the user facing the mirror portion 30, so that both can be viewed simultaneously.
For example, the display unit 19, formed of a liquid crystal display, is placed parallel to and behind the mirror unit 30, which is formed of a half mirror, as seen in the viewing direction.
With such a configuration, the user can visually recognize the face of the user reflected by the mirror portion 30 and the information displayed on the display portion 19 at the same time. In the example of fig. 1, the display area of the display unit 19 is provided on the upper part of the screen, and the reflection area of the mirror unit 30 is provided on the lower part of the screen.
The external appearance structure of the notification device 1 is described above. However, this configuration is merely an example, and the configuration of the external appearance of the notification device 1 is not limited to this example.
For example, the imaging unit 16 is disposed above the mirror unit 30 in fig. 1, but it may instead be disposed below the mirror unit 30. Like the display unit 19, the imaging unit 16 may also be placed parallel to and behind the half-mirror mirror unit 30 as seen in the viewing direction.
For example, the light emitting unit 17 may be disposed above or below the mirror unit 30, or may be disposed around the entire periphery of the mirror unit 30. Further, for example, the number or arrangement of the input units 18 may be changed. For example, a part of the display unit 19 may be configured as a touch panel, and the input unit 18 and the display unit 19 may be integrally configured.
Further, for example, the display area realized by the display unit 19 may be disposed on the upper portion of the screen or may be disposed at another position. For example, it is conceivable that the user's face is reflected on the central portion of the mirror portion 30, and the display region is disposed around the central portion.
Further, for example, the mirror unit 30 may be disposed in a part of the front surface of the notification device 1, and the display unit 19 may be disposed in the other part of the front surface. That is, the mirror portion 30 and the display portion 19 do not necessarily have to be arranged so as to overlap.
[ Hardware configuration ]
Fig. 3 is a block diagram showing the hardware configuration of the notification device 1.
As shown in fig. 3, the notification device 1 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a bus 14, an input/output interface 15, an image pickup Unit 16, a light emitting Unit 17, an input Unit 18, a display Unit 19, a storage Unit 20, and a driver 21.
The CPU11 executes various processes in accordance with a program recorded in the ROM12 or a program loaded from the storage unit 20 into the RAM 13.
The RAM 13 also stores, as appropriate, data and the like necessary for the CPU 11 to execute the various processes.
The CPU11, ROM12, and RAM13 are connected to each other via the bus 14. An input/output interface 15 is also connected to the bus 14. The input/output interface 15 is connected to an imaging unit 16, a light emitting unit 17, an input unit 18, a display unit 19, a storage unit 20, and a driver 21.
The imaging unit 16 includes an optical lens unit and an image sensor, although not shown.
The optical lens unit is configured by a lens for converging light to photograph an object, for example, a focus lens, a zoom lens, and the like.
The focus lens forms an image of the subject on the light receiving surface of the image sensor. The zoom lens is a lens whose focal length can be varied freely within a certain range.
The imaging unit 16 is also provided with a peripheral circuit for adjusting setting parameters such as focus, exposure, and white balance, as necessary.
The image sensor is configured of a photoelectric conversion element, an AFE (Analog Front End), and the like.
The photoelectric conversion element is formed of, for example, a CMOS (Complementary Metal Oxide Semiconductor) photoelectric conversion element. It receives the subject image from the optical lens unit, photoelectrically converts (captures) it, accumulates the resulting image signal for a certain time, and sequentially supplies the accumulated signal to the AFE as an analog signal.
The AFE performs various signal processes, such as Analog/Digital (A/D) conversion, on the analog image signal. The resulting digital signal is output as the output signal of the imaging unit 16.
The output signal of the image pickup section 16 is appropriately supplied to the CPU11 and the like.
The light emitting unit 17 includes light emitters, such as LEDs (Light Emitting Diodes), corresponding to each color of the RGB color model, and a control circuit that can adjust the color components of the emitted light based on the RGB color model. When the notification device 1 starts up, the light emitting unit 17 adjusts the RGB color components to a predetermined state and emits light to illuminate the user. The predetermined state is a state in which the face of the user reflected in the mirror portion 30 looks natural. Based on an instruction from the CPU 11, the light emitting unit 17 can also suppress, for example, the output of the light emitter corresponding to R, thereby emitting light with a reduced R component.
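As a rough illustration, the emission control just described can be sketched as follows. This is a minimal sketch under stated assumptions: the RgbLed interface and the 0.6 suppression factor are inventions of this sketch, since the text specifies only that the R component is reduced relative to a natural-looking predetermined state.

    class RgbLed:
        """Stand-in for one light emitting unit 17 with per-channel control."""
        def set_rgb(self, r, g, b):
            # Replace with real LED-driver I/O; here the state is just logged.
            print(f"emit R={r:.2f} G={g:.2f} B={b:.2f}")

    def set_natural(leds):
        """The 'predetermined state': natural-looking illumination."""
        for led in leds:
            led.set_rgb(1.0, 1.0, 1.0)

    def suppress_red(leds, factor=0.6):
        """Reduce only the R component, as instructed by the CPU 11."""
        for led in leds:
            led.set_rgb(factor, 1.0, 1.0)

For example, suppress_red([RgbLed()]) would correspond to the makeup-mode notification of step S15 described later, and set_natural to the state restored in step S17.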
The input unit 18 is configured by various buttons and the like, and inputs various information in response to an instruction operation by a user.
The display unit 19 is configured by a liquid crystal display or the like, and displays an image corresponding to the image data output from the CPU 11. The mirror portion 30 is formed of a half mirror, and reflects, for example, the face of the user. The arrangement of the display unit 19 and the mirror unit 30 is as described above with reference to fig. 1.
The storage unit 20 is formed of a semiconductor memory such as a DRAM (Dynamic Random Access Memory) and stores various data.
A removable medium 100, formed of a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted in the drive 21 as appropriate. The removable medium 100 stores various data, such as image data and a program for executing the physical condition notification process described later. The various data, such as programs and image data, read from the removable medium 100 by the drive 21 are installed in the storage unit 20 as necessary.
The notification device 1 may have a hardware configuration other than the above-described hardware configuration. For example, the notification device 1 may include an output unit configured by a lamp, a speaker, a motor for vibration, or the like, and configured to output light, sound, or a vibration signal. For example, the notification device 1 may include a communication unit for performing communication with another device (not shown) via a network including the internet.
[ Functional structure ]
Fig. 4 is a functional block diagram showing a functional configuration for executing the physical condition notification processing among the functional configurations of the notification device 1.
The physical condition notification process is a series of processes in which the notification device 1 performs notification based on physical condition information.
When the physical condition notification process is executed, as shown in fig. 4, the CPU 11 functions as a physical condition information acquisition unit 111, a determination unit 112, a selection unit 113, and a control unit 114.
Further, the determination information storage unit 201 and the notification information storage unit 202 are set in one area of the storage unit 20.
The determination information storage unit 201 stores the information that the determination unit 112 uses to make determinations about the physical condition information. Specifically, it stores the physical condition information of the user acquired during the physical condition notification process and the physical condition information that serves as the criterion for determination.
The notification information storage unit 202 stores information for notification by the light emitting unit 17 and the display unit 19 under the control of the control unit 114. Specifically, control information for controlling light emission of the light emitting unit 17, and character data (text data) or image data for displaying on the display unit 19 are stored. In the following description, both the light emitting unit 17 and the display unit 19 are appropriately referred to as "notification units".
The various data such as the physical condition information and the notification information stored in the determination information storage unit 201 and the notification information storage unit 202 may be stored only in the storage unit 20, but may be appropriately read from the removable medium 100 by the drive 21.
The physical condition information acquisition unit 111 acquires physical condition information, that is, information indicating the user's physical condition. In the present embodiment, the physical condition information acquisition unit 111 analyzes the image captured by the imaging unit 16 and acquires the analysis result as physical condition information. As described above with reference to fig. 1, the imaging unit 16 is disposed at a position where it can capture the face of the user facing the mirror unit 30, so the captured image includes an image of the user's face.
The physical condition information acquisition unit 111 first analyzes the image and identifies the region corresponding to the user's face, for example by extracting the contour of the face or feature points such as the eyes. Next, it analyzes the color of each pixel in the identified face region: for example, it calculates RGB values for each pixel and integrates the R values of the pixels to obtain the user's physical condition information.
That is, in the present embodiment, the redness of the user's face is used as physical condition information indicating the user's physical or mental state. This is because a ruddy face can generally be taken as an unfatigued, healthy complexion, in contrast to a pale face.
In addition, the R values may be weighted when they are integrated. For example, when identifying the region corresponding to the user's face, regions in which redness characteristically tends to appear, such as the cheeks and lips, may additionally be identified, and the R values of those regions may be integrated with a larger weight than the R values of other regions.
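The analysis described above can be pictured with the following minimal sketch. It assumes OpenCV's stock Haar cascade for face detection and approximates the cheek regions geometrically; the weight of 2.0 and the region fractions are illustrative assumptions, not values from the patent.

    import cv2
    import numpy as np

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def redness_score(bgr_image, cheek_weight=2.0):
        """Integrate the R channel over the face region, weighting the cheeks."""
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            return None  # no user in front of the mirror
        x, y, w, h = faces[0]
        face = bgr_image[y:y + h, x:x + w].astype(np.float64)
        r = face[:, :, 2]  # OpenCV stores pixels in BGR order
        weights = np.ones_like(r)
        # Rough cheek band: middle vertical third, outer horizontal quarters.
        weights[h // 2:3 * h // 4, :w // 4] = cheek_weight
        weights[h // 2:3 * h // 4, 3 * w // 4:] = cheek_weight
        return float((r * weights).sum() / weights.sum())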
The physical condition information acquisition unit 111 performs the analysis described above at the start of the physical condition notification process to acquire physical condition information (hereinafter, the "1st physical condition information"). After a predetermined time has elapsed, the physical condition information acquisition unit 111 performs the analysis again to acquire physical condition information once more (hereinafter, the "2nd physical condition information").
The physical condition information acquisition unit 111 stores the physical condition information acquired in this way in the determination information storage unit 201 as determination information. It also stores the average of the history of physical condition information acquired in the past in the determination information storage unit 201 as the physical condition information serving as a reference (hereinafter, the "reference physical condition information").
The determination unit 112 makes determinations based on the physical condition information stored in the determination information storage unit 201. As described above, the present embodiment uses the R value corresponding to the redness of the user's face as the physical condition information, so the determination unit 112 makes its determinations about the user's face based on that R value.
The determination unit 112 first compares the 1st physical condition information with the reference physical condition information, and determines from the comparison whether the user's physical condition at the start of the physical condition notification process is better than the reference.
Specifically, the determination unit 112 determines that the user's physical condition at the start of the physical condition notification process is good when the R value of the 1st physical condition information is higher than that of the reference physical condition information. The R value is used here for the comparison, but the comparison is not limited to this; it may use at least one of the RGB parameters, or image analysis, or biometric information obtained from a biometric sensor.
The determination unit 112 also compares the 1st physical condition information with the 2nd physical condition information, and determines from the comparison whether the physical condition information has improved.
Specifically, the determination unit 112 determines that the user's physical condition information has improved through the physical condition notification process when the R value of the 2nd physical condition information has increased by a predetermined value or more compared with the R value of the 1st physical condition information. Again, the R value is used for this comparison, but the comparison may equally use at least one of the RGB parameters, or image analysis, or biometric information obtained from a biometric sensor.
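The two determinations reduce to simple comparisons on the redness scores. The sketch below reuses the redness_score values from the earlier sketch; IMPROVEMENT_DELTA stands in for the "predetermined value", whose magnitude the text does not specify.

    IMPROVEMENT_DELTA = 5.0  # illustrative threshold, not from the patent

    def reference_score(history):
        """Reference physical condition info: average of past scores."""
        return sum(history) / len(history)

    def condition_is_good(first_score, history):
        """Step S12: is the condition at the start better than the reference?"""
        return first_score > reference_score(history)

    def condition_improved(first_score, second_score):
        """Steps S16/S20: did the score rise by at least the required margin?"""
        return second_score - first_score >= IMPROVEMENT_DELTA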
The determination unit 112 outputs the determination result obtained by such determination to a control unit 114 described later.
The selection unit 113 is a part for selecting a notification unit used for notification in the physical condition notification process. The selection unit 113 selects an appropriate notification unit according to the contents of the notification determined based on the physical condition information, and the like.
The control unit 114 is a part that controls the notification unit selected by the selection unit 113 based on the determination result of the determination unit 112 to perform notification.
The notifications performed through the cooperation of these units are described in detail under [ Operation ] below.
[ Operation ]
Next, the operation of the notification device 1 will be described.
Fig. 5 is a flowchart illustrating the flow of the physical condition notification process executed by the notification device 1. Fig. 6 is a pictorial diagram illustrating transitions of the mirror unit 30 and the notification units (i.e., the light emitting unit 17 and the display unit 19) during the physical condition notification process.
The physical condition notification process starts when the notification device 1 is activated.
In step S11, the physical condition information acquisition unit 111 acquires the 1st physical condition information when a user is detected in front of the mirror unit 30 by analyzing the image captured by the imaging unit 16.
In step S12, the determination unit 112 compares the 1 st physical condition information acquired in step S11 with the reference physical condition information stored in the determination information storage unit 201, and determines whether or not the physical condition of the user at the start of the physical condition notification process is good.
If the user's physical condition is good, the determination at step S12 is yes, and the process ends; since the physical condition is good, there is nothing in particular to notify. In this case, the notification device 1 simply functions as an illuminated mirror, with the light emitting unit 17 providing light and the mirror unit 30 reflecting the user. If, on the other hand, the user's physical condition is determined not to be good, the determination at step S12 is no, and the process proceeds to step S13.
In step S13, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the display unit 19 as the notification unit, and the control unit 114 causes the display unit 19 to display messages or illustrations containing advice about the physical condition, such as "Your complexion does not look good." or "You look tired.". In this case, a message prompting the user to switch to the makeup mode described later, such as "Would you like to apply makeup?", is also displayed on the display unit 19.
Fig. 6(A) shows the states of the mirror unit 30 and the notification units (i.e., the light emitting unit 17 and the display unit 19) when the notification in step S13 is performed. In this case, the light emitting unit 17 emits light with the RGB color components adjusted to the predetermined state set at startup of the notification device 1. The mirror portion 30 reflects the face of the user illuminated by this light, and the display unit 19 displays the messages described above.
The user can recognize that his/her physical condition is not good by receiving such notification.
In step S14, the control unit 114 determines which mode is currently set. In the present embodiment there are two modes, a "makeup mode" and a "care mode", and the control unit 114 sets one of them based on the user's operation of the input unit 18a or the input unit 18b shown in fig. 1. The setting may be made in advance, before the physical condition notification process, or during the process; for example, the mode may be set by a user operation made in response to the notification of step S13.
When the user has selected the makeup mode, the determination at step S14 is "makeup mode", and the process proceeds to step S15. When the user has selected the care mode, the determination at step S14 is "care mode", and the process proceeds to step S18.
When the makeup mode is selected, in step S15 the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the light emitting unit 17 as the notification unit, and the control unit 114 controls the light emitting unit 17 so as to intentionally suppress its R component. This emphasizes, for the user, the lack of redness and the poor complexion; that is, the control unit 114 emphasizes the physical condition information acquired by the physical condition information acquisition unit 111, and the redness of the face reflected in the mirror portion 30 is likewise emphasized for the user. Further, the selection unit 113 selects the display unit 19 as a notification unit, and the control unit 114 notifies the user by displaying an image of an example of makeup, together with the color samples used in the example, on the display unit 19.
Fig. 6(B-1) shows the states of the mirror unit 30 and the notification unit (i.e., the light emitting unit 17 and the display unit 19) when the notification in step S15 is performed. In this case, the light emitting unit 17 emits light with a reduced R component. The mirror unit 30 reflects the face of the user illuminated by the light emission with the R component suppressed. Further, the image of the makeup example and the color sample used in the example are displayed on the display unit 19.
By receiving such notification, the user can apply appropriate makeup. In particular, since the poor complexion of the face is emphasized, makeup can be applied so as to compensate for it. In addition, for example, when analysis of the image captured by the imaging unit 16 detects that the user has started applying makeup, the light emission may be switched to natural light emission in which the suppression of the R component is removed. The user can then apply makeup while accurately grasping the progress and degree of the makeup under natural light.
In step S16, the determination unit 112 compares the 1st physical condition information with the 2nd physical condition information and determines from the comparison whether the physical condition information has improved. Here, the 2nd physical condition information is the physical condition information of the user after the user, following the notification in step S15, has added redness to the face by applying red-based cosmetics (for example, a red-based lipstick, blusher, or the like).
If the R value of the 2nd physical condition information has not increased by the predetermined value or more compared with the R value of the 1st physical condition information, the physical condition information has not improved: the determination at step S16 is no, and step S16 is repeated. This means that the user's makeup is not yet sufficient.
If, on the other hand, the R value of the 2nd physical condition information has increased by the predetermined value or more compared with the R value of the 1st physical condition information, the physical condition information has improved: the determination at step S16 is yes, and the process proceeds to step S17. This indicates that the user has applied makeup appropriately.
Note that the determination unit 112 obtains information on a face illuminated with the R-suppressed emission. The determination unit 112 may therefore compare the 1st and 2nd physical condition information after removing, by post-processing, the influence of that emission on the facial color tone; alternatively, while the face image used for the comparison is being captured, the light emission may be stopped, or the face may be illuminated with natural light in which the suppression of the R component is removed. Such processing keeps the reduced red component of the emission from influencing the comparison made by the determination unit 112.
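One simple post-processing possibility, offered here purely as an assumption (the patent does not specify the correction method), is to divide each channel by the known emission gain before computing the redness score:

    import numpy as np

    def cancel_emission(bgr_image, emission=(0.6, 1.0, 1.0)):
        """Divide out the (r, g, b) emission gains used during capture."""
        r_gain, g_gain, b_gain = emission
        out = bgr_image.astype(np.float64)
        out[:, :, 2] /= r_gain  # undo the suppressed R component (BGR order)
        out[:, :, 1] /= g_gain
        out[:, :, 0] /= b_gain
        return np.clip(out, 0, 255).astype(np.uint8)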
In step S17, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the light emitting unit 17 as the notification unit, and the control unit 114 controls the light emitting unit 17 so as to return its light emission to the same predetermined state as at startup of the notification device 1, that is, the state in which the face of the user reflected in the mirror unit 30 looks natural. The selection unit 113 also selects the display unit 19 as a notification unit, and the control unit 114 notifies the user by displaying, as characters or illustrations, a message containing advice about the physical condition, such as "Makeup has improved your complexion.".
Fig. 6(B-2) shows the states of the mirror unit 30 and the notification unit (i.e., the light emitting unit 17 and the display unit 19) when the notification in step S17 is performed. In this case, the light emitting unit 17 returns from the state where the R component is intentionally suppressed to the state where the user's face reflected on the mirror unit 30 can be naturally seen. The message as described above is further displayed on the display unit 19.
By checking the message, the user can see that the complexion has improved. The user can also check the made-up face while the face reflected in the mirror portion 30 looks natural. The physical condition notification process then ends.
On the other hand, when the care mode is selected, the selection unit 113 and the control unit 114 cooperate with each other to notify the user at step S18. In this notification, for example, the selection unit 113 selects the display unit 19 as a notification unit. Then, the control unit 114 notifies the user by displaying a massage method for improving blood circulation on the display unit 19, for example.
Fig. 6(C-1) shows the states of the mirror unit 30 and the notification unit (i.e., the light emitting unit 17 and the display unit 19) when the notification in step S18 is performed. In this case, a massage method for improving blood circulation is displayed on the display unit 19.
By receiving such notification, the user can perform a massage for improving blood circulation.
In step S19, the determination unit 112 determines whether a predetermined time has elapsed since the notification of step S18. The predetermined time is set long enough for the massage to improve blood circulation. If the predetermined time has not elapsed, the determination at step S19 is no, and step S19 is repeated. Once the predetermined time has elapsed, the determination at step S19 is yes, and the process proceeds to step S20.
In step S20, the determination unit 112 compares the 1 st physical condition information and the 2 nd physical condition information, and determines whether or not the physical condition information is improved based on the comparison result. The 2 nd physical condition information is the physical condition information of the user who has performed a massage for improving blood circulation.
If the R value of the 2nd physical condition information has increased by the predetermined value or more compared with the R value of the 1st physical condition information, the physical condition information has improved: the determination at step S20 is yes, and the process proceeds to step S21. This indicates that the massage has improved the user's blood circulation and a healthy flush has appeared on the face.
In step S21, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the display unit 19 as the notification unit, and the control unit 114 notifies the user by displaying, as characters or illustrations, a message such as "The massage has improved your complexion.".
Fig. 6(C-2) shows the states of the mirror unit 30 and the notification units (i.e., the light emitting unit 17 and the display unit 19) when the notification in step S21 is performed. In this case, the message described above is displayed on the display unit 19.
By checking the message, the user can see that the complexion has improved. The physical condition notification process then ends.
On the other hand, if the R value of the 2nd physical condition information has not increased by the predetermined value or more compared with the R value of the 1st physical condition information, the physical condition information has not improved: the determination at step S20 is no, and the process proceeds to step S22. This indicates that blood circulation has not recovered and the face has not become ruddy even though the massage was performed for a sufficient time. In the makeup mode, continuing to apply makeup can be expected to improve the physical condition information eventually (that is, to raise the R value by the predetermined amount or more), so the determination of step S16 is repeated until it does. In the care mode, however, if the physical condition information does not improve even after sufficient massage, the user's current physical condition is poor, and further massage is unlikely to help. For this reason the determination is not repeated as in step S16; the process proceeds to step S22 instead.
In step S22, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the display unit 19 as the notification unit, and the control unit 114 notifies the user by displaying, as characters or illustrations, a message such as "Your complexion has not improved despite the massage.". By checking the message, the user can see that the complexion has not improved, suspect a condition such as anemia or a gastric ulcer, and consider visiting a hospital. The physical condition notification process then ends.
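Tying steps S11 to S22 together, the overall flow can be compressed into the following illustrative sketch. MirrorStub is a placeholder for the real camera/LED/display hardware, and all thresholds, timings, and messages are assumptions made for the sketch, not values from the patent.

    import time

    IMPROVEMENT_DELTA = 5.0  # stand-in for the "predetermined value"
    MASSAGE_SECONDS = 5      # stand-in for the "given time" of S19, shortened here

    class MirrorStub:
        """Placeholder device; redness rises on each reading so the demo ends."""
        def __init__(self):
            self._r = 100.0
        def redness(self):                 # would call the analysis sketched above
            self._r += 2.0
            return self._r
        def reference(self):               # average of past scores
            return 110.0
        def set_emission(self, r=1.0):     # cf. the RgbLed sketch
            pass
        def show(self, message):
            print(message)

    def run_notification(mirror, mode):
        first = mirror.redness()                                       # S11
        if first > mirror.reference():                                 # S12
            return  # condition already good; stay a plain lighted mirror
        mirror.show("Your complexion does not look good.")             # S13
        if mode == "makeup":                                           # S14
            mirror.set_emission(r=0.6)                                 # S15
            mirror.show("Example makeup and colour samples")
            while mirror.redness() - first < IMPROVEMENT_DELTA:        # S16
                time.sleep(1)              # wait while the user applies makeup
            mirror.set_emission(r=1.0)                                 # S17
            mirror.show("Makeup has improved your complexion.")
        else:                                                          # care mode
            mirror.show("Massage guide for better blood circulation")  # S18
            time.sleep(MASSAGE_SECONDS)                                # S19
            if mirror.redness() - first >= IMPROVEMENT_DELTA:          # S20
                mirror.show("The massage has improved your complexion.")  # S21
            else:
                mirror.show("Your complexion has not improved.")       # S22

In the stub, the score improves slightly on each reading purely so that the demonstration terminates; a real device would instead score fresh camera frames on each pass.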
Through the processing described above, the notification device 1 notifies the user of support information for improving the physical condition information, and the user can thereby grasp a method for improving it.
The notification device 1 configured as described above or the notification device 1 configured as a modification example described below includes a physical condition information acquisition unit 111 or a makeup information acquisition unit 115, a light emitting unit 17 and a display unit 19 as notification units, a control unit 114, and a determination unit 112.
The physical condition information acquisition unit 111 or the makeup information acquisition unit 115 acquires face information.
The light emitting unit 17 and the display unit 19, which are notification units, perform notification based on the face information.
The determination unit 112 determines a change between the face information before the notification and the face information after the notification.
The control unit 114 controls notification of the light emitting unit 17 and the display unit 19 as notification units based on the determination result of the determination unit 112.
Thus, the notification device 1 can notify the user of the change of the user himself based on the acquired face information. Therefore, the user can grasp the change in the user's own state (e.g., a change in physical condition information or a change in makeup). That is, according to the notification device 1, the user can recognize the change in the state of the user more appropriately.
The notification device 1 includes plural types of notification units, namely the light emitting unit 17 and the display unit 19, and a selection unit 113.
The selection unit 113 selects which type of notification unit, the light emitting unit 17 or the display unit 19, to use.
The control unit 114 controls the light emitting unit 17 and the display unit 19 selected by the selection unit 113 as notification units.
Thus, the notification device 1 can perform notification by the selected notification unit. Therefore, the notification device 1 can perform notification in accordance with the mode set by the user, for example.
The determination unit 112 calculates a difference between the face information before notification and the face information after notification.
The control unit 114 causes the light emitting unit 17 and the display unit 19, as the notification units, to notify the difference calculated by the determination unit 112.
Thus, the notification device 1 can notify the user of the difference in face information, which is information indicating the change of the user himself/herself based on the acquired face information. Therefore, the user can grasp the difference of the face information, which is information indicating the change of the user himself.
The control unit 114 causes the light emitting unit 17 and the display unit 19, as notification units, to perform notification in such a way that the face information is emphasized.
This enables the user to more appropriately grasp the state of the user himself, which is usually difficult to perceive.
The control unit 114 causes the light emitting unit 17 and the display unit 19, which are notification units, to notify information prompting a predetermined action corresponding to the face information.
Thus, the user can perform a predetermined action according to the face information, and accordingly, the physical condition can be improved or makeup can be performed appropriately.
The physical condition information acquisition unit 111 or the makeup information acquisition unit 115 acquires, as face information, an image of the subject from which the face information is to be obtained, and the determination unit 112 determines the change between the face information before notification and the face information after notification based on that subject image.
Thus, the notification device 1 can determine whether or not the physical condition information is improved and whether or not makeup is appropriately performed based on the face image of the user or the like. For example, the determination can be made based on a change in the face color determined by analyzing the face image.
The light emitting unit 17 and the display unit 19 as notification units give notification by emitting light, and the control unit 114 controls the light emission of the light emitting unit 17 and the display unit 19 as notification units based on the determination result of the determination unit 112.
The notification device 1 can thereby perform notification by light emission. For example, by reducing the R component among the RGB components of the emitted light, it can give a notification that emphasizes the user's complexion.
The light emitting unit 17 and the display unit 19 as notification units notify characters, and the control unit 114 controls the characters notified by the light emitting unit 17 and the display unit 19 as notification units based on the determination result of the determination unit 112.
This enables the notification device 1 to notify the user of a message sentence using characters.
The light emitting unit 17 and the display unit 19 as notification units notify the user of the character, and the control unit 114 controls the display position of the character notified by the light emitting unit 17 and the display unit 19 as notification units based on the determination result of the determination unit 112.
Thus, the notification device 1 can perform notification while controlling the display position of a message sentence or the like using characters.
The notification device 1 includes a display unit 19 or a mirror unit 30 for displaying an object to be acquired of face information.
Thus, the notification device 1 allows the user to visually recognize the face of the user through the image display unit such as a mirror or a display.
The display surface of the image display unit and the display surface of the display unit 19, serving as the notification unit, are arranged so as to be superimposed in the viewing direction and can therefore be viewed simultaneously.
This allows the user to visually recognize the face of the user reflected by the display unit and the like and the information to be notified at the same time.
The physical condition information acquisition unit 111 or the makeup information acquisition unit 115 acquires information related to a physical condition or information related to makeup as face information.
This makes it possible to determine a change related to the physical condition or a change related to makeup before and after notification, and accordingly to give a notification based on such a change.
The light emitting unit 17 and the display unit 19, which are notification units, notify the support information or the evaluation information based on the face information.
This allows the user to receive notification of support information (for example, advice for improving physical conditions) and evaluation information (for example, evaluation related to makeup) based on his/her own face information.
[ Effects ]
Next, effects achieved by the above-described embodiment will be described.
As a premise: a person engaged in job hunting or in a customer-service business, for example, does not want others to see a tired or unhealthy-looking face. Such people therefore cover a tired or unhealthy face with makeup while checking their own face in a mirror at home or elsewhere.
However, apart from makeup professionals and those who teach makeup methods, people do not know how to cover a tired face. Especially when in poor physical condition or short of sleep, one has no room to think about how to cover it. Moreover, under the chronic fatigue that accumulates day after day through continuous work, housework, child care, nursing care, and so on, many people do not notice at all that their own face looks tired (that their complexion is poor) and are only made aware of it by others. If the user does not recognize that the face looks tired, it may lead to a vicious circle in job hunting, decisive occasions, or business negotiations, and may even become a cause of illness that harms health.
Therefore, the user can obtain the following effects by using the above-described embodiment.
When chronically fatigued, people find it hard to notice their own poor complexion or tired expression, and illness may also lurk behind chronic exhaustion. According to the present embodiment, however, the user is prompted to apply makeup or massage so that the complexion looks healthy, which not only improves the appearance but also helps the user become aware of these conditions.
People in job hunting or in the customer-service industry may also fail if their first impression appears unhealthy. According to the present embodiment, even if a tired face appears in the washstand mirror on getting up, the mirror proposes on the spot how to make the face look lively, so that makeup giving a good impression can be applied without hesitation even on a busy morning.
Further, according to the present embodiment, the user can learn methods of applying makeup and of massage while facing the mirror, without any special research.
Further, according to the present embodiment, illness is suspected when the complexion does not improve even with massage, so poor physical condition can be noticed before a serious illness develops.
[ Modified examples ]
The present invention is not limited to the above-described embodiments, and modifications, improvements, and the like within a range that can achieve the object of the present invention are also included in the present invention.
[ Modification using past physical condition information ]
For example, in the above-described embodiment, the determination unit 112 performs the determinations of steps S16 and S20 after the notification of step S13, comparing the 1st physical condition information with the 2nd physical condition information and determining from the comparison whether the physical condition information has improved.
In a modification, the determination unit 112 may instead, in steps S16 and S20 after the notification of step S13, compare physical condition information acquired in the past (hereinafter, "past physical condition information") with the 2nd physical condition information. Here, the past physical condition information is physical condition information that the physical condition information acquisition unit 111 acquired by analyzing images captured by the imaging unit 16 before the notification of step S13.
Specifically, for example, the physical condition information acquisition unit 111 stores the 1st and 2nd physical condition information acquired in the previous physical condition notification process in the determination information storage unit 201, and in the current physical condition notification process the determination unit 112 uses them as the past physical condition information. In this way, the determination unit 112 can compare, for example, the previous day's physical condition information (past physical condition information) with the current physical condition information (2nd physical condition information).
In this comparison, whether the physical condition information has improved may be determined from the comparison result as in the embodiment above, or instead the presence and degree of change may be determined. In the latter case, for example, the determination unit 112 calculates the difference between the past physical condition information and the 2nd physical condition information, for example the difference in R values.
After the comparison is completed, a message indicating the difference may be notified to the user by the selection unit 113 and the control unit 114 in cooperation, instead of the notification in step S17, step S21, and step S22.
In this notification, for example, the selection unit 113 selects the display unit 19 as a notification unit. Then, the control unit 114 displays a character or an illustration on the display unit 19, for example, to notify the user that "the face is not good compared to yesterday". "the complexion is more mental than yesterday. "etc. represent messages of difference.
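The comparison and message selection described above could look like the following minimal sketch, assuming the physical condition information is the mean R value of the detected face region; the function names and the threshold are illustrative, not taken from the specification.

```python
import numpy as np

IMPROVE_THRESHOLD = 5.0  # illustrative threshold on the 0-255 R scale


def mean_r_value(face_pixels: np.ndarray) -> float:
    """Physical condition information: mean R value of the face region (H x W x RGB)."""
    return float(face_pixels[..., 0].mean())


def difference_message(past_r: float, current_r: float) -> str:
    """Select a message describing the change from the past physical condition information."""
    diff = current_r - past_r
    if diff >= IMPROVE_THRESHOLD:
        return "Your complexion is brighter than yesterday."
    if diff <= -IMPROVE_THRESHOLD:
        return "Your complexion is worse than yesterday."
    return "Your complexion is about the same as yesterday."
```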
In this way, by comparing with the past physical condition information and reporting the difference from it, the user can more appropriately recognize changes in his or her own physical condition.
The example in which the determination unit 112 calculates and reports the difference in R values in order to compare the past physical condition information with the 2nd physical condition information has been described, but the present invention is not limited to this. The determination unit 112 may calculate the difference in at least one of the RGB values, or may calculate the difference in biological information obtained by image analysis or from a biosensor, and the control unit 114 may notify the user of that difference. Furthermore, the past physical condition information may be physical condition information acquired at a single point in time in the past, or may be the average of a plurality of pieces of physical condition information acquired at a plurality of past points in time.
[Modification using other physical condition information]
In the above-described embodiment, the physical condition notification process is performed using the R value of the user's face region as the physical condition information. This may be modified so that the process uses other physical condition information. For example, the physical condition information acquisition unit 111 may acquire the user's body temperature by using an infrared filter or the like in the imaging unit 16, and the body temperature may then be used as the physical condition information.
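As an illustration of this variant, a body-temperature reading might be derived from a thermal frame as in the sketch below; it assumes the imaging unit 16 can supply a calibrated thermal image of the face region, and both the percentile and the fever threshold are assumptions.

```python
import numpy as np


def body_temperature(thermal_face: np.ndarray) -> float:
    """Estimate body temperature (deg C) from a calibrated thermal image of the face.

    The upper percentile is used rather than the mean so that hair and
    background pixels inside the face box do not drag the estimate down.
    """
    return float(np.percentile(thermal_face, 95))


def is_feverish(temp_c: float, threshold_c: float = 37.5) -> bool:
    """Treat temperatures above the (illustrative) threshold as poor physical condition."""
    return temp_c > threshold_c
```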
[Modification using a different notification method]
In the above-described embodiment, when a given message is to be notified, the selection unit 113 selects the display unit 19 and the control unit 114 displays the message on the display unit 19. This may be modified so that the information is reported by another method. For example, the selection unit 113 may select a speaker (not shown), and the control unit 114 may then output a sound corresponding to the given message from the speaker.
[Modification in which the light emitting unit is used for other purposes]
Alternatively, the light emitting unit 17 may be used for other purposes, for example for influencing the user's blood pressure.
In this case, the physical condition information acquisition unit 111 acquires the user's blood pressure information by, for example, communicating with an external blood pressure measurement device. The control unit 114 then controls the light emission of the light emitting unit 17 in accordance with the acquired blood pressure information. Specifically, when the acquired blood pressure value is high, the control unit 114 controls the light emitting unit 17 to increase the B component so that it emits bluish light, because viewing blue light is said to have a blood-pressure-lowering effect. Conversely, when the acquired blood pressure value is low, the control unit 114 controls the light emitting unit 17 to increase the R component so that it emits reddish light, because viewing red light is said to raise blood pressure.
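A minimal sketch of this light emission control follows, assuming the blood pressure information is a systolic reading in mmHg; the thresholds and RGB values are illustrative, not from the specification.

```python
def led_color_for_blood_pressure(systolic_mmhg: float) -> tuple[int, int, int]:
    """Map a systolic blood pressure reading to an RGB emission color.

    Illustrative thresholds: above 135 mmHg emphasise the B component,
    below 100 mmHg emphasise the R component, otherwise neutral white.
    """
    if systolic_mmhg > 135:
        return (80, 80, 255)   # bluish light for high blood pressure
    if systolic_mmhg < 100:
        return (255, 80, 80)   # reddish light for low blood pressure
    return (255, 255, 255)     # neutral white otherwise
```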
[Modification applied to evaluation of makeup]
In the above-described embodiment, the notification device 1 notifies the user of support information for improving the acquired physical condition information. The notification device 1 may, however, notify the user of other information. For example, instead of support information for improving the physical condition information, the notification device 1 may notify evaluation information including an evaluation of the makeup performed by the user. In the present modification, therefore, the notification device 1 performs a "makeup notification process" instead of the "physical condition notification process" of the above-described embodiment. The functional configuration and operation for performing the makeup notification process are described below with reference to fig. 7 and fig. 8.
(Functional configuration)
Fig. 7 is a functional block diagram showing, among the functional configurations of the notification device 1, the functional configuration for executing the makeup notification process.
The makeup notification process is a series of processes in which the notification device 1 notifies evaluation information including an evaluation of the makeup performed by the user.
When the makeup notification process is executed, the CPU 11 functions as the makeup information acquisition unit 115, the determination unit 112, the selection unit 113, and the control unit 114, as shown in fig. 7. The difference from the physical condition notification process of the above-described embodiment is that the physical condition information acquisition unit 111 is replaced by the makeup information acquisition unit 115.
In addition, the determination information storage unit 201 and the notification information storage unit 202 are set in one area of the storage unit 20.
The determination information storage unit 201 stores information used by the determination unit 112 to make determinations on makeup information, which indicates the state of makeup. Specifically, it stores makeup information serving as the determination criterion in the makeup notification process (hereinafter referred to as the "makeup information target value") and the user's makeup information acquired during the makeup notification process (hereinafter referred to as the "measured makeup information value").
Here, as each piece of makeup information such as the makeup information target value and the measured makeup information value, information indicating the color of each pixel in each region corresponding to a part of the face (hereinafter referred to as an "evaluation target region") can be used, for example.
For example, regions corresponding to organs or parts to be made up, such as the cheeks, lips, eyes, eyelashes, and T-zone, can be set as evaluation target regions. A plurality of evaluation target regions may be set, or only one, depending on the user's settings, the content of the makeup, and the like.
In addition, for example, the RGB values of each pixel in these evaluation target regions can be used as makeup information. In the above-described embodiment, the R value, which indicates the redness of the user's face, among the RGB values of each pixel was used as the user's physical condition information.
Makeup, on the other hand, is also performed with cosmetics of colors other than red, such as yellow or black. Therefore, not only the R value but also the G and B values may be used in each piece of makeup information such as the makeup information target value and the measured makeup information value.
This is, however, only an example; at least one of the RGB values may be used as makeup information, and other information obtained by image analysis or the like may also be used. One possible representation of such per-region makeup information is sketched below.
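The following is a minimal sketch of such a representation; the class and field names are hypothetical, not taken from the specification.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class RegionMakeupInfo:
    """Makeup information for one evaluation target region (e.g. cheeks, lips, T-zone)."""
    region_name: str
    rgb: np.ndarray  # pixel colors of the region, shape (N, 3)

    def mean_rgb(self) -> np.ndarray:
        """Per-channel mean used when comparing target and measured values."""
        return self.rgb.mean(axis=0)
```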
These pieces of makeup information are acquired by the makeup information acquisition unit 115 described later. The makeup information acquisition unit 115 acquires the makeup information target value by, for example, extracting it from an image of a person serving as a template, such as a model. Alternatively, the makeup information acquisition unit 115 may apply a given process, such as a skin-beautifying process, to an image of the user of the notification device 1 and extract the makeup information target value from the processed image.
The makeup information acquisition unit 115 also analyzes the image captured by the imaging unit 16 and acquires the measured makeup information value based on the analysis result.
The makeup information acquisition unit 115 stores each piece of acquired makeup information in the determination information storage unit 201.
The makeup information target value may be acquired by the makeup information acquisition unit 115 each time the makeup notification process is executed, or may be acquired beforehand. When it is acquired beforehand, the makeup information target value is stored in the determination information storage unit 201 in the form of, for example, a database provided for each user, and the makeup information acquisition unit 115 reads and uses the stored value each time the makeup notification process is executed.
The notification information storage unit 202 stores information used for notifications by the light emitting unit 17 or the display unit 19 under the control of the control unit 114. Specifically, it stores control information for controlling the light emission of the light emitting unit 17, and character data (text data) and image data for display on the display unit 19.
The various data such as the makeup information stored in the determination information storage unit 201 and the notification information stored in the notification information storage unit 202 may be stored only in the storage unit 20, or may be read as appropriate from the removable medium 100 by the drive 21.
The makeup information acquisition unit 115 is the part that acquires each piece of makeup information, such as the makeup information target value and the measured makeup information value, as described above in the explanation of the determination information storage unit 201.
As described above, the makeup information acquisition unit 115 acquires the makeup information target value by, for example, extraction from an image of a person serving as a template, such as a model.
The makeup information acquisition unit 115 also analyzes the image captured by the imaging unit 16 and acquires the measured makeup information value based on the analysis result. Here, as described above with reference to fig. 1, the imaging unit 16 is disposed at a position where it captures the face of the user facing the mirror unit 30. The makeup information acquisition unit 115 therefore analyzes the captured image of the user's face and acquires the measured makeup information value based on the analysis result.
The makeup information acquisition unit 115 then stores each piece of makeup information acquired in this way in the determination information storage unit 201 as determination information.
The determination unit 112 is the part that determines whether makeup has been performed appropriately, based on each piece of makeup information stored in the determination information storage unit 201.
Here, in the present modification, the determination uses the measured makeup information value measured by the makeup information acquisition unit 115 at a 1st time point (hereinafter referred to as the "1st measured makeup information value") and the measured makeup information value measured at a 2nd time point after the 1st time point (hereinafter referred to as the "2nd measured makeup information value").
To this end, the determination unit 112 calculates the difference between the makeup information target value and each measured makeup information value. The determination unit 112 then compares the difference between the makeup information target value and the 1st measured makeup information value (hereinafter referred to as the "1st difference") with the difference between the makeup information target value and the 2nd measured makeup information value (hereinafter referred to as the "2nd difference"). When the 2nd difference has changed from the 1st difference, the determination unit 112 determines that the measured makeup information value has changed. In this way, a change in the user's measured makeup information value can be determined.
In this case, for example, when the 2nd difference has become smaller than the 1st difference, the determination unit 112 determines that the measured value has moved closer to the makeup information target value. The determination unit 112 can thus determine that, at the 2nd time point, makeup was being performed appropriately compared with the 1st time point.
Conversely, when the 2nd difference has become larger than the 1st difference, the determination unit 112 determines that the measured value has moved further away from the makeup information target value. The determination unit 112 can thus determine that, at the 2nd time point, makeup was not being performed appropriately (i.e., inappropriate makeup was being performed) compared with the 1st time point.
In these cases, by taking, for example, a time before a notification as the 1st time point and a time after the notification as the 2nd time point, the determination unit 112 can determine the change in the user's measured makeup information value before and after the notification.
In the present modification, the determination unit 112 also makes a determination using the makeup information target value and each measured makeup information value directly. To this end, the determination unit 112 compares the difference between the makeup information target value and a measured makeup information value (for example, the 2nd measured makeup information value, i.e., the most recent one) with a given threshold value. The determination unit 112 then determines that the makeup target has been achieved when this difference becomes smaller than the given threshold value.
Here, the case where the difference between the makeup information target value and the measured makeup information value is smaller than the given threshold value corresponds to the case where the user, by applying makeup appropriately, has completed makeup close to the target. When there are a plurality of evaluation target regions, the determination unit 112 makes the above determination for each of them. A sketch of this determination logic is given below.
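The following sketch operates on per-region mean RGB values, such as those returned by the illustrative RegionMakeupInfo above; the distance metric (mean absolute difference) and the threshold are assumptions, since the specification does not fix them.

```python
import numpy as np

TARGET_THRESHOLD = 10.0  # illustrative threshold on the 0-255 RGB scale


def rgb_difference(target: np.ndarray, measured: np.ndarray) -> float:
    """Difference between target and measured makeup information (mean |dR|, |dG|, |dB|)."""
    return float(np.abs(target - measured).mean())


def judge_change(diff_1st: float, diff_2nd: float) -> str:
    """Compare the 1st and 2nd differences, as the determination unit 112 does."""
    if diff_2nd < diff_1st:
        return "closer_to_target"      # makeup is being applied appropriately
    if diff_2nd > diff_1st:
        return "further_from_target"   # inappropriate makeup
    return "no_change"


def target_achieved(diff_2nd: float, threshold: float = TARGET_THRESHOLD) -> bool:
    """The target is achieved when the latest difference falls below the threshold."""
    return diff_2nd < threshold
```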
The determination unit 112 outputs the results of these determinations to the selection unit 113 and the control unit 114 described below, which can then control notifications based on those results.
The selection unit 113 is the part that selects the notification unit used for notification in the makeup notification process. The selection unit 113 selects an appropriate notification unit according to the content of the notification determined from the determination result of the determination unit 112, and the like.
The control unit 114 is the part that causes the notification unit selected by the selection unit 113 to make a notification based on the determination result of the determination unit 112.
The details of the notifications performed by the cooperation of these units are described in the following explanation of the operation.
(Operation)
Next, the operation of the notification device 1 according to the present modification will be described. Fig. 8 is a flowchart illustrating the flow of the makeup notification process executed by the notification device 1.
The makeup notification process starts, for example, together with the start-up of the notification device 1.
In step S31, the makeup information acquisition unit 115 analyzes the image captured by the imaging unit 16 to detect the user facing the mirror unit 30.
In step S32, the makeup information acquisition unit 115 detects the evaluation target regions by performing face tracking on the user facing the mirror unit 30. Face tracking can be achieved by identifying, through image analysis, the user's contour and the user's feature points (for example, feature points indicating the positions of facial organs).
In step S33, the makeup information acquisition unit 115 acquires the makeup information target value associated with each evaluation target region detected in step S32.
In step S34, the makeup information acquisition unit 115 acquires the measured makeup information value associated with each evaluation target region detected in step S32.
In step S35, the determination unit 112 calculates the difference between the makeup information target value acquired in step S33 and the measured makeup information value acquired in step S34. This difference corresponds to the 2nd difference described above; the difference calculated in the previous execution of step S35 corresponds to the 1st difference.
In step S36, the determination unit 112 determines whether the measured makeup information value has changed, based on these differences. Specifically, the determination unit 112 compares the 1st difference with the 2nd difference as described above, and determines that the measured makeup information value has changed when the 2nd difference differs from the 1st difference. In that case, as also described above, it further determines whether the 2nd difference has become smaller or larger than the 1st difference.
When the measured makeup information value has changed (in either direction), the determination in step S36 is yes and the process proceeds to step S37. When it has not changed, or has hardly changed, the determination in step S36 is no, the process returns to step S34, the current 2nd difference is treated as the new 1st difference, and the measured makeup information value is measured again.
Since no 1st difference exists the first time this processing is performed, and only the 2nd difference calculated this time exists, in that case too the determination in step S36 is no, the process returns to step S34, the current 2nd difference is treated as the new 1st difference, and the measured makeup information value is measured again.
In step S37, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the display unit 19 as the notification unit, and the control unit 114 notifies the user by displaying on the display unit 19, for example, a message or an illustration indicating the content determined in step S36 to have changed and the evaluation target region for which the change was determined.
For example, when the 2nd difference has become smaller than the 1st difference, the control unit 114 can further display on the display unit 19 a message containing a positive evaluation of the user's makeup, such as "Your makeup is being applied well!". Conversely, when the 2nd difference has become larger than the 1st difference, the control unit 114 can further display on the display unit 19 a cautionary evaluation such as "Careful not to overdo it!".
The control unit 114 may also notify the user of the value of the 1st or 2nd difference itself, or of the amount of change between the 1st and 2nd differences, as a makeup achievement value. These values may, for example, be reported to the user as an achievement rate.
In this case, the user may also be notified of which of the RGB values has a particularly large (or small) difference from the makeup information target value. For example, when the evaluation target region is the user's lips and the difference in the R value is large, the display unit 19 may display a message containing an evaluation that the makeup is not yet sufficient and a way to improve it, such as "The lips are not red enough. Please apply a little more red lipstick.".
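As an illustration, an achievement rate could be derived from the differences as follows; the formula is an assumption, since the specification only states that the values may be reported as an achievement rate.

```python
def achievement_rate(initial_diff: float, current_diff: float) -> float:
    """Fraction of the initial gap to the target that has been closed, in percent."""
    if initial_diff <= 0.0:
        return 100.0  # already at the target when makeup started
    closed = max(0.0, initial_diff - current_diff)
    return min(100.0, 100.0 * closed / initial_diff)
```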
In step S38, the determination unit 112 determines whether there is an evaluation target region that has reached the target (i.e., an evaluation target region where makeup has been completed appropriately) by comparing the difference between the makeup information target value and the 2nd measured makeup information value (i.e., the 2nd difference) calculated in step S35 with a given threshold value. If there is an evaluation target region that has reached the target, the determination in step S38 is yes and the process proceeds to step S39. If there is no such region, the determination in step S38 is no, the process returns to step S34, the current 2nd difference is treated as the new 1st difference, and the measured makeup information value is measured again.
In step S39, the selection unit 113 and the control unit 114 cooperate to notify the user. In this notification, for example, the selection unit 113 selects the display unit 19 as the notification unit, and the control unit 114 notifies the user by displaying on the display unit 19 a message or an illustration indicating the evaluation target region determined in step S38 to have reached the target. In this case, the display unit 19 can further display a message containing an evaluation of the user's makeup, such as "Your makeup has been applied well!".
In step S40, the determination unit 112 determines whether the notification of step S39 has been made for all evaluation target regions (i.e., whether all evaluation target regions have reached the target).
When the notification of step S39 has been made for all evaluation target regions, the determination in step S40 is yes and the process ends. Otherwise, the determination in step S40 is no, the process returns to step S34, the current 2nd difference is treated as the new 1st difference, and the measured makeup information value is measured again. A sketch of the overall loop is given below.
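Putting steps S34 to S40 together, the processing loop might look like the following sketch, which reuses the illustrative rgb_difference helper from the earlier sketch; the measure and notify callables stand in for the imaging unit 16 and the selection unit 113 / control unit 114, and are not APIs from the specification.

```python
def makeup_notification_loop(regions, targets, measure, notify,
                             threshold: float = 10.0) -> None:
    """Illustrative loop over steps S34 to S40 of the makeup notification process.

    regions: names of the evaluation target regions
    targets: mapping region -> target mean RGB value (step S33)
    measure: callable(region) -> current measured mean RGB value (step S34)
    notify:  callable(message) standing in for the selection/control units
    """
    prev_diff = {r: None for r in regions}  # per-region 1st difference
    achieved = set()
    while achieved != set(regions):         # step S40: loop until all regions are done
        for r in set(regions) - achieved:
            diff = rgb_difference(targets[r], measure(r))          # step S35 (2nd difference)
            if prev_diff[r] is not None and diff != prev_diff[r]:  # step S36
                closer = diff < prev_diff[r]
                notify(f"{r}: {'closer to' if closer else 'further from'} the target")  # step S37
            if diff < threshold:                                   # step S38
                notify(f"{r}: makeup completed appropriately")     # step S39
                achieved.add(r)
            prev_diff[r] = diff  # the current 2nd difference becomes the new 1st
```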
Through the makeup notification process described above, the notification device 1 can notify the user of evaluation information including an evaluation of the makeup he or she has performed. The user can thereby grasp whether his or her own makeup is appropriate.
In the makeup notification process, whether the target has been reached, and which evaluation target regions have reached it, are reported as evaluation information including an evaluation of the user's makeup. The evaluation may additionally be made in stages and the results reported. For example, a plurality of given threshold values may be set in stages, and different content reported each time the difference between the makeup information target value and the measured makeup information value falls to or below one of the thresholds.
In this case, for example, a 1st threshold and a smaller 2nd threshold are set. After makeup is started, at the point when the difference between the makeup information target value and the measured makeup information value falls to or below the 1st threshold, a message such as "Almost at the target!" is reported. When makeup continues and the difference falls to or below the 2nd threshold, a message such as "Target reached!" is reported.
In this way, evaluation information including a staged evaluation can be reported to the user. Such staged notification may be performed for each of a plurality of evaluation target regions, and the number of stages is not limited to two and may be larger. A sketch of this staged notification is given below.
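The staged notification could be sketched as follows; the two thresholds and the messages are the illustrative values from the example above.

```python
# Illustrative staged thresholds (0-255 RGB difference scale) and their messages.
STAGES = [
    (20.0, "Almost at the target!"),  # 1st threshold
    (10.0, "Target reached!"),        # 2nd, smaller threshold
]


def staged_messages(diff: float, reported: set[float]) -> list[str]:
    """Return the messages for every stage newly reached, reporting each stage once."""
    messages = []
    for threshold, message in STAGES:
        if diff <= threshold and threshold not in reported:
            reported.add(threshold)
            messages.append(message)
    return messages
```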
In the makeup notification process described above, when there is no change between the 1st and 2nd differences, the determination in step S36 is no and no notification relating to that determination is made. Instead, a message such as "There is no change in the makeup. Please try applying makeup to the indicated region." may be reported. Alternatively, if the 1st and 2nd differences remain unchanged for a given length of time and the determination in step S36 continues to be no, the makeup notification process may be ended.
In the makeup notification process, whether the measured makeup information value has changed is determined in step S36 by comparing the 1st difference with the 2nd difference. The present invention is not limited to this; the change may also be determined by comparing the 1st measured makeup information value itself with the 2nd measured makeup information value itself. In this case too, whether makeup was performed appropriately at the 2nd time point compared with the 1st can be determined by checking how the 2nd measured makeup information value has changed from the 1st (for example, whether the RGB values have increased or decreased) against the makeup information target value.
In the makeup notification process, whether there is an evaluation target region that has reached the target (i.e., one where makeup has been completed appropriately) is determined in step S38 by comparing the difference between the makeup information target value and the 2nd measured makeup information value (i.e., the 2nd difference) with a given threshold value. The present invention is not limited to this; the determination may also be made by comparing the 2nd measured makeup information value itself with a given threshold value.
That is, in the makeup notification process, the determinations may be made without calculating differences from the makeup information target value.
Further, the method of acquiring the makeup information target value may be different. In this case, for example, the notification device 1 performs virtual makeup before the makeup notification process described above. In the virtual makeup, the image of the user captured by the imaging unit 16 and colors corresponding to virtual cosmetics are displayed superimposed on the display unit 19, and the state of the virtual cosmetics is varied according to user operations. For example, when the virtual cosmetic is a virtual blusher, the position and shape of the applied blusher are varied according to the user's operations. Alternatively, the regions of the user's face are detected, and the position and shape of the applied blusher are automatically matched to the detected positions and shapes of those regions.
In this way, the user can perform appropriate virtual makeup to serve as the target, and the RGB values and shape of the made-up parts in that virtual makeup are acquired as the makeup information target value.
By then actually applying makeup during the makeup notification process, the actual makeup can be evaluated against the target value obtained from the virtual makeup. The user can thus obtain a makeup information target value that uses his or her own face as the template without actually applying various cosmetics (i.e., by performing only virtual makeup).
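A virtual-makeup target value might be derived by alpha-blending a cosmetic color over the pixels of an evaluation target region, as in the sketch below; the blend model and the opacity value are assumptions.

```python
import numpy as np


def virtual_makeup_target(region_pixels: np.ndarray,
                          cosmetic_rgb: tuple[int, int, int],
                          opacity: float = 0.5) -> np.ndarray:
    """Blend a virtual cosmetic color over a face region (N x 3 RGB pixels).

    The mean of the blended pixels serves as the makeup information target
    value for that region.
    """
    color = np.array(cosmetic_rgb, dtype=float)
    blended = (1.0 - opacity) * region_pixels + opacity * color
    return blended.mean(axis=0)
```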
The present invention is not limited to this, however; the imaging unit 16 may also capture an image of the user's face after it has actually been made up by an expert such as a beautician at, for example, a cosmetics shop, and the makeup information target value may be acquired from that image. In this way, a makeup information target value can be obtained that uses the user's own face, made up by an expert, as the template.
When a makeup information target value that uses the user's face as the template is acquired in this way, it can be displayed very faintly during the makeup notification process described above. For example, the makeup information target value may be displayed at a high transmittance (for example, 90 to 95%), a band with a shape corresponding to the makeup information target value may be displayed, or a makeup template image may be displayed. The user can then apply makeup while referring to a concrete target (template) during the makeup notification process. The makeup information target value can thus be used not only to evaluate makeup but also to improve the convenience of makeup for the user.
The present modification and the above-described embodiment may also be combined. That is, the physical condition notification process and the makeup notification process may be performed selectively in accordance with the user's selection operation, settings, and the like.
[Modification of the physical condition notification process]
The physical condition notification process described above is an example and is not limited to the flowchart of fig. 5.
For example, in step S16, the determination unit 112 compares the 1st physical condition information with the 2nd physical condition information and determines, based on the comparison result, whether the physical condition information has improved. Alternatively, in step S16 the determination unit 112 may compare the difference between the reference physical condition information and the 1st physical condition information with the difference between the reference physical condition information and the 2nd physical condition information, and determine how the physical condition information has changed from the comparison of these differences. Specifically, if the difference between the 2nd physical condition information and the reference physical condition information is smaller than the difference between the 1st physical condition information and the reference physical condition information, it is determined that the physical condition has improved and the determination in S16 is yes; if that difference is larger or unchanged, it is determined that the physical condition has not improved and the determination in S16 is no.
Then, in step S17, the selection unit 113 and the control unit 114 cooperate to notify the user that the physical condition information has been improved by the makeup. Furthermore, after the notification in step S17, whether the current physical condition has reached the target state (i.e., a state in which the physical condition has definitely improved) can be determined by comparing the difference between the 2nd physical condition information and the reference physical condition information with a given threshold value. When it is determined that the current state has reached the target, the physical condition notification process may be ended after a notification to that effect is made.
By modifying the process in this way, the physical condition notification process can be continued until the user's physical condition information is improved by makeup. A sketch of this reference-based determination is given below.
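The reference-based determination could be sketched as follows, assuming the physical condition information is a scalar R value; the threshold is illustrative.

```python
def physical_condition_improved(reference_r: float, first_r: float,
                                second_r: float) -> bool:
    """Step S16/S20 variant: the physical condition has improved when the 2nd
    information is closer to the reference physical condition information
    than the 1st information was."""
    return abs(second_r - reference_r) < abs(first_r - reference_r)


def reached_target_state(reference_r: float, second_r: float,
                         threshold: float = 5.0) -> bool:
    """After the S17/S21 notification: the target state is reached when the
    gap to the reference falls below an (illustrative) threshold."""
    return abs(second_r - reference_r) < threshold
```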
Similarly, in step S20, the determination unit 112 may compare the reference physical condition information with the 2nd physical condition information. In this case as well, the difference between the reference physical condition information and the 1st physical condition information may be compared with the difference between the reference physical condition information and the 2nd physical condition information, and how the physical condition information has changed may be determined from the comparison of these differences. Specifically, if the difference between the 2nd physical condition information and the reference physical condition information is smaller than the difference between the 1st physical condition information and the reference physical condition information, it is determined that the physical condition has improved and the determination in S20 is yes; if that difference is larger or unchanged, it is determined that the physical condition has not improved and the determination in S20 is no.
Then, in step S21, the selection unit 113 and the control unit 114 cooperate to notify the user that the physical condition information has been improved by the massage. Furthermore, after the notification in step S21, whether the current physical condition has reached the target state can be determined by comparing the difference between the 2nd physical condition information and the reference physical condition information with a given threshold value; when it is determined that the current state has reached the target, the physical condition notification process may be ended.
With such a modification, the physical condition notification process can be continued until the user's physical condition information is improved by the massage.
As another modification, steps S12 and S13 may be omitted. That is, the processing from step S14 onward may be performed without comparing the reference physical condition information with the 1st physical condition information.
With this modification, the physical condition notification process can be performed to further improve the user's physical condition information even when the user's physical condition is already good.
As another modification, it may be set in advance, by a user operation or the like, that only the makeup mode is performed, in which case steps S18 to S22 are not performed. That is, the care mode need not be performed. In this case, steps S12 to S15 may also be omitted, while steps S16 and S17 are modified as follows.
That is, the process may be modified so that whether the physical condition information has improved is determined in step S16, support information or evaluation information based on the result is reported in step S17, and the process then returns to step S16.
As another modification, it may be set in advance, by a user operation or the like, that only the care mode is performed, in which case steps S15 to S17 are not performed. That is, the makeup mode need not be performed. In this case, steps S18 and S19 may also be omitted, while steps S20 and S21 are modified as follows.
That is, the process may be modified so that whether the physical condition information has improved is determined in step S20, support information or evaluation information based on the result is reported in steps S21 and S22, and the process then returns to step S20.
In the physical condition notification process described above, the determinations in steps S12, S16, and S20 are made using the user's physical condition information (for example, the R value indicating the redness of the user's face). That is, it is determined whether the user's physical condition has improved.
The determinations in steps S12, S16, and S20 may instead be made using the user's makeup information (for example, the RGB values indicating the state of makeup with cosmetics of various colors). That is, it may be determined whether the user's makeup has been performed appropriately.
In this case, as described above, it may be set in advance by a user operation or the like that only the makeup mode is performed, so that steps S18 to S22 are not performed. That is, the care mode need not be performed.
As variously illustrated in this [Modification of the physical condition notification process], the physical condition notification process described above may be partially modified (including omissions and replacements), or parts of the physical condition notification process and the makeup notification process may be combined as appropriate, without departing from the gist of the above embodiment.
[Modification of the makeup notification process]
The makeup notification process described above, described with reference to the flowchart of fig. 8, is an example and is not limiting.
For example, a process may be added in which the determination unit 112 compares the difference between the measured makeup information value and the makeup information target value calculated in the first execution of step S35 (i.e., the 1st difference) with a given threshold value. If it is determined that the 1st difference is smaller than the given threshold value and every evaluation target region has already reached the target, the processing from step S36 onward is not performed and the makeup notification process ends; alternatively, only the notification of step S39 may be made before the makeup notification process ends. If it is determined that the 1st difference is larger than the given threshold value and the evaluation target regions have not reached the target, the processing from step S36 onward is performed, but the comparison between the 1st difference and the given threshold value described above need not be repeated in the subsequent flow.
With such a modification, when makeup has already been completed appropriately from the beginning, unnecessary processing such as calculating the 2nd difference and comparing the 1st and 2nd differences can be omitted. The makeup information target value used to determine whether to perform the processing from step S36 onward may be a target value determined based on a measured makeup information value of the user's face achieved in the past.
As another modification, steps S38, S39, and S40 may be omitted. In this case, instead of step S38, the determination unit 112 determines whether a given time has elapsed since the start of the makeup notification process, and steps S34 to S37 are repeated until the given time elapses. When the given time has elapsed, the selection unit 113 and the control unit 114 cooperate to make a notification: if there are evaluation target regions that have reached the target, this is reported, and if there are evaluation target regions that have not reached the target, those regions are reported. In the latter case, the difference between the makeup information target value and the 2nd measured makeup information value (i.e., the 2nd difference) may additionally be reported as the makeup achievement value for the regions that have not reached the target.
With this modification, the time for the makeup notification process can be fixed in advance, and whether appropriate makeup was completed within that time can be reported; when it was not, the makeup achievement value can be reported instead. A sketch of this time-limited variant is given below.
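The time-limited variant might look like the following sketch, again reusing the illustrative rgb_difference helper; the duration and threshold are assumptions, and the change notifications of steps S36 and S37 are omitted for brevity.

```python
import time


def time_limited_makeup_notification(regions, targets, measure, notify,
                                     duration_s: float = 300.0,
                                     threshold: float = 10.0) -> None:
    """Variant without steps S38 to S40: measure for a fixed time, then report once."""
    deadline = time.monotonic() + duration_s
    diff = {}
    while time.monotonic() < deadline:  # steps S34-S35 repeated until time is up
        for r in regions:
            diff[r] = rgb_difference(targets[r], measure(r))
    for r in regions:  # single notification when the given time has elapsed
        remaining = diff.get(r, float("inf"))
        if remaining < threshold:
            notify(f"{r}: target reached within the time limit")
        else:
            notify(f"{r}: target not reached; remaining difference {remaining:.1f}")
```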
In the makeup notification process, the determinations in steps S36 and S38 are made using the user's makeup information (for example, the RGB values indicating the state of makeup with cosmetics of various colors). That is, it is determined whether the user's makeup has been performed appropriately.
The determinations in steps S36 and S38 may instead be made using the user's physical condition information (for example, the R value indicating the redness of the user's face). That is, it may be determined whether the user's physical condition has improved.
As variously illustrated in this [Modification of the makeup notification process], the makeup notification process described above may be partially modified (including omissions and replacements), or parts of the physical condition notification process and the makeup notification process may be combined as appropriate, without departing from the gist of the above embodiment.
[Other modifications]
In the above-described embodiment, a portable self-standing mirror has been described as an example of the notification device 1 to which the present invention is applied, but the present invention is not particularly limited to this.
For example, the present invention can be applied to an electronic device incorporating a large mirror such as a full-length mirror, an electronic device incorporated in a stationary dressing table, and a mirror-shaped electronic device installed in a bathroom.
The series of processes described above can be executed by hardware or by software.
In other words, the functional configurations of fig. 4 and fig. 7 are merely examples and are not particularly limiting. That is, it suffices that the notification device 1 has a function capable of executing the series of processes as a whole; which functional blocks are used to realize this function is not limited to the examples of fig. 4 and fig. 7.
One functional block may be constituted by hardware alone, by software alone, or by a combination of the two.
The functional configuration in the present embodiment is realized by a processor that executes arithmetic processing. Processors usable in the present embodiment include not only single processors, multiprocessors, and multicore processors, but also combinations of these various processing devices with processing circuits such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
When the series of processes is executed by software, the program constituting the software is installed on a computer or the like from a network or a recording medium.
The computer may be a computer embedded in dedicated hardware, or it may be a computer capable of executing various functions by installing various programs, such as a general-purpose personal computer.
The recording medium containing such a program is constituted not only by the removable medium 100 of fig. 2, which is distributed separately from the apparatus main body in order to provide the program to the user, but also by a recording medium or the like provided to the user in a state of being pre-installed in the apparatus main body. The removable medium 100 is, for example, a magnetic disk (including a flexible disk), an optical disk, or a magneto-optical disk. The optical disk is, for example, a CD-ROM (Compact Disk-Read Only Memory), a DVD (Digital Versatile Disk), or a Blu-ray (registered trademark) Disc. The magneto-optical disk is, for example, an MD (Mini-Disk). The recording medium provided to the user in a state of being pre-installed in the apparatus main body is constituted by, for example, the ROM 12 of fig. 2 or the hard disk included in the storage unit 20 of fig. 2 and fig. 3, on which the program is recorded.
In this specification, the steps describing the program recorded in the recording medium include not only processing performed in time series in the stated order, but also processing executed in parallel or individually rather than necessarily in time series.
In this specification, the term "system" means an overall apparatus composed of a plurality of devices, a plurality of units, and the like.
The embodiments of the present invention described above are merely examples and do not limit the technical scope of the present invention. The present invention can take various other embodiments, and various changes such as omissions and replacements can be made without departing from the gist of the present invention. These embodiments and modifications are included in the scope and gist of the invention described in this specification and the like, and are included in the invention described in the claims and their equivalents.

Claims (8)

1. A notification device is characterized by comprising:
a face information acquisition unit that acquires face information of a user;
a notification unit that, when it is determined based on a face color in the face information that the physical condition of the user is poor, makes a notification by controlling a light emission color of a light emitting unit so as to highlight and notify the physical condition information; and
a control unit that performs control such that, when a difference between the face color in the face information at the time of the notification and the face color after the physical condition information has changed as a result of a given process being performed, after the notification, on the face corresponding to the face information is equal to or greater than a given value, the light emission color of the light emitting unit is changed from the color used before the given process was performed.
2. The notification device according to claim 1,
wherein the control unit causes the notification unit to notify information prompting a given action corresponding to the face information.
3. The notification device according to claim 1,
wherein the face information acquisition unit acquires, as the face information, an image of a subject from which the face information is to be acquired, and
the determination is made by determining, based on the image of the subject, a change between the face information before the notification and the face information after the notification.
4. The notification device according to any one of claims 1 to 3,
wherein the notification unit also makes notifications using characters, and
the control unit controls the characters notified by the notification unit based on a result of the determination.
5. The notification device according to any one of claims 1 to 3,
wherein the notification unit also makes notifications using characters, and
the control unit controls a display position of the characters notified by the notification unit based on a result of the determination.
6. The notification device according to claim 1,
wherein the notification device comprises a display unit that displays the subject from which the face information is acquired, and
the display unit is a mirror or an image display unit.
7. A notification method, comprising:
a face information acquisition step of acquiring face information of a user;
a notification step of, when it is determined based on a face color in the face information that the physical condition of the user is poor, making a notification by controlling a light emission color of a light emitting unit so as to highlight and notify the physical condition information; and
a control step of performing control such that, when a difference between the face color in the face information at the time of the notification and the face color after the physical condition information has changed as a result of a given process being performed, after the notification, on the face corresponding to the face information is equal to or greater than a given value, the light emission color of the light emitting unit is changed from the color used before the given process was performed.
8. A recording medium storing a program for causing a computer to realize:
a face information acquisition function for acquiring face information of a user;
a notification function of, when it is determined based on a face color in the face information that the physical condition of the user is poor, making a notification by controlling a light emission color of a light emitting unit so as to highlight and notify the physical condition information; and
a control function of performing control such that, when a difference between the face color in the face information at the time of the notification and the face color after the physical condition information has changed as a result of a given process being performed, after the notification, on the face corresponding to the face information is equal to or greater than a given value, the light emission color of the light emitting unit is changed from the color used before the given process was performed.
CN201910025846.3A 2018-01-11 2019-01-10 Notification device, notification method, and recording medium Active CN110025116B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018002653 2018-01-11
JP2018-002653 2018-01-11
JP2018231743A JP7225749B2 (en) 2018-01-11 2018-12-11 Notification device, notification method and program
JP2018-231743 2018-12-11

Publications (2)

Publication Number Publication Date
CN110025116A CN110025116A (en) 2019-07-19
CN110025116B true CN110025116B (en) 2022-08-23

Family

ID=67140239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910025846.3A Active CN110025116B (en) 2018-01-11 2019-01-10 Notification device, notification method, and recording medium

Country Status (2)

Country Link
US (1) US11191341B2 (en)
CN (1) CN110025116B (en)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3582458B2 (en) * 2000-06-07 2004-10-27 花王株式会社 Makeup advice system
JP4056443B2 (en) 2003-08-21 2008-03-05 Necフィールディング株式会社 Health checkup system and program
JP4809056B2 (en) * 2005-12-28 2011-11-02 株式会社 資生堂 Face classification device for cheek makeup, face classification program, and recording medium on which the program is recorded
JP2013182062A (en) 2012-02-29 2013-09-12 Nikon Corp Display device and projection device
JP5754439B2 (en) 2012-12-21 2015-07-29 カシオ計算機株式会社 Information notification apparatus, information notification method, and program
US10430985B2 (en) * 2014-03-14 2019-10-01 Magic Leap, Inc. Augmented reality systems and methods utilizing reflections
CN104683692B (en) * 2015-02-04 2017-10-17 广东欧珀移动通信有限公司 A kind of continuous shooting method and device
JP6467966B2 (en) 2015-02-13 2019-02-13 オムロン株式会社 Health care assistance device and health care assistance method
US20160357578A1 (en) * 2015-06-03 2016-12-08 Samsung Electronics Co., Ltd. Method and device for providing makeup mirror
US10628663B2 (en) * 2016-08-26 2020-04-21 International Business Machines Corporation Adapting physical activities and exercises based on physiological parameter analysis
CN106529445A (en) * 2016-10-27 2017-03-22 珠海市魅族科技有限公司 Makeup detection method and apparatus
CN107358025A (en) * 2017-06-12 2017-11-17 美的集团股份有限公司 Control method, control device, Intelligent mirror and computer-readable recording medium
US10646022B2 (en) * 2017-12-21 2020-05-12 Samsung Electronics Co. Ltd. System and method for object modification using mixed reality
CN111788623B (en) * 2018-01-06 2023-02-28 凯尔Os公司 Intelligent mirror system and using method thereof

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005044283A (en) * 2003-07-25 2005-02-17 Seiko Epson Corp Cosmetics guidance system, server apparatus, terminal device and program
CN104203042A (en) * 2013-02-01 2014-12-10 松下电器产业株式会社 Makeup application assistance device, makeup application assistance method, and makeup application assistance program
CN104205162A (en) * 2013-02-01 2014-12-10 松下电器产业株式会社 Makeup application assistance device, makeup application assistance method, and makeup application assistance program
EP2962597A1 (en) * 2013-02-28 2016-01-06 Panasonic Intellectual Property Management Co., Ltd. Makeup assistance device, makeup assistance method, and makeup assistance program
CN105188466A (en) * 2013-03-22 2015-12-23 松下知识产权经营株式会社 Makeup support device, makeup support method, and makeup support program
CN105188467A (en) * 2013-03-22 2015-12-23 松下知识产权经营株式会社 Makeup support device, makeup support method, and makeup support program
CN103690149A (en) * 2013-12-30 2014-04-02 惠州Tcl移动通信有限公司 Mobile terminal for recognizing physical conditions by facial photographing and implementing method for mobile terminal
US9814297B1 (en) * 2017-04-06 2017-11-14 Newtonoid Technologies, L.L.C. Cosmetic applicator

Also Published As

Publication number Publication date
CN110025116A (en) 2019-07-19
US11191341B2 (en) 2021-12-07
JP2023057107A (en) 2023-04-20
US20190208894A1 (en) 2019-07-11

Similar Documents

Publication Publication Date Title
JP6288404B2 (en) Makeup support device, makeup support method, and makeup support program
KR102008023B1 (en) Web site providing cosmetic and nutrition regimen from color images
JP7248820B2 (en) Apparatus and method for determining cosmetic skin attributes
JP7235895B2 (en) Apparatus and method for visualizing cosmetic skin characteristics
US20120044335A1 (en) Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program
EP1189536A1 (en) Skin imaging and analysis systems and methods
EP3518710B1 (en) Apparatus and method for supporting at least one user in performing a personal care activity
JP2008003724A (en) Cosmetics simulation system
CN112741609B (en) Electronic device, control method of electronic device, and medium
KR20210084102A (en) Electronic apparatus, scalp care system and method for controlling the electronic apparatus and the server
JP2012152389A (en) Furniture with mirror
JP6530703B2 (en) Skin condition evaluation method
CN110025116B (en) Notification device, notification method, and recording medium
JP2019114251A (en) Method for face feature analysis and transmission of personal advice and system
JP2006081847A (en) Skin analysis network system
JP7225749B2 (en) Notification device, notification method and program
JP2013178789A (en) Beauty simulation system
CN110297720B (en) Notification device, notification method, and medium storing notification program
JP7135466B2 (en) Display device, display method and display program
JP6826733B2 (en) Signal control device, signal control program, and signal control method
JP7485114B2 (en) Notification device, notification method, and program
JP7207857B2 (en) How to provide a facial beauty treatment
CN117440849A (en) Skin care device
JP2021151304A (en) Electronic device, control program for electronic device, and control method for electronic device
JP2005235137A (en) System, method and program for objectively evaluating face

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant