KR101753633B1 - Terminal device and method for comparing skin dignosis results using thereof - Google Patents

Terminal device and method for comparing skin dignosis results using thereof

Info

Publication number
KR101753633B1
KR101753633B1 (application number KR1020150109116A)
Authority
KR
South Korea
Prior art keywords
skin diagnosis
skin
user
diagnosis result
results
Prior art date
Application number
KR1020150109116A
Other languages
Korean (ko)
Other versions
KR20170014992A (en)
Inventor
윤정한
김성민
신현진
조세나
명고운
Original Assignee
주식회사 엘지유플러스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 엘지유플러스
Priority to KR1020150109116A
Publication of KR20170014992A
Application granted
Publication of KR101753633B1

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/442Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/444Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • G06F19/3431

Abstract

The present invention discloses a terminal apparatus, and a control method thereof, that vary the information provided to a user according to a time zone, a login status, and the like. To this end, a skin diagnosis result comparison method includes: photographing a user; analyzing the skin condition of the user based on the photographed photograph; and storing the photographed photograph together with the skin diagnosis result obtained by analyzing the user's skin condition. At this time, when a first skin diagnosis result and a second skin diagnosis result are selected from among a plurality of skin diagnosis results accumulated over a predetermined period, a comparison screen comparing the first skin diagnosis result and the second skin diagnosis result is output.

Description

TECHNICAL FIELD [0001] The present invention relates to a terminal device and a skin diagnosis result comparison method using the same.

The present invention relates to a terminal device capable of comparing a plurality of skin diagnosis results and a comparison method using the same.

With increasing interest in cosmetics, public interest in skin care is also increasing. Each individual is trying to maintain optimal skin condition, either by receiving medical treatment or using cosmetics to prevent skin aging or skin troubles.

However, visiting a dermatologist or a skin care center requires considerable expense and effort, and with the wide variety of cosmetics available for preventing skin aging, there is the problem that it is difficult to find a product suited to one's own skin.

Accordingly, if the skin condition can be checked from photographs taken with the camera of a now widely available mobile terminal or personal computer, each individual can check his or her skin condition regardless of time and place.

However, if the skin condition check is completed only once, it is difficult for the user to understand whether his or her skin condition is improving or deteriorating. Accordingly, there is a growing need to provide a user interface that can inform the user of changes in skin condition.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to provide a terminal device and a control method therefor which improve user convenience.

Specifically, the present invention provides a terminal device capable of providing and comparing skin conditions of a user over time.

The technical objects to be achieved by the present invention are not limited to the above-mentioned objects, and other objects not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a method of comparing skin diagnosis results, comprising: photographing a user; analyzing the skin condition of the user based on the photographed photograph; and storing the photographed photograph and the skin diagnosis result obtained by analyzing the user's skin condition. At this time, when at least one skin diagnosis result among a plurality of skin diagnosis results accumulated over a predetermined period is selected, a comparison screen for comparing a first skin diagnosis result and a second skin diagnosis result may be output.

According to another aspect of the present invention, there is provided a terminal apparatus including: a display unit; a wireless communication unit for communicating with a server; a camera for photographing; and a controller that, when a photograph of the user is taken through the camera, analyzes the skin condition of the user based on the photographed photograph and controls the photographed photograph and the analyzed skin condition to be stored in the server. At this time, when at least one skin diagnosis result is selected while a plurality of skin diagnosis results accumulated over a predetermined period are being output, a comparison screen for comparing a first skin diagnosis result and a second skin diagnosis result may be output through the display unit.

The technical solutions obtained by the present invention are not limited to the above-mentioned solutions, and other solutions not mentioned can be clearly understood by those skilled in the art from the following description.

The present invention has been made in order to solve the above problems, and it is an object of the present invention to provide a terminal device and a control method thereof for improving user convenience.

Specifically, the present invention provides a terminal device capable of providing and comparing skin conditions of a user over time.

The effects obtained by the present invention are not limited to the above-mentioned effects, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.

FIGS. 1 and 2 are views showing a terminal device.
FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention.
FIG. 4 is a diagram showing an example in which the first light source unit and the second light source unit are disposed.
FIG. 5 is a block diagram of a terminal device based on electronic components that can be mounted in a terminal device.
FIGS. 6 and 7 are diagrams showing an example in which user log-on proceeds.
FIG. 8 is a diagram illustrating an example in which user information is input through a setting screen.
FIG. 9 is a diagram showing an example in which a photograph for skin diagnosis is taken.
FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.
FIG. 11 is a diagram showing another example in which a skin diagnosis result is output.
FIG. 12 is a flowchart for explaining the operation of the terminal device according to the present invention.
FIG. 13 is a diagram showing an example in which a statistical result is output.
FIG. 14 is a diagram showing an example in which a detailed skin diagnosis result is output.
FIG. 15 is a diagram showing an example in which a photo statistics result is output.
FIG. 16 is a diagram showing an example in which a detailed skin diagnosis result is output.
FIG. 17 is a diagram showing an example in which a comparison screen is output.

Hereinafter, a terminal device related to the present invention will be described in detail with reference to the drawings.

The suffix "module" and " part "for the components used in the following description are given or mixed in consideration of ease of specification, and do not have their own meaning or role.

The embodiments described below can be applied to various electronic apparatuses. That is, the makeup recommendation method according to the present invention can be applied not only to mobile terminals carried by a user, such as mobile phones, smart phones, digital cameras, PDAs, laptops, and MP3 players, but also to fixed terminals that are normally used in a fixed position. For convenience of explanation, in the following embodiments, the electronic device to which the cosmetic recommendation method according to the present invention can be applied is exemplified by a terminal device capable of storing cosmetics.

FIGS. 1 and 2 are views showing a terminal device according to the present invention. Referring to the example shown in FIG. 1, the mirror 120 may be exposed on the front surface of the terminal device. At this time, at least a part of the mirror 120 can be utilized as the display unit 125 that outputs information. Accordingly, the user can check various information output through the display unit 125 while viewing his or her own image in the mirror 120.

The terminal device may include a light source unit 140. The light source unit 140 may be disposed to illuminate the user's face when the user is positioned in front of the terminal device. As the user's face becomes brighter, the user's skin tone and makeup state reflected in the mirror can be seen more clearly. In FIGS. 1 and 2, the light source unit 140 is located at the left and right edges of the terminal device, but the position of the light source unit 140 is not limited to the illustrated example.

On the rear surface of the mirror 120, a case 110 on which electronic components can be mounted may be located. As will be described later, the case 110 may include a front case, a rear case, and a main body. Electronic parts such as a communication unit, a camera, a memory, and a control unit can be mounted in the case 110. The camera 130 mounted in the case 110 may photograph the area in front of the display unit. To this end, it is preferable that the rear portion of the mirror 120 (i.e., the surface abutting the case 110) is capable of transmitting light.

A part of the terminal device may be equipped with a cosmetic refrigerator for storing cosmetics. A mirror may be attached to the door 150 of the cosmetic refrigerator, but this is not necessarily so.

Hereinafter, for convenience of explanation, the portion of the terminal device corresponding to the cosmetic refrigerator will be referred to as a refrigerator area 20, and the remaining area excluding the area occupied by the cosmetic refrigerator will be referred to as a terminal device area 10.

FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention. Referring to FIG. 3, a mirror 310 may be disposed on the front surface of the terminal device. Reflecting portions 312 for reflecting light generated by the light source unit may be coupled to both ends of the mirror 310. The reflecting portion 312 reflects a part of the light generated by the light source unit; by reflecting a part of this light, glare caused by the generated light can be reduced.

The transmittance and reflectance of the mirror 310 are adjusted so that one side can operate as a reflective surface through which the opposite side is not visible, and the other side can operate as a transmissive surface through which the opposite side is visible. For example, when the transmittance of the mirror 310 is adjusted to 70% and the reflectance to 30%, the opposite side is not visible when the mirror 310 is viewed from one side, but the opposite side is visible when the mirror 310 is viewed from the other side.

At this time, the reflecting surface of the mirror 310 is preferably exposed to the outside of the terminal device. This allows the user to see his or her face reflected in the terminal device.

It is preferable that the transmission surface of the mirror 310 is disposed so as to face the inside of the terminal apparatus. In this case, even if a camera is installed in the terminal device, a user facing the mirror 310 can be photographed through the camera.

The mirror 310 may be manufactured by optically depositing a non-metallic material such as SiO2 or CIO2. When a touch screen panel is attached to the mirror 310 made of a non-metallic material, a part of the mirror 310 may function as a display unit that outputs information and receives touch input.

The mirror 310 may be partitioned into a first mirror part 310-1 forming the terminal device area 10 and a second mirror part 310-2 forming the refrigerator area 20.

A part of the back surface of the first mirror part 310-1 can engage with the front case 320. An opening 324 for coupling with the touch screen panel 350 may be formed in a part of the front case 320.

When the touch screen panel 350 is coupled to the opening 324 of the front case 320, the touch screen 350 and the mirror 310 directly come into contact with each other through the opening 324. Accordingly, a part of the entire area of the first mirror part 310-1 which directly contacts the touch screen panel 350 can function as the display part 125 that outputs information or receives the touch input.

A groove 322 for seating the light source unit 330 may be formed at one end of the front case 320. When the front case 320 is coupled with the mirror 310, the reflecting portion 312 may be positioned over the groove 322. Accordingly, a part of the light generated by the light source unit 330 seated in the groove 322 will be reflected by the reflecting portion 312.

A door 340 for opening and closing the refrigerating chamber 364 may be coupled to the rear surface of the second mirror part 310-2. At this time, a heat insulating material (for example, styrofoam) 345 may be inserted between the second mirror part 310-2 and the door 340 to keep the refrigerating chamber 364 at a constant temperature. A groove 347 for seating the light source unit 330 may be formed at one end of the heat insulating material 345 or the door 340. When the mirror 310 is attached over the heat insulating material 345 and the door 340, the reflecting portion 312 may be positioned over the groove 347. Accordingly, a part of the light generated by the light source unit 330 seated in the groove 347 will be reflected by the reflecting portion 312. In FIG. 3, the groove 347 for seating the light source unit 330 is illustrated at one end of the heat insulating material 345.

The light source unit 330 may include a light emitting element capable of emitting light, such as an LED (Light Emitting Diode), an incandescent lamp, or a fluorescent lamp. The turning on and off of the light source unit may be controlled by the control unit, or may be controlled by manual input (for example, operation of a switch for turning the light source unit on and off). Using the light emitted from the light source unit, the user will be able to apply makeup more easily.

The light source unit 330 may include a first light source unit emitting light of a first color and a second light source unit emitting light of a second color. For example, the first light source unit may be turned on while the user applies makeup, and the second light source unit may be turned on after the user's makeup is finished. The first light source unit and the second light source unit may emit light of different colors, such as daylight, white, or warm white.

The light emitting elements constituting the first light source unit and the light emitting elements constituting the second light source unit may be arranged alternately, or the array constituting the first light source unit and the array constituting the second light source unit may be arranged side by side.

For example, FIG. 4 shows an example in which the first light source unit and the second light source unit are disposed. As in the example shown in FIG. 4(a), the light emitting elements constituting the first light source unit and the light emitting elements constituting the second light source unit may be arranged alternately one by one, or, as in the example shown in FIG. 4(b), the array constituting the first light source unit and the array constituting the second light source unit may be arranged side by side.

The first light source unit and the second light source unit may also be disposed in a manner different from the illustrated examples. For example, the light emitting elements constituting the first light source unit and the light emitting elements constituting the second light source unit may be arranged alternately at a ratio of N:M (where N and M are different natural numbers), or the first light source unit and the second light source unit may be disposed at different positions. For example, the first light source unit may be disposed in the front case and the second light source unit may be disposed in the door.
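For illustration only (this sketch is not part of the patent disclosure), the alternating arrangements described above can be expressed as a small layout routine. The function name, the "first"/"second" labels, and the idea of returning a left-to-right layout list are assumptions made for the example.

```python
# Hypothetical sketch: interleave two groups of light emitting elements at an N:M ratio.
def interleave_light_sources(count_first: int, count_second: int, n: int, m: int) -> list:
    """Return a left-to-right layout, alternating N elements of the first light
    source group with M elements of the second group until both are placed."""
    layout = []
    remaining_first, remaining_second = count_first, count_second
    while remaining_first > 0 or remaining_second > 0:
        take_first = min(n, remaining_first)
        layout.extend(["first"] * take_first)
        remaining_first -= take_first
        take_second = min(m, remaining_second)
        layout.extend(["second"] * take_second)
        remaining_second -= take_second
    return layout

print(interleave_light_sources(4, 4, 1, 1))  # 1:1 alternation, as in Fig. 4(a)
print(interleave_light_sources(4, 2, 2, 1))  # a 2:1 alternation, one possible N:M variant
```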

The front case 320 and the door 340 may be coupled to the body 360. The body 360 determines the overall shape of the terminal device and functions to mount electronic components. Specifically, the body 360 may be provided with a storage box 362 for storing various electronic components for operating a terminal device, and a refrigerating chamber 364 for storing cosmetics. The storage box 362 may be protected by the front case 320 and the refrigerator compartment 364 may be protected by the door 340. At this time, the door 340 can be hinged to be rotatable in association with the body 360.

Electronic parts such as a touch screen, a communication unit, a memory, a control unit, and a camera can be mounted on the storage box 362. At this time, a part of the front case corresponding to the position where the camera is to be disposed may be perforated (326) so that the camera can shoot the outside of the mirror 310.

The refrigerating chamber 364 may be formed to protrude toward the rear of the body 360. A thermal management unit 380 can be coupled to the protruding portion of the refrigerating chamber 364. The thermal management unit 380 may supply cold air to the refrigerating chamber 364 and discharge the generated heat. For example, the thermal management unit 380 may include a thermoelectric element. When current is applied, one side of the thermoelectric element forms a low-temperature part and the other side forms a high-temperature part, and cold air can be supplied to the refrigerating chamber 364 through the low-temperature part of the thermoelectric element.

In order to dissipate the heat of the high-temperature part of the thermoelectric element, the thermal management unit 380 may include a heat sink. For this purpose, it is preferable that the heat sink is made of a material that transfers heat easily so that it can receive, by conduction, the heat generated in the high-temperature part. That is, the heat sink may be made of a metal material or the like.

FIG. 5 is a block diagram of a terminal device based on the electronic components that can be mounted in a terminal device. Referring to FIG. 5, a terminal device according to the present invention may include a communication unit 510, a display unit 520, a camera 530, a light source unit 540, a memory 550, a sensing unit 560, and a control unit 570.

The communication unit 510 plays a role of receiving data from outside or transmitting data to the outside. Through the communication unit 510, the terminal device can communicate with an external server or an external terminal.

A short-range communication technology (e.g., Bluetooth, Zigbee, or NFC (Near Field Communication)), a mobile communication technology (e.g., LTE (Long Term Evolution) or HSDPA (High Speed Downlink Packet Access)), a wireless LAN technology (e.g., WLAN (Wireless Local Area Network) or Wi-Fi (Wireless Fidelity)), or a wired communication technology may be applied to the communication unit 510.

The display unit 520 outputs information under the control of the control unit 570. At this time, the display unit 520 may be formed on at least a part of the mirror. In this case, the mirror 120 serves to show the user his or her reflection using incident light and, at the same time, serves to output information through at least a part of its area (i.e., the area where the display unit 520 is combined).

The display unit 520 functions as an output device for outputting information and can function as an input device for receiving a touch input.

The camera 530 can photograph the area in front of the mirror 120. For example, the camera 530 may photograph a user looking at the mirror 120. The front side of the mirror 120 (i.e., the side facing the user) reflects light, while the back side (i.e., the side facing the case) can have a translucent form that allows light to pass through. As the reflective surface is exposed to the outside, the user can see his or her appearance in the mirror, and as the transmissive surface faces the camera 530, the camera 530 can photograph the outside of the terminal device.

The image data collected through the camera 530 may be analyzed and processed into user control commands. In addition, the control unit 570 may analyze the video signal input to the camera 530 to determine the user's face shape, lip shape, eye shape, skin tone, skin type, and the like. At this time, in order to reduce power consumption by the camera, the control unit may keep the camera in an inactive state and control the camera to be activated only when the user is close to the terminal device.

The light source unit 540 can emit light. The light source unit 540 may include a light emitting element such as an LED (Light Emitting Diode), an incandescent lamp, or a fluorescent lamp. The turning on and off of the light source unit 540 may be controlled by the control unit 570, or may be controlled by manual input (for example, operation of a switch for turning the light source unit 540 on and off). For example, the control unit may keep the light source unit in the off state and turn it on when the user is close to the terminal device, when the user is about to be photographed through the camera, or when it is determined that the user intends to start applying makeup.

The memory 550 stores data supporting various functions of the terminal device. The memory 550 may store a plurality of application programs, for example an application for skin analysis, and various data used to drive the terminal device. In addition, the memory 550 may store moving images for guiding makeup. At this time, the moving images can be classified and stored according to a predetermined criterion.

The sensing unit 560 may sense an object approaching the terminal device. When it is detected that a user is close to the terminal device, the control unit 570 can activate the camera 530, analyze the image input through the camera 530, and determine whether the nearby user intends to use the terminal device.

The control unit 570 controls the overall operation of the terminal device. The control unit 570 can analyze the user's face region in the image input through the camera 530, extract moving images suited to the analysis result, and output a moving image list including the extracted moving images. The control unit 570 also processes signals, data, information, and the like input or output through the above-described components, or drives an application program stored in the memory 550, thereby providing or processing appropriate information or functions for the user.

Based on the above description, the operation of the terminal device according to the present invention will now be described in detail.

In order to use the service provided by the terminal device according to the present invention, the user can first try to log on (or log in) through the registered account. Specifically, when account information such as an ID and a password for using a service provided by the terminal device is input, the control unit can attempt to log on the user based on the inputted account information. The control unit may transmit the user ID and the password to the skin management server and receive the user login result from the skin management server.

FIGS. 6 and 7 are diagrams showing an example in which user log-on proceeds.

When the terminal apparatus is started, the control unit can first control a plurality of guidance messages, which introduce the services provided by the terminal apparatus, to be sequentially output, as in the example shown in FIG. 6.

FIGS. 6(a) to 6(c) illustrate that messages indicating that services such as skin diagnosis, skin condition management, skin care methods, and customized cosmetic recommendation can be provided are sequentially output.

While any one of the plurality of guidance messages is being output, when a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or when a predetermined time has elapsed since the guidance message was output, the control unit can control the guidance message of the next order to be output.

For example, as in the example shown in FIG. 6(a), while a first guidance message indicating that a skin diagnosis service can be provided is being output, when a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or when a predetermined time has elapsed since the first guidance message was output, the control unit can stop outputting the first guidance message and, as in the example shown in FIG. 6(b), control a second guidance message indicating that a skin condition management service can be provided to be output.
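As a purely illustrative sketch (not taken from the patent), the rotation of guidance messages described above amounts to advancing an index either on a swipe input or after a display timeout. The message texts, the 5-second timeout, and the function name are assumptions.

```python
# Hypothetical guidance messages and per-message display time.
GUIDANCE_MESSAGES = [
    "Skin diagnosis service is available.",
    "Skin condition management service is available.",
    "Skin care tips and customized cosmetic recommendations are available.",
]
DISPLAY_SECONDS = 5.0

def next_message_index(current: int, swiped: bool, shown_since: float, now: float) -> int:
    """Advance to the next guidance message on a swipe or after the timeout."""
    if swiped or (now - shown_since) >= DISPLAY_SECONDS:
        return (current + 1) % len(GUIDANCE_MESSAGES)
    return current

print(next_message_index(0, swiped=False, shown_since=0.0, now=2.0))  # 0: keep showing
print(next_message_index(0, swiped=True, shown_since=0.0, now=2.0))   # 1: advance on swipe
```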

When all the guidance messages have been output, the output of the guidance messages is stopped, and when a user input requesting the start of a service (for example, a touch input touching the start button shown in FIG. 6) is received, a log-on screen can be output as in the example shown in FIG. 7.

When the account information such as the ID and the password is inputted, the control unit can request the user to log on to the skin diagnosis server based on the inputted account information.

If the user is not yet a member, the user can touch the member registration button 710 to proceed with the member registration procedure. Since the membership registration procedure is generally known, a detailed description thereof will be omitted.

When the user logs on to the terminal device for the first time, the control unit can control to output a setting screen for receiving user information.

For example, FIG. 8 shows an example in which user information is input through a setting screen. First, as in the example shown in FIG. 8(a), the control unit may output a screen requesting the user's profile image. The face registration request screen may include a photographing button 810 for photographing the user through the camera and a gallery button 820 for selecting one of the photographs already taken.

When a photograph is taken via the photographing button 810, or a previously taken photograph is selected via the gallery button 820, the control unit can register the photographed or selected photograph as the user's profile image, as in the example shown in FIG. 8(b).

If the profile image associated with the user account is already registered, the profile image registration procedure shown in Figs. 8A and 8B may be omitted.

Thereafter, the control unit may control to output a selection screen for acquiring skin information of the user.

FIGS. 8(c) to 8(e) illustrate a screen for selecting the user's skin type, a screen for selecting a skin tone, and a screen for selecting skin concerns. Based on the user input on each screen, the skin information of the user can be obtained.

When the acquisition of the user information is completed, the control unit can provide skin diagnosis information to the logged-in user.

However, the log-on process is not a necessary procedure for receiving a service provided by the terminal device. For example, the control unit may provide a skin diagnosis service to a user who is not logged on through a guest account.

When the skin diagnosis service is activated by user input, the control unit can diagnose the user's skin condition based on the user's face photographed through the camera. Specifically, when the user's face is recognized through the camera, the control unit can perform skin diagnosis for skin troubles, pores, dullness, wrinkles, and skin tone based on the recognized face.

For example, FIG. 9 is a view showing an example in which a photograph for skin diagnosis is taken.

When the skin diagnosis service is activated, the control unit can output a preview image input through the camera on the display unit, and control a guide line 910 for guiding the position of the user's face to be displayed on the preview image. In FIG. 9(a), the guide line 910 is shown as a dotted line.

When it is recognized that the user's face is located within the guide line 910 and a predetermined time has elapsed without the face leaving the guide line, the control unit can take a picture. For example, the control unit may take a picture after three seconds have elapsed since the user's face was recognized within the guide line 910. At this time, if the user's face leaves the guide line 910 while the three seconds are being counted, the control unit may stop counting and wait until the user's face re-enters the guide line 910.
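A minimal sketch of this capture countdown, under the 3-second example given above, might look as follows; the class and method names and the frame-by-frame update loop are assumptions, not the patent's implementation.

```python
CAPTURE_DELAY_SECONDS = 3.0  # from the 3-second example in the text

class CaptureCountdown:
    """Signals a capture once the face has stayed inside the guide line long enough;
    leaving the guide line resets the countdown."""

    def __init__(self):
        self.inside_since = None  # timestamp at which the face entered the guide line

    def update(self, face_inside_guide_line: bool, now: float) -> bool:
        if not face_inside_guide_line:
            self.inside_since = None           # face left the guide line: stop counting
            return False
        if self.inside_since is None:
            self.inside_since = now            # face just entered: start counting
        return (now - self.inside_since) >= CAPTURE_DELAY_SECONDS  # True -> take the picture

countdown = CaptureCountdown()
print(countdown.update(True, now=0.0))   # False: counting has just started
print(countdown.update(False, now=2.0))  # False: face left, counter reset
print(countdown.update(True, now=3.0))   # False: counting restarted
print(countdown.update(True, now=6.5))   # True: 3 seconds inside the guide line
```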

When the photograph has been taken, the control unit can diagnose the user's skin condition based on the photographed photograph. For the skin diagnosis, the control unit can set skin diagnosis areas in the photographed photograph. For example, a cheek, the T zone, the forehead, or the jaw may be set as a skin diagnosis area. The control unit can quantify the user's skin condition for one or more evaluation items through analysis of the skin diagnosis areas.

During the skin analysis, the control unit can control a message 920 indicating that skin diagnosis is in progress to be output, as in the example shown in FIG. 9(b). If a stop button 930 for stopping the skin diagnosis is touched, the control unit can stop the skin analysis.

When the skin diagnosis is completed, the control unit may quantify the results of at least one diagnosis item and provide the skin diagnosis result to the user.

For example, FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.

When the skin analysis is completed, the control unit can control the evaluation value for each analysis item to be output, as in the example shown in FIG. 10. In FIG. 10, evaluation values for five analysis items, namely pores 1001, troubles 1003, dullness 1005, skin tone 1007, and wrinkles 1009, are illustrated as being output. The control unit may calculate a comprehensive score based on the average of the skin diagnosis results for the plurality of items.
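As a small illustrative sketch (assumed item names, not the patent's code), the comprehensive score described above can be taken as the plain average of the per-item evaluation values:

```python
def comprehensive_score(evaluations: dict) -> float:
    """Average the per-item evaluation values into a single comprehensive score."""
    return sum(evaluations.values()) / len(evaluations)

# Hypothetical evaluation values for the five analysis items mentioned above.
diagnosis = {"pores": 72, "troubles": 85, "dullness": 64, "skin_tone": 78, "wrinkles": 81}
print(round(comprehensive_score(diagnosis), 1))  # 76.0
```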

The control unit can also output weather information 1010, time information 1020, and the like. Although not shown, the weather information may include UV index information and precipitation information along with weather and temperature information. In addition, the control unit may also output information 1030 of the logged-in user (in FIG. 10, this is illustrated as the user's photograph being output).

Although not shown, the control unit may further output a photograph of the user photographed for skin diagnosis on the skin diagnosis result screen.

The control unit can also control figure objects 1041, 1043, 1045, 1047, and 1049 representing the respective analysis items to be output. The name of the corresponding analysis item may be mapped to each figure object.

At this time, the size of the graphic object representing the analysis item and the color of the graphic object may vary depending on the evaluation value. For example, the larger the value of the evaluation value, the larger the size of the figure.

FIG. 11 is a diagram showing another example in which a skin diagnosis result is output.

The control unit may display a reference figure 1110 in which each skin diagnosis item is located at a vertex, and reflect the evaluation score of each skin diagnosis item as the distance from the center of the reference figure to the corresponding vertex. For example, in FIG. 11, a regular pentagon 1110 is displayed as the reference figure, and the distance from the center of the regular pentagon to each vertex reflects the evaluation score for the corresponding skin diagnosis item.
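For illustration (not part of the disclosure), placing each diagnosis item at a vertex of a regular polygon and scaling the plotted distance from the center by the item's score can be sketched as follows; the 0-100 score scale, the starting angle, and the item names are assumptions.

```python
import math

def radar_points(scores: dict, max_score: float = 100.0, radius: float = 1.0):
    """Return (item, x, y) points; the distance from the origin encodes the score."""
    items = list(scores)
    points = []
    for i, item in enumerate(items):
        angle = math.pi / 2 + 2 * math.pi * i / len(items)  # first vertex at the top
        r = radius * scores[item] / max_score
        points.append((item, r * math.cos(angle), r * math.sin(angle)))
    return points

scores = {"pores": 72, "troubles": 85, "dullness": 64, "skin_tone": 78, "wrinkles": 81}
for item, x, y in radar_points(scores):
    print(f"{item}: ({x:.2f}, {y:.2f})")
```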

When the skin diagnosis is performed while the user is logged on, the control unit can store the user's skin diagnosis result in association with the user's account information. Specifically, the control unit can transmit the user's skin diagnosis result to the skin diagnosis server. At this time, the skin diagnosis result transmitted to the skin diagnosis server may include the photograph used for the skin diagnosis, the evaluation value for each item, the comprehensive value, and the like. Accordingly, the user can at any time inquire about the skin diagnosis results associated with his or her account accumulated over a predetermined period.

Alternatively, when the skin diagnosis is performed while the user is not logged on, the control unit can discard the user's skin diagnosis result without storing it in the server. Accordingly, the user will not be able to later inquire about the results of skin diagnoses performed without being logged on.

Hereinafter, a method of outputting a comparison result of a plurality of diagnostic results when the user diagnoses skin multiple times through the terminal device will be described in detail.

FIG. 12 is a flowchart for explaining the operation of the terminal device according to the present invention.

When the skin diagnosis is performed through the terminal device according to the present invention in the logged-on state, the control unit can store the user's skin diagnosis result in the skin diagnosis server in association with the user account. If at least one skin diagnosis result is stored in the skin diagnosis server, the control unit may control the display unit to output a statistical result of the skin diagnosis results for a predetermined period (or a predetermined number of times) (S1210).

For example, FIG. 13 shows an example in which a statistical result is output.

When a user input requesting output of the accumulated skin diagnosis results is received from the user, the control unit can control a graph accumulating the skin diagnosis results over a predetermined period to be output, as in the example shown in FIG. 13(a). Specifically, the control unit requests from the skin diagnosis server the skin diagnosis results corresponding to the set statistical period, and can create a graph such as that shown in FIG. 13(a) based on the skin diagnosis results for the statistical period received from the skin diagnosis server.

At this time, the statistical period can be changed by the user. Referring to FIG. 13, when the period button 1310 is touched, the control unit can output a menu 1320 in which the statistical period can be set, as in the example shown in FIG. 13(b). If a new statistical period is selected from the menu 1320, the control unit may output a graph accumulating the skin diagnosis results for the newly selected period.

The control unit can plot the graph based on the comprehensive score or on the evaluation score of an individual item. While a graph for one item is being output, if another item is selected by user input, the control unit can output a graph based on the newly selected item.

For example, in (a) of FIG. 13, as the comprehensive score item 1330 is selected, it is illustrated that a graph created based on the comprehensive score is output. In (c) of FIG. 13, as the wrinkle item 1340 is selected, it is illustrated that a graph created based on the evaluation score of wrinkles is output.

If the number of items displayed on the graph is large, the visibility of the graph deteriorates. Accordingly, the control unit can set a unit period shorter than the statistical period and, when the skin diagnosis has been performed multiple times within a unit period, select one of those skin diagnosis results as a representative and control only the selected skin diagnosis result to be shown on the graph.

For example, if the unit period is five days, the control unit can group the skin diagnosis evaluation results in units of five days based on the current date. At this time, when there are a plurality of skin diagnosis results in a specific group, the control unit can select any one of a plurality of skin diagnosis results as representative skin diagnosis results of the group.

For example, suppose the skin diagnosis results from March 1 to March 5 are grouped into one group and there are two skin diagnosis results between March 1 and March 5. In that case, either of the two results can be selected as a representative, and the selected skin diagnosis result can be displayed on the graph. At this time, the control unit may select the earliest or the latest of the plurality of skin diagnosis results based on the date on which the skin diagnosis was performed, or may select as representative the result with the highest or lowest evaluation score (for example, the comprehensive score) among the plurality of skin diagnosis results.

Accordingly, if one month (i.e., 30 days) is set as the statistical period, one skin diagnosis result per five-day unit period, and thus up to six skin diagnosis results, may be displayed on the graph.

Instead of selecting any one of a plurality of skin diagnosis results during a unit period, the control unit may display an average evaluation score obtained by averaging a plurality of skin diagnosis results during a unit period on a graph. For example, if there are two skin diagnosis results between March 1 and March 5, the control unit may control the average evaluation score obtained by averaging the evaluation scores of the two skin diagnosis results to be displayed on the graph.
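A minimal sketch of this grouping, assuming a list of (date, score) records and the reduction policies described above (most recent, best score, or average), could look like the following; the same reduction is reusable for the photo statistics screen discussed later. Field names and policy labels are assumptions.

```python
from datetime import date

def bucket_index(diagnosed_on: date, today: date, unit_days: int) -> int:
    """Bucket 0 covers the most recent unit period ending today."""
    return (today - diagnosed_on).days // unit_days

def reduce_results(results, today, unit_days=5, mode="latest"):
    """results: list of (date, score) tuples. Returns one point per unit period."""
    buckets = {}
    for diagnosed_on, score in results:
        key = bucket_index(diagnosed_on, today, unit_days)
        buckets.setdefault(key, []).append((diagnosed_on, score))
    reduced = []
    for _, group in sorted(buckets.items(), reverse=True):  # oldest unit period first
        if mode == "latest":
            reduced.append(max(group, key=lambda r: r[0]))  # most recent diagnosis in the group
        elif mode == "best":
            reduced.append(max(group, key=lambda r: r[1]))  # highest score in the group
        else:  # "average"
            avg = sum(score for _, score in group) / len(group)
            reduced.append((max(d for d, _ in group), avg))
    return reduced

results = [(date(2016, 3, 1), 70), (date(2016, 3, 3), 76), (date(2016, 2, 15), 68)]
print(reduce_results(results, today=date(2016, 3, 5), mode="latest"))
print(reduce_results(results, today=date(2016, 3, 5), mode="average"))
```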

When a specific item on the graph is selected, the control unit can control the details of the skin diagnosis result corresponding to the selected item to be output.

For example, FIG. 14 is a diagram showing an example in which a detailed skin diagnosis result is output.

In the example shown in FIG. 14(a), skin analysis results of February 5, February 10, February 15, February 27, March 1, and March 3 are displayed along the x-axis of the graph. At this time, if the item corresponding to February 15 is touched, the control unit can control the detailed result of the skin diagnosis performed on February 15 to be output, as in the example shown in FIG. 14(b).

The output of detailed skin diagnosis results has already been described with reference to FIGS. 10 and 11, and a detailed description thereof will be omitted.

When a user input requesting photo statistics is received while the statistical result is being output, the control unit can control a photo statistics result, which includes the photographs of the user used for skin diagnosis during the statistical period and the evaluation result for each photograph, to be output.

For example, FIG. 15 is a diagram showing an example in which a photo statistics result is output.

When a user input requesting photo statistics (e.g., a touch input touching the photo statistics button 1350 shown in FIG. 13(a)) is received while the statistical result is being output, the control unit can control a photo statistics screen, which includes a list of the photographs used for skin diagnosis during the statistical period and the skin diagnosis score for each photograph, to be output, as in the example shown in FIG. 15.

If the number of times a skin diagnosis is performed is large, the number of pictures to be output through the picture statistics screen will increase. However, if the skin diagnosis is performed multiple times in a short time interval, they may be meaningless because they do not properly reflect changes in skin condition.

Accordingly, the control unit sets a unit period shorter than the statistical period, and when the skin diagnosis has been performed multiple times within a unit period, the control unit selects one of those skin diagnosis results and controls only the photograph and evaluation score of the selected skin diagnosis result to be output on the photo statistics screen.

For example, if the unit period is five days, the control unit can group the skin diagnosis evaluation results in units of five days based on the current date. At this time, when there are a plurality of skin diagnosis results in a specific group, the control unit can select any one of a plurality of skin diagnosis results as representative skin diagnosis results of the group.

For example, suppose the skin diagnosis results from March 1 to March 5 are grouped into one group and there are two skin diagnosis results between March 1 and March 5. In that case, either of the two results can be selected as a representative, and the selected skin diagnosis result can be displayed on the photo statistics screen. At this time, the control unit may select the earliest or the latest of the plurality of skin diagnosis results based on the date on which the skin diagnosis was performed, or may select as representative the result with the highest or lowest evaluation score among the plurality of skin diagnosis results.

Accordingly, when the statistical period is one month (i.e., 30 days), the photo statistics screen may include one photograph per five-day unit period, up to a maximum of six photographs.

In the above-described embodiments, the unit period is set to 5 days, but the present invention is not limited thereto. The unit period can be set to various periods such as one day or one week, and may be set to a time unit such as morning or afternoon.

For example, if the unit period is set to one day and the user performs the skin diagnosis twice, once in the morning and once in the afternoon, the control unit can create the graph or the photo statistics screen based on either the morning skin diagnosis result or the afternoon skin diagnosis result.

In addition, the unit period at the time of graph creation and the unit period at the time of constructing the photo statistic screen may be set to different values.

If any one of the photographs included in the photo statistics screen is selected, the control unit can control the detailed contents of the skin diagnosis result corresponding to the selected photo to be output.

For example, FIG. 16 is a diagram showing an example in which a detailed skin diagnosis result is output.

In the example shown in FIG. 16(a), photographs dated March 3, March 1, February 27, February 15, February 10, February 5, February 1, January 27, January 15, and January 10 are displayed on the photo statistics screen. If the photograph corresponding to March 3 is touched, the control unit can control the detailed result of the skin diagnosis performed on March 3 to be output, as in the example shown in FIG. 16(b).

The output of detailed skin diagnosis results has already been described with reference to FIGS. 10 and 11, and a detailed description thereof will be omitted.

If more than one picture is selected from the picture statistics screen (S1220), the controller may control to output a comparison screen comparing the skin diagnosis results of the selected pictures (S1230).

For example, FIG. 17 shows an example in which a comparison screen is output.

When a user input requesting comparison of at least two skin diagnosis results (for example, a touch input touching the comparison button shown in FIG. 15) is received while the photo statistics screen is being output, the control unit can switch to a standby state in which it waits for two or more photographs to be selected. In the standby state, a message 1710 guiding the user to select two or more photographs may be output.

When a user input for selecting two or more pictures on the picture statistics screen is received, the control unit can control to output a comparison screen for comparing the skin diagnosis results of the selected pictures.

For example, FIG. 17 (b) shows an example in which a comparison screen is output.

Assuming that the display unit is divided by a virtual line into a first area and a second area, the skin analysis result corresponding to one of the two photographs selected by the user can be output through the first area, and the skin analysis result corresponding to the other of the two photographs can be output through the second area.

In FIG. 17(b), it is exemplified that the skin diagnosis result of March 3 and the photograph used for the skin diagnosis on March 3 are output through the first region, and that the skin diagnosis result of February 15 and the photograph used for the skin diagnosis on February 15 are output through the second region.

The control unit may also output a message 1720 comparing the skin diagnosis results of the first area and the second area. For example, FIG. 17(b) illustrates a message indicating that the skin diagnosis result of March 3 is poorer than that of February 15.

In displaying the skin diagnosis results, the control unit can highlight, for each analysis item, the result with the better score among the compared skin diagnosis results. For example, referring to FIG. 17(b), since the score for the pore item on February 15 is better than that on March 3, the pore item of February 15 may be highlighted. Here, the highlighting may be implemented by changing the font (e.g., typeface, font size, or a style such as bold) or by changing the color.
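As an illustrative sketch only (the message wording and the data layout are assumptions, not the patent's implementation), the per-item highlighting and the summary message can be derived by comparing the two selected diagnoses item by item and by their comprehensive scores:

```python
def compare_diagnoses(first: dict, second: dict) -> dict:
    """Return {item: 'first' | 'second' | 'tie'} indicating which result to highlight."""
    highlight = {}
    for item in first:
        if first[item] > second[item]:
            highlight[item] = "first"
        elif first[item] < second[item]:
            highlight[item] = "second"
        else:
            highlight[item] = "tie"
    return highlight

# Hypothetical scores for the March 3 and February 15 diagnoses used in the example.
march_3 = {"pores": 60, "troubles": 80, "dullness": 70, "skin_tone": 75, "wrinkles": 72}
feb_15 = {"pores": 78, "troubles": 74, "dullness": 70, "skin_tone": 71, "wrinkles": 69}
print(compare_diagnoses(march_3, feb_15))  # pores -> 'second', i.e. highlight February 15

march_total = sum(march_3.values()) / len(march_3)
feb_total = sum(feb_15.values()) / len(feb_15)
if march_total < feb_total:
    print("Your skin condition is poorer than in the earlier diagnosis.")
else:
    print("Your skin condition is the same as or better than in the earlier diagnosis.")
```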

If any of the evaluation items of the skin diagnosis results is selected, the control unit can control the sample area for comparing the selected evaluation items in both photographs to be identified and displayed.

For example, if the pore item is selected, the control unit may control the cheek area 1730 to be identified and displayed so that the user can compare the size of the pores, as in the example shown in FIG. 17(b).

For a more detailed comparison, the control unit may also magnify the sample area 1730.

In the above example, it is exemplified that a comparison screen for comparing two or more skin diagnosis results can be outputted only when the user selects two or more pictures. Unlike the illustrated example, the control unit may output a comparison screen of two or more skin diagnosis results even when one picture is selected.

For example, when a user input for selecting any one of the photo statistics screens is received, the control unit may output a skin diagnosis result of a user-selected photo and a comparison screen comparing the most recent skin diagnosis results. In another example, the control unit may output a skin diagnosis result of the user's selected photo and a comparison screen comparing the skin diagnosis result with the best diagnosis result.
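A small sketch of this automatic pairing, assuming a stored history of results with a date and a comprehensive score (the field names and policy labels are illustrative assumptions):

```python
from datetime import date

history = [
    {"date": date(2016, 2, 15), "total": 74},
    {"date": date(2016, 3, 1),  "total": 80},
    {"date": date(2016, 3, 3),  "total": 72},
]

def pick_counterpart(selected, results, policy="most_recent"):
    """Choose the second diagnosis to show next to the user-selected one."""
    candidates = [r for r in results if r is not selected]
    if policy == "most_recent":
        return max(candidates, key=lambda r: r["date"])
    if policy == "best_score":
        return max(candidates, key=lambda r: r["total"])
    raise ValueError(f"unknown policy: {policy}")

selected = history[0]                                      # the user taps the Feb. 15 photo
print(pick_counterpart(selected, history))                 # March 3 result (most recent)
print(pick_counterpart(selected, history, "best_score"))   # March 1 result (highest score)
```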

According to an embodiment of the present invention, the above-described methods (operation flowcharts) can be implemented as a program, such as a computer program or an application, or can be implemented as processor-readable code on a medium on which the program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

The configurations and methods of the above-described embodiments are not applied to the terminal device 10 described above in a limited manner; all or some of the embodiments may be selectively combined so that various modifications can be made.

510: communication unit
520: display unit
530: camera
540: light source unit
550: memory
560: sensing unit
570: control unit

Claims (24)

Photographing a user's photograph;
Analyzing the skin condition of the user based on the photographed photographs; And
Storing the photographed photograph and the skin diagnosis result obtained by analyzing the skin condition of the user
wherein, when at least one skin diagnosis result among a plurality of skin diagnosis results accumulated during a predetermined period is selected, a comparison screen for comparing a first skin diagnosis result and a second skin diagnosis result is output, and
wherein, when any one of evaluation items displayed on the comparison screen is selected, a sample area in the photograph corresponding to the selected evaluation item is identified and displayed.
The method according to claim 1,
Wherein the first area on the comparison screen includes a first evaluation value of the first skin diagnosis result and a first photograph used to calculate the first evaluation value,
Wherein the second area on the comparison screen includes a second evaluation value of the second skin diagnosis result and a second photograph used to calculate the second evaluation value.
3. The method of claim 2,
Wherein the first evaluation value and the second evaluation value include at least one of an evaluation value for the at least one evaluation item and an average value obtained by averaging evaluation values of the plurality of evaluation items .
delete
The method according to claim 1,
Wherein the evaluation item includes at least one of pore, skin tone, wrinkles, dullness, and trouble.
The method according to claim 1,
Wherein, when the first skin diagnosis result among the plurality of skin diagnosis results is selected by user input, the second skin diagnosis result is automatically selected in consideration of at least one of a date on which the skin diagnosis was performed and an evaluation value of the skin diagnosis.
The method according to claim 1,
The skin diagnosis result comparison method includes:
Further comprising the step of outputting a statistical screen including a plurality of skin diagnosis results accumulated for a predetermined period of time.
8. The method of claim 7,
Wherein the first skin diagnosis result or the second skin diagnosis result is selected by user input on the statistical screen.
8. The method of claim 7,
Wherein the statistical screen includes a graph plotting evaluation values of each of the plurality of skin diagnosis results.
8. The method of claim 7,
Wherein the statistical screen includes a list of pictures used for analyzing each of the plurality of skin diagnosis results.
8. The method of claim 7,
Characterized in that, when a plurality of skin diagnosis results are included in a unit period shorter than the predetermined period, the statistical screen includes, as a representative, any one of the plurality of skin diagnosis results for the unit period.
12. The method of claim 11,
Wherein one of the plurality of skin diagnosis results during the unit period is selected as a representative based on at least one of a date of performing the skin analysis and an evaluation value.
A display unit;
A wireless communication unit for communicating with a server;
A camera for photographing; And
A controller for analyzing the skin condition of the user based on the photographed photograph and controlling the photographed photograph and the skin condition of the user to be stored in the server when the photograph of the user is photographed through the camera,
wherein the control unit controls a comparison screen for comparing a first skin diagnosis result and a second skin diagnosis result to be output when at least one skin diagnosis result among a plurality of skin diagnosis results accumulated during a predetermined period is selected, and controls a sample area in the photograph corresponding to a selected evaluation item to be identified and displayed when any one of evaluation items displayed on the comparison screen is selected.
14. The terminal device of claim 13,
Wherein the control unit selects the first skin diagnosis result or the second skin diagnosis result by user input for the plurality of skin diagnosis results.
14. The terminal device of claim 13,
The control unit controls to output, through the first region on the comparison screen, the first evaluation value of the first skin diagnosis result and the first photo used for calculating the first evaluation value,
And a second evaluation value of the second skin diagnosis result and a second photograph used for calculating the second evaluation value are outputted through the second region on the comparison screen.
16. The terminal device of claim 15,
Wherein the first evaluation value and the second evaluation value include at least one of an evaluation value for the at least one evaluation item and an average value obtained by averaging evaluation values of the plurality of evaluation items.
delete
14. The terminal device of claim 13,
Wherein the evaluation item includes at least one of pore, skin tone, wrinkles, dullness, and trouble.
The terminal device according to claim 13 or 14,
Wherein, when the first skin diagnosis result among the plurality of skin diagnosis results is selected by user input, the second skin diagnosis result is automatically selected in consideration of at least one of a date on which the skin diagnosis was performed and an evaluation value of the skin diagnosis.
14. The terminal device of claim 13,
Wherein the control unit controls the display unit to output a statistical screen including a plurality of skin diagnosis results accumulated for a predetermined period of time.
21. The terminal device of claim 20,
Wherein the statistical screen includes a graph plotting evaluation values of each of the plurality of skin diagnosis results.
21. The terminal device of claim 20,
Wherein the statistical screen includes a list of photographs used for analyzing each of the plurality of skin diagnosis results.
21. The terminal device of claim 20,
Wherein, when a plurality of skin diagnosis results are included in a unit period shorter than the predetermined period of time, the control unit controls so that any one of the plurality of skin diagnosis results for the unit period is included in the statistical screen as a representative.
24. The terminal device of claim 23,
Wherein the control unit selects one of a plurality of skin diagnosis results for the unit period as a representative based on at least one of a date and an evaluation value of skin analysis.
KR1020150109116A 2015-07-31 2015-07-31 Terminal device and method for comparing skin dignosis results using thereof KR101753633B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150109116A KR101753633B1 (en) 2015-07-31 2015-07-31 Terminal device and method for comparing skin dignosis results using thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150109116A KR101753633B1 (en) 2015-07-31 2015-07-31 Terminal device and method for comparing skin dignosis results using thereof

Publications (2)

Publication Number Publication Date
KR20170014992A KR20170014992A (en) 2017-02-08
KR101753633B1 true KR101753633B1 (en) 2017-07-04

Family

ID=58155883

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150109116A KR101753633B1 (en) 2015-07-31 2015-07-31 Terminal device and method for comparing skin dignosis results using thereof

Country Status (1)

Country Link
KR (1) KR101753633B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200006782A (en) 2018-07-11 2020-01-21 주식회사 컬러스 Manufacturing customized cosmetics based on user's skin analysis information and customized cosmetics manufacturing system therefor
KR20200006781A (en) 2018-07-11 2020-01-21 주식회사 컬러스 Providing customized cosmetics recipe based on user's skin analysis information and system for providing customized cosmetics recipe

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013122233A1 (en) * 2012-02-15 2013-08-22 日立マクセル株式会社 System for management of skin condition measurement analysis information, and method for management of skin condition measurement analysis information

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013122233A1 (en) * 2012-02-15 2013-08-22 日立マクセル株式会社 System for management of skin condition measurement analysis information, and method for management of skin condition measurement analysis information

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200006782A (en) 2018-07-11 2020-01-21 주식회사 컬러스 Manufacturing customized cosmetics based on user's skin analysis information and customized cosmetics manufacturing system therefor
KR20200006781A (en) 2018-07-11 2020-01-21 주식회사 컬러스 Providing customized cosmetics recipe based on user's skin analysis information and system for providing customized cosmetics recipe

Also Published As

Publication number Publication date
KR20170014992A (en) 2017-02-08

Similar Documents

Publication Publication Date Title
KR101668348B1 (en) Method for analyzing skin surface and apparatus therefor
CN111788623B (en) Intelligent mirror system and using method thereof
KR101661588B1 (en) Method for analyzing skin surface and apparatus therefor
US10614921B2 (en) Personalized skin diagnosis and skincare
US11819108B2 (en) Smart mirror system and methods of use thereof
US9195816B2 (en) Intelligent graphics interface in a handheld wireless device
JP2010004118A (en) Digital photograph frame, information processing system, control method, program, and information storage medium
CN102577367A (en) Time shifted video communications
JP2004357103A (en) Mirror device with display device
KR20180080140A (en) Personalized skin diagnosis and skincare
CN208013970U (en) A kind of living creature characteristic recognition system
KR101753633B1 (en) Terminal device and method for comparing skin dignosis results using thereof
KR20160084752A (en) Dressing table
KR101701210B1 (en) Method for outputting an skin analysing result, apparatus and application therefor
KR101684272B1 (en) Terminal device and controlling method thereof
US8831300B2 (en) Time-lapsing data methods and systems
KR101648049B1 (en) Dressing table and controlling method thereof
EP3641319A1 (en) Displaying content on a display unit

Legal Events

Date Code Title Description
A201 Request for examination
E701 Decision to grant or registration of patent right
GRNT Written decision to grant