CN112165895A - Unmanned analysis device for skin of face and back of hand - Google Patents


Info

Publication number
CN112165895A
Authority
CN
China
Prior art keywords
skin
module
user
face
hand
Prior art date
Legal status
Pending
Application number
CN201980035090.9A
Other languages
Chinese (zh)
Inventor
朴玧奎
Current Assignee
Ykc Tech Co
Original Assignee
Ykc Tech Co
Priority date
Filing date
Publication date
Application filed by Ykc Tech Co filed Critical Ykc Tech Co
Publication of CN112165895A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/44: Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441: Skin evaluation, e.g. for skin disorder diagnosis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059: Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Dermatology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

According to an embodiment, there is provided an unmanned analysis device for the skin of the face and the back of the hand, including: a self-service terminal comprising a touch screen; an opening located in the self-service terminal, into which the user inserts a hand; a first skin measurement module located at the upper end of the self-service terminal and comprising a first illumination module for illuminating the face of the user and a first camera module for photographing the face of the user; a second skin measurement module located at the opening and comprising a second illumination module for illuminating the hand of the user and a second camera module for photographing the hand of the user; and a control module for calculating a skin diagnosis result of the user based on at least one of a first image captured by the first skin measurement module and a second image captured by the second skin measurement module.

Description

Unmanned analysis device for skin of face and back of hand
Technical Field
The present invention relates to a skin diagnosis device and, more particularly, to an unmanned cosmetic analysis device for the skin of the face and the back of the hand that uses a self-service terminal.
Background
Interest in beauty care has long been high in countries around the world, and interest is growing in skin management methods that diagnose skin conditions and provide the diagnosis results.
Accordingly, existing facial skin diagnosis devices are used for beauty purposes, that is, for cosmetic counseling, and have driven a global trend. However, conventional facial skin diagnosis devices are inconvenient in that, when a user undergoes skin diagnosis, the assistance of an expert or counselor is required to operate the device, and the expert or counselor must interpret the analysis result and provide one-on-one (1:1) counseling to the user.
Disclosure of Invention
Technical problem
An object of the present invention is to realize an unmanned cosmetic analysis device for the skin of the face and the back of the hand that can measure the skin condition of the face and hands and perform skin diagnosis by automatically photographing them with a self-service terminal. More specifically, while waiting in a crowded place, a user can quickly diagnose his or her skin condition during the waiting time and either receive the measured diagnosis result directly as a printout from the unmanned analysis device for the skin of the face and the back of the hand or have it transmitted to the user's mobile device.
Means for solving the problems
The unmanned analysis device for the skin of the face and the back of the hand of the embodiment includes: a self-service terminal comprising a touch screen; an opening located in the self-service terminal, into which the user inserts a hand; a first skin measurement module located at the upper end of the self-service terminal and comprising a first illumination module for illuminating the face of the user and a first camera module for photographing the face of the user; a second skin measurement module located at the opening and comprising a second illumination module for illuminating the hand of the user and a second camera module for photographing the hand of the user; a control module for calculating a skin diagnosis result of the user based on at least one of a first image captured by the first skin measurement module and a second image captured by the second skin measurement module; a storage module for storing at least one of the first image and the second image, the operation result of the control module, and the skin diagnosis result; a printing module located in the self-service terminal for printing the operation result of the control module and the skin diagnosis result; and a communication module for transmitting the skin diagnosis result to at least one of a mobile terminal, an e-mail, a Short Message Service (SMS), and a Social Network Service (SNS) through a wired or wireless network, wherein the touch screen starts the unmanned analysis device for the skin of the face and the back of the hand through a touch input and displays the skin diagnosis result.
The skin diagnosis apparatus may further include a third skin measurement module, wherein the third skin measurement module includes a third illumination module and a third camera module, the third illumination module is located at an upper end or a lower end of the touch screen and illuminates the neck of the user, the third camera module is located at an upper end or a lower end of the touch screen and photographs the neck of the user, the control module measures a neck wrinkle index of the user based on a third image photographed by the third skin measurement module and measures the skin age of the user based on the neck wrinkle index, and the skin diagnosis result may include the measured skin age.
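The patent does not disclose how the neck wrinkle index is converted into a skin age. As a purely illustrative sketch, a hypothetical linear mapping (the function name, coefficient, and value ranges are all assumptions) could look like:

```python
def estimate_skin_age(neck_wrinkle_index: float, base_age: float = 20.0) -> float:
    """Map a neck wrinkle index in [0, 1] to an estimated skin age.

    The patent does not specify the actual formula; this hypothetical
    mapping assumes skin age grows linearly from base_age up to
    base_age + 50 as the wrinkle index goes from 0 to 1.
    """
    if not 0.0 <= neck_wrinkle_index <= 1.0:
        raise ValueError("neck_wrinkle_index must be in [0, 1]")
    return base_age + 50.0 * neck_wrinkle_index
```

In practice such a mapping would be calibrated against reference measurements rather than fixed constants.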
The first and second illumination modules may each be formed of at least one of a light emitting diode (LED) module, a normal-light module, a polarized-light module, and an ultraviolet-light module.
The control module may measure at least one skin index related to a skin condition of the user and analyze the measured skin index to derive a skin diagnosis result.
The at least one skin index includes pores, elasticity, oil and moisture content, and oil-moisture balance rate, and the skin diagnosis result may include at least one of overall skin health, items requiring focused management, skin age, skin brightness, and recommended cosmetic information.
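As an illustration only, a rule-based derivation of such a diagnosis from the measured indices could be sketched as follows; the index names, thresholds, and weights are assumptions, not the patented algorithm:

```python
from dataclasses import dataclass

@dataclass
class SkinIndices:
    pore: float        # 0 (none visible) .. 1 (very pronounced)
    elasticity: float  # 0 (low) .. 1 (high)
    oil: float         # relative oil content, 0..1
    moisture: float    # relative moisture content, 0..1

def diagnose(idx: SkinIndices) -> dict:
    """Derive a simple diagnosis from measured indices (illustrative)."""
    # Oil-moisture balance rate: 1.0 when oil and moisture are equal.
    balance = 1.0 - abs(idx.oil - idx.moisture)
    # Overall health: average of the favorable readings (assumed weighting).
    overall = (idx.elasticity + idx.moisture + balance + (1.0 - idx.pore)) / 4.0
    focus = []  # items requiring focused management
    if idx.pore > 0.6:
        focus.append("pores")
    if idx.elasticity < 0.4:
        focus.append("elasticity")
    if balance < 0.5:
        focus.append("oil/moisture balance")
    return {"overall_health": round(overall, 2), "focus_items": focus}
```

A real implementation would derive the indices from the captured images and use clinically validated thresholds.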
The skin diagnosis result may further include a unique user name and password assigned to the user; the user name and password are received through touch input on the touch screen before the skin diagnosis of that user begins, and the control module may compare at least one existing skin diagnosis result corresponding to the user name and password with the newly acquired skin diagnosis result of the user.
The control module may collect at least one existing skin diagnosis result corresponding to the user name and password from at least one other unmanned analysis device for the skin of the face and the back of the hand synchronized through the communication module, and compare the newly obtained skin diagnosis result of the user with the collected results.
The device may further comprise a sample providing module for cosmetic samples, wherein the sample providing module holds at least one cosmetic included in the recommended cosmetic information and dispenses samples from the self-service terminal according to the recommended cosmetic information.
Advantageous Effects of Invention
The invention has the following effects: a diagnosis result can be received from the unmanned cosmetic analysis device for the skin of the face and the back of the hand not only in everyday beauty-industry settings but also in places where many people gather and wait, such as airport duty-free shops, university hospitals, pharmacies, optical shops, and subways. Further, the device can be widely installed in such places to establish a skin diagnosis infrastructure, providing skin diagnosis results and recommending cosmetics suited to those results.
Drawings
Fig. 1 is a diagram illustrating an unmanned analysis device for skin of a face and a back of a hand according to an embodiment.
Fig. 2 is a diagram illustrating an unmanned analysis device for skin on the face and back of the hand and a user according to an embodiment.
Fig. 3 is a diagram illustrating a configuration of an unmanned analysis device for skin of a face and a back of a hand according to an embodiment.
Fig. 4 is a diagram illustrating the exterior of an unmanned analysis device for the skin of the face and the back of the hand including a third skin measurement module according to another embodiment.
Fig. 5 is a diagram illustrating the structure of an unmanned analysis device for the skin of the face and the back of the hand including a third skin measurement module according to another embodiment.
Detailed Description
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. However, it should be understood that the present invention is not limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms such as "first," "second," and the like may be used to describe various structural elements, but the structural elements should not be limited by the terms. The term is used only to distinguish one structural element from another.
The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Unless a different meaning is explicitly stated in context, an expression in the singular includes an expression in the plural. In the present application, the terms "comprises" or "comprising" are intended to specify the presence of stated features, integers, steps, actions, structural elements, components, or groups thereof, but are not intended to preclude the presence or addition of one or more other features, integers, steps, actions, structural elements, components, or groups thereof.
Hereinafter, various embodiments will be described in detail with reference to the accompanying drawings, in which the same or corresponding constituent elements are denoted by the same reference numerals, and repeated description thereof will be omitted.
In the present specification, the front face of the unmanned analysis device for the skin of the face and the back of the hand 1 may be the portion where the touch screen 150, the first camera module 131, and the opening 120 are located, and the rear face may be the face opposite the front face. A side face of the device may be a face between the front face and the rear face, and the face in contact with the ground may be the bottom face.
The unmanned analysis device 1 for skin of face and back of hand of an embodiment may be a device that allows the user U to photograph the face and perform skin diagnosis by himself without the help of an expert.
In the present specification, the skin conditions of the face and the hands of the user U are measured and diagnosed by the unmanned analysis device for the skin of the face and the back of the hand 1, but the body parts whose skin condition can be measured and diagnosed by the device are not limited thereto.
Fig. 1 is a diagram illustrating an embodiment of an unmanned analysis device for skin of a face and a back of a hand 1, and fig. 2 is a diagram illustrating an embodiment of an unmanned analysis device for skin of a face and a back of a hand 1 and a user U.
Referring to fig. 1 and 2, the unmanned analysis device for skin on the face and back of the hand 1 according to the embodiment has a housing in the form of a self-service terminal 110, and may include: a first skin measurement module 130 including a touch screen 150, an opening 120, a first illumination module 133, and a first camera module 131; a second skin measurement module 140 including a second illumination module 143 and a second camera module 141; and a printing module 330.
The touch screen 150 may be included at the upper end of the front face of the self-service terminal 110 and may serve as the device through which the unmanned analysis device for the skin of the face and the back of the hand 1 is operated and on which the progress of the skin diagnosis is displayed. Depending on the touch technology, the touch screen 150 may use one of the resistive, capacitive, infrared, and ultrasonic types, such as are used in self-service terminals, navigation devices, mobile phones, automatic teller machines (ATMs), and game devices. The touch screen 150 may act as at least one of an input device and an output device for operating the unmanned analysis device for the skin of the face and the back of the hand 1: as an input device it receives touches on the screen, and as an output device it displays video on the screen. However, the embodiment does not limit the input and output devices to a touch screen 150 combining both functions; the input and output devices may be separated, with a mouse or keyboard as the input device and a display as the output device. In this case, the input device may also be any other device that can input a signal to operate the unmanned analysis device for the skin of the face and the back of the hand 1. Likewise, the display panel used for the output device is not limited to one type and may be one of a twisted nematic (TN), multi-domain vertical alignment (VA), in-plane switching (IPS), or organic light emitting diode (OLED) panel, with a size that fits in the unmanned analysis device for the skin of the face and the back of the hand 1.
The opening 120 may be the portion into which the user U inserts a hand so that the hand can be photographed and its skin condition diagnosed. As shown in fig. 1, the unmanned analysis device for the skin of the face and the back of the hand 1 may include two openings 120 in the middle of the front face of the self-service terminal 110. In this case, the second skin measurement modules 140 located inside the openings 120 on both sides may be identical in structure and number. However, the number of openings 120 is not limited to the example of fig. 1, and the device may have one or more openings 120. For example, when one opening 120 is provided, the user U may photograph the left and right hands alternately in the single opening 120, or may photograph both hands simultaneously in the single opening 120. Also, the position of the opening 120 is not limited to the middle of the front face of the self-service terminal 110 as shown in fig. 1, but may be any position on the self-service terminal 110 where the user U can insert his or her hand.
The first skin measurement module 130 may include at least one of a first illumination module 133 for illuminating light to the face of the user U and a first camera module 131 for photographing the face of the user U. In this specification, it is exemplified that the first skin measurement module 130 mainly measures the skin state of the face of the user U, but the skin state of a body part including at least one of the face, neck, ears, and back neck of the user U may be measured by the first skin measurement module 130.
The first camera module 131 may be located at the top of the front face of the self-service terminal 110 and may photograph a body part including at least one of the face, neck, ears, and nape of the user U. The camera included in the first camera module 131 may be one of a camera, a camcorder, and a digital single lens reflex (DSLR) camera commonly used in the industry, but is not limited to these examples and may be replaced by any device capable of capturing images.
The first illumination module 133 may be located around the camera included in the first camera module 131 in a bilaterally symmetrical manner, and may irradiate light to a body part including at least one of the face, the neck, the ears, and the nape of the user U. The first illumination module 133 may be a light emitting diode module, and the first illumination module 133 may sequentially irradiate at least one of normal light, polarized light, and ultraviolet light.
At least one of the first camera module 131 and the first illumination module 133 may rotate and focus toward a body part including at least one of the face, the neck, the ears, and the nape of the user U, so that the first skin measurement module 130 can acquire a first image of that body part.
The second skin measurement module 140 may include at least one of a second illumination module 143 for illuminating light to the hand of the user U and a second camera module 141 for photographing the hand of the user U. However, although it is exemplified that the second skin measurement module 140 mainly measures the skin state of the back of the hand of the user U, the skin state of the hand including at least one of the left back of the hand, the left palm, the right back of the hand, and the right palm of the user U may be measured by the second skin measurement module 140.
The second camera module 141 may be located on the top surface of the opening 120 and may photograph at least one of the left back of the hand, the left palm, the right back of the hand, and the right palm of the user U. The camera included in the second camera module 141 may be one of a camera, a camcorder, and a digital single lens reflex (DSLR) camera commonly used in the industry, but is not limited to these examples and may be replaced by any device capable of capturing images.
The second illumination module 143 may be positioned around the camera included in the second camera module 141 in a bilaterally symmetrical manner, and may irradiate light onto at least one of the left back of the hand, the left palm, the right back of the hand, and the right palm of the user U. The second illumination module 143 may be a light emitting diode module and may sequentially irradiate at least one of normal light, polarized light, and ultraviolet light.
At least one of the second camera module 141 and the second illumination module 143 may rotate and focus toward at least one of the left back of the hand, the left palm, the right back of the hand, and the right palm of the user U, so that the second skin measurement module 140 can acquire a second image of the photographed hand.
The printing module 330 may print the skin diagnosis result obtained by the unmanned analysis device 1 for the skin of the face and the back of the hand to be provided to the user U. The printing module 330 may be located inside the unmanned analysis device for skin on the face and back of the hand 1, and the printing output unit 331 for printing the skin diagnosis result may be located on the front face of the unmanned analysis device for skin on the face and back of the hand 1. However, although the printing output portion 331 of the printing module 330 shown in fig. 1 is located at the lower portion of the opening portion 120, the position of the printing output portion 331 may be changed. For example, the print output unit 331 may be located above the opening 120, below the position shown in fig. 1, or on the side of the unmanned analysis device for skin of the face and back of the hand 1. Also, the printing module 330 may not be included in the kiosk 110, but exist as a separate device.
The self-service terminal 110 of fig. 1 and 2 is illustrated as one housing, but the self-service terminal 110 of the unmanned analysis device for the skin of the face and the back of the hand 1 may be separated into two or more housings. For example, the self-service terminal 110 may be separable at its center into an upper part and a lower part; the control module 310, the storage module 350, and the communication module 360 included in the device may be located in the upper or lower part, and the printing module 330 may be located in the lower part together with the print output portion 331. In this case, the control module 310, the storage module 350, the communication module 360, and the printing module 330 may be connected to one another through connection lines that electrically and mechanically link the modules, and the modules may remain physically separated while joined by cord-like connection lines.
Fig. 3 is a diagram illustrating a configuration of the unmanned analysis device for skin of the face and back of the hand 1 according to the embodiment.
Referring to fig. 3, the unmanned analysis device for skin of face and back of hand 1 according to an embodiment may include: a touch screen 150; a first skin measurement module 130 including a first camera module 131 and a first illumination module 133; a second skin measurement module 140 including a second camera module 141 and a second illumination module 143; a printing module 330; a control module 310; a storage module 350 and a communication module 360.
The control module 310 measures, from at least one of the first image captured by the first skin measurement module 130 and the second image captured by the second skin measurement module 140, at least one skin index related to the skin state, and may derive a skin diagnosis result by analyzing the measured skin index.
The measured skin index may include a pore index, an elasticity ratio, oil and moisture content, an oil-moisture balance rate, and skin color, and the skin diagnosis result may include the measured skin index, an overall skin health score derived by analyzing the skin index, items requiring focused management, skin age, personal skin tone, recommended cosmetic information, and a skin management method.
For example, when the measured pore index is high and the elasticity index is low, sebum-rich areas and dry areas should be managed separately; the skin diagnosis result may include a skin management method calling for ultraviolet blocking and oil-moisture balancing, and may include information on at least one cosmetic for reducing the pore index and increasing the elasticity index.
Also, depending on skin tone, the skin diagnosis result may include a skin management method advising a user U with ivory, beige, or yellow skin to apply a sunscreen with a sun protection factor (SPF) of 30 or higher daily, advising a user U with tanned skin to apply a sunscreen with an SPF of 15 or higher daily, and advising a user U with brown or black skin to apply a sunscreen with an SPF of 15 or higher when going out.
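This tone-dependent guidance could be expressed, for illustration, as a simple lookup table. The SPF values follow the text above, while the tone labels and the fallback message are assumptions:

```python
# Hypothetical mapping from measured skin tone to sun-care advice.
SUN_CARE_ADVICE = {
    "ivory":  "apply SPF 30+ sunscreen daily",
    "beige":  "apply SPF 30+ sunscreen daily",
    "yellow": "apply SPF 30+ sunscreen daily",
    "tan":    "apply SPF 15+ sunscreen daily",
    "brown":  "apply SPF 15+ sunscreen when outdoors",
    "black":  "apply SPF 15+ sunscreen when outdoors",
}

def sun_care_advice(tone: str) -> str:
    """Return the sun-care line for a tone, with a safe fallback."""
    return SUN_CARE_ADVICE.get(tone, "consult a dermatologist for sun-care advice")
```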
Further, according to the measured skin color, the skin diagnosis result may indicate whether the skin tone of the user U is a spring warm tone, a summer cool tone, an autumn warm tone, a winter cool tone, or a neutral tone. However, the skin tone types exemplified in this specification follow the classification conventionally used in personal color diagnosis, and the types may be subdivided further.
Also, the control module 310 may further measure, based on the first image, at least one of the nasolabial fold length, eyelash length, number of acne lesions, spot width, and mole width of the user U. For example, the control module 310 may measure the length of the nasolabial folds and the number of acne lesions and mark them in the first image, and may measure the width of spots and moles and mark them in the first image, thereby providing the user U with a skin diagnosis result showing the nasolabial folds, acne, spots, and moles. The control module 310 may also measure eyelash length by comparing a first image of the user U with eyes open and a first image with eyes closed, and include the eyelash length in the skin diagnosis result.
The skin diagnosis result may further include personal information such as the name, age, and mobile phone number of the user U, and may further include the date and time at which the user U performed the skin diagnosis on the unmanned analysis device for the skin of the face and the back of the hand 1, as well as weather information for that time and region. For example, personal information such as the name, age, and mobile phone number of the user U may be included in the skin diagnosis result by the user U entering it into the device. The date and time of the skin diagnosis and the weather information for the corresponding time and region may be included in the skin diagnosis result by the device acquiring this information online through the communication module 360.
In this case, a skin measurement result affected by dry or humid weather may be corrected based on the weather information included in the skin diagnosis result, and the skin diagnosis result may include a skin management method explaining how to care for the skin under weather conditions that affect it, such as dry or humid weather, strong ultraviolet radiation, and dusty weather.
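A weather-based correction like the one described could, for illustration, be sketched as a linear adjustment of the moisture reading by ambient relative humidity. The coefficient and reference humidity here are assumptions, not values from the patent:

```python
def correct_moisture(measured: float, relative_humidity: float,
                     reference_rh: float = 50.0, k: float = 0.002) -> float:
    """Correct a moisture reading (0..1) for ambient humidity.

    Skin moisture readings drift upward in humid weather and downward
    in dry weather, so this subtracts a small linear term per
    percentage point of deviation from a reference humidity. The
    coefficient k is an illustrative assumption.
    """
    corrected = measured - k * (relative_humidity - reference_rh)
    return min(1.0, max(0.0, corrected))  # clamp to the valid range
```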
The control module 310 may encourage the user U to purchase recommended cosmetics by including recommended cosmetic information in the skin diagnosis result. For example, the unmanned analysis device for the skin of the face and the back of the hand 1 may further include a sample providing module, so that the user U can receive samples of the recommended cosmetics from it. Furthermore, the user U can not only obtain a sample from the sample providing module but also pay for the recommended cosmetics on the device and enter a delivery address to receive them. Here, the device may connect to a sales site for the recommended cosmetics based on the recommended cosmetic information included in the skin diagnosis result, so that the user U can purchase the recommended cosmetics through the device.
The storage module 350 may store at least one of the first image and the second image, a skin index of the at least one of the first image and the second image, and a skin diagnosis result calculated by the control module 310. The storage module 350 may include, as a common storage medium, one or more of a Hard Disk Drive (HDD), a Read Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), and a Memory Card (Memory Card).
When storing the skin diagnosis result, the storage module 350 stores a user name (ID) and a password protecting personal information together with the skin diagnosis result of the user U, so that the user U can later retrieve the stored result on the unmanned analysis device for the skin of the face and the back of the hand 1. For example, when performing a skin diagnosis, the user U may retrieve the skin diagnosis result stored in the storage module 350 by entering his or her user name and password on the touch screen 150. The user name and password may be set by the user U in the conventional manner before the skin diagnosis begins.
When the user U performs a new skin diagnosis, the unmanned analysis device for the skin of the face and the back of the hand 1 may include in the skin diagnosis result both the new result of the user U and comparison information between the previously stored result and the new result. However, the user U can also perform a skin diagnosis without entering a user name and password.
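A minimal sketch of this stored-result lookup and comparison, assuming an in-memory store and salted SHA-256 password hashing (the patent does not specify the protection scheme, so the class and method names are assumptions):

```python
import hashlib

class DiagnosisStore:
    """Illustrative in-memory sketch of the storage module's ID/password lookup."""

    def __init__(self):
        self._users = {}    # user_id -> password hash
        self._results = {}  # user_id -> list of diagnosis dicts

    @staticmethod
    def _hash(user_id: str, password: str) -> str:
        # Hash the password with the user id as a simple salt (assumed scheme).
        return hashlib.sha256((user_id + ":" + password).encode()).hexdigest()

    def save(self, user_id: str, password: str, result: dict) -> None:
        """Register the user on first save; reject wrong passwords afterwards."""
        stored = self._users.setdefault(user_id, self._hash(user_id, password))
        if stored != self._hash(user_id, password):
            raise PermissionError("wrong password")
        self._results.setdefault(user_id, []).append(result)

    def compare_latest(self, user_id: str, password: str, new_result: dict) -> dict:
        """Return the per-index change of new_result vs. the stored latest result."""
        if self._users.get(user_id) != self._hash(user_id, password):
            raise PermissionError("wrong password")
        history = self._results.get(user_id, [])
        if not history:
            return {}
        prev = history[-1]
        return {k: round(new_result[k] - prev[k], 3)
                for k in new_result if k in prev}
```

In the device described here, the same lookup could run against results synchronized from other terminals via the communication module.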
The communication module 360 may transmit the skin diagnosis result to at least one of the mobile terminal of the user U, an e-mail, a short message service, and a social network service through a wired or wireless network. The communication method may be one commonly used in the industry, such as Ethernet, WiFi, LTE, 3G, 4G, or 5G.
The communication module 360 may connect to and synchronize with at least one other unmanned analysis device for the skin of the face and the back of the hand 1 through a wireless network. For example, for a skin diagnosis performed on a first device, the user U can retrieve the stored skin diagnosis result by entering his or her user name and password on a different device, for example a second or third unmanned analysis device for the skin of the face and the back of the hand 1. The user U can also view the skin diagnosis result in an application connected to the device.
The unmanned analysis device for skin of the face and back of the hand 1 of an embodiment may be a device that allows the user U to photograph his or her own face and perform a skin diagnosis without the help of an expert.
For example, when a "start skin diagnosis" area on the touch screen 150 is touched, the skin diagnosis may begin, and the user U may select one of three options displayed on the touch screen 150: "facial skin diagnosis", "hand skin diagnosis", and "face + hand skin diagnosis". When the user U selects "facial skin diagnosis", a prompt such as "Please look at the camera so that your face appears on the screen." may appear on the touch screen 150. The user U can then photograph the face through the first skin measurement module 130 and receive a skin diagnosis result based on the captured first image. However, the skin diagnosis method of the unmanned analysis device for skin of the face and back of the hand 1 is not limited to the above example.
The unmanned analysis device for skin of the face and back of the hand 1 can perform a skin diagnosis based on signals input through the touch screen 150 of the self-service terminal 110, and the progress of the skin diagnosis can be checked on the touch screen 150. Also, at least one of the first image captured by the first skin measurement module 130 and the second image captured by the second skin measurement module 140 may be viewed on the touch screen 150. In this case, the first image or the second image displayed on the touch screen 150 may be a still image, or may be a real-time video frame captured by the first skin measurement module 130 or the second skin measurement module 140.
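The three touch-screen options described above map naturally onto the measurement modules they activate. The dispatch table below is purely illustrative; the identifiers for the first and second skin measurement modules are made up for this sketch.

```python
# Illustrative mapping from touch-screen selections to the measurement
# modules they activate ("first" = face module 130, "second" = hand module 140).
MODES = {
    "facial skin diagnosis": ("first",),
    "hand skin diagnosis": ("second",),
    "face + hand skin diagnosis": ("first", "second"),
}

def modules_for(selection: str):
    """Return which skin-measurement modules a touch selection activates."""
    return MODES[selection]
```

A controller driven by this table would photograph only the selected body parts and compute the diagnosis from the corresponding images.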
Fig. 4 is a diagram illustrating the exterior of the unmanned analysis device for skin of the face and back of the hand 1 including a third skin measurement module 510 according to another embodiment, and fig. 5 is a diagram illustrating the structure of the unmanned analysis device for skin of the face and back of the hand 1 including the third skin measurement module 510 according to another embodiment.
Referring to fig. 4 and 5, the unmanned analysis device for skin of the face and back of the hand 1 according to another embodiment may further include a third skin measurement module 510 comprising a third illumination module 513 and a third camera module 511.
The third skin measurement module 510 may include at least one of a third illumination module 513 for irradiating light onto the neck of the user U and a third camera module 511 for photographing the neck of the user U. Although the third skin measurement module 510 is illustrated here as mainly measuring the skin state of the neck of the user U, it may also measure a body part including at least one of the face, neck, ears, and nape of the user U.
The third camera module 511 may be located above or below the touch screen 150 on the front of the self-service terminal 110, and may photograph a portion including at least one of the face, neck, ears, and nape of the user U. The third camera included in the third camera module 511 may be a camera, a camcorder, or a digital single-lens reflex (DSLR) camera commonly used in the industry, but is not limited to these and may be replaced by any device capable of capturing images.
The third illumination module 513 may be disposed in a left-right symmetrical manner around the third camera included in the third camera module 511, and may irradiate light onto a portion including at least one of the face, neck, ears, and nape of the user U. The third illumination module 513 may be a light emitting diode module, and may sequentially irradiate at least one of normal light, polarized light, and ultraviolet light.
The unmanned analysis device for skin of the face and back of the hand 1 of another embodiment measures a neck wrinkle index from the third image captured by the third skin measurement module 510, and the control module 310 derives the skin age of the user U based on the neck wrinkle index, so that the skin age of the user U can be included in the skin diagnosis result.
In this case, the skin indices of the unmanned analysis device for skin of the face and back of the hand 1 of another embodiment are not limited to the neck wrinkle index, and may also include a pore index, an elasticity ratio, an oil and moisture content, an oil and moisture balance ratio, and a skin color.
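The derivation of a skin age from the neck wrinkle index can be sketched as a nearest-average lookup against per-age-group statistics like those the device's database is described as holding. The average values below are invented purely for illustration; the patent does not disclose the actual statistics or the matching rule.

```python
# Invented per-age-group average wrinkle indices (illustration only).
AGE_GROUP_AVG_WRINKLE = {20: 10.0, 30: 18.0, 40: 27.0, 50: 38.0, 60: 50.0}

def skin_age_from_wrinkle(wrinkle_index: float) -> int:
    """Return the age group whose average wrinkle index is closest to the
    measured value — one simple reading of 'compare with the average value
    of the skin wrinkle indexes for each age group'."""
    return min(AGE_GROUP_AVG_WRINKLE,
               key=lambda age: abs(AGE_GROUP_AVG_WRINKLE[age] - wrinkle_index))

derived = skin_age_from_wrinkle(25.0)  # nearest to the 40s average of 27.0
```

The same lookup scheme would apply to wrinkle indices measured from the face and hand images, yielding a per-part skin age.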
Also, just as the storage module 350 of an embodiment stores at least one of the first and second images, the skin indices of at least one of the first and second images, and the skin diagnosis result calculated by the control module 310, the storage module 350 may likewise store at least one of the third image, the skin indices of the third image, and the skin diagnosis result calculated by the control module 310.
The unmanned analysis device for skin of the face and back of the hand 1 can compare and diagnose the skin states of the face, hands, and neck of the user U by jointly considering the first image including the face captured by the first skin measurement module 130, the second image including the hand captured by the second skin measurement module 140, and the third image including the neck captured by the third skin measurement module 510.
For example, the unmanned analysis device for skin of the face and back of the hand 1 may compare the skin wrinkle index of each portion, derived from the wrinkle state of the face, hand, and neck of the user U, with the average skin wrinkle index for each age group stored in the database of the device, to derive the skin ages of the face, hand, and neck of the user U.
In this case, by comparing the actual age of the user U with the skin ages of the face, hand, and neck, the unmanned analysis device for skin of the face and back of the hand 1 can include in the skin diagnosis result countermeasures and a skin management method for any portion whose skin age is greater than the actual age of the user U. Furthermore, regardless of the actual age, the device can compare the skin ages of the face, hand, and neck with one another and include in the skin diagnosis result countermeasures and a skin management method for at least one portion with a relatively high skin age. For example, for at least one portion requiring countermeasures, the device may recommend that the user apply products that slow wrinkles and skin aging, such as elasticity-focused functional products, high-moisturizing products, and ultraviolet blockers.
However, when the skin ages of the face, hands, and neck of the user U are all less than the actual age of the user U, no further countermeasures are recommended, and a skin management method suitable for the derived skin age may be included in the skin diagnosis result.
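The per-part recommendation logic of the two preceding paragraphs — anti-aging care for portions whose skin age exceeds the actual age, routine care otherwise — can be sketched as below. The product strings are illustrative stand-ins for the recommended cosmetic information, not product names from the patent.

```python
# Illustrative per-part recommendation rule: skin age above actual age
# triggers anti-wrinkle suggestions; otherwise routine care for that age.
def recommend(actual_age: int, skin_ages: dict) -> dict:
    suggestions = {}
    for part, skin_age in skin_ages.items():
        if skin_age > actual_age:
            suggestions[part] = ["elasticity functional product",
                                 "high-moisturizing product",
                                 "UV blocker"]
        else:
            suggestions[part] = ["routine care for skin age %d" % skin_age]
    return suggestions

# Hypothetical user: 35 years old, hand skin measured older than actual age.
plan = recommend(35, {"face": 30, "hand": 42, "neck": 35})
```

In this example only the hand, whose derived skin age exceeds the actual age, receives the anti-aging product suggestions; the face and neck fall through to routine care.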
Although only the skin wrinkle state has been illustrated as an example of comparing and diagnosing the skin states of the face, hand, and neck of the user U based on the first, second, and third images, the unmanned analysis device for skin of the face and back of the hand 1 can also compare and diagnose these skin states in terms of at least one of skin color, pore state, oil and moisture content, and pigmentation.
The embodiments of the present invention have been described above, but these are only exemplary and do not limit the scope of the present invention in any way. Those of ordinary skill in the art will appreciate that various modifications and equivalent arrangements can be made. Therefore, the true technical scope of the present invention should be defined by the following claims.

Claims (4)

1. An unmanned analysis device for skin of the face and back of the hand, comprising:
a self-service terminal comprising a touch screen;
an opening portion located in the self-service terminal, into which a hand of a user is inserted;
a first skin measurement module located at an upper end of the self-service terminal, comprising a first illumination module for illuminating the face of the user and a first camera module for photographing the face of the user;
a second illumination module disposed in a left-right symmetrical manner at the opening portion, for illuminating the back or palm of the hand of the user;
a second skin measurement module comprising a second camera module for photographing the hand of the user;
a control module for calculating a skin diagnosis result of the user based on at least one of a first image captured by the first skin measurement module and a second image captured by the second skin measurement module;
a storage module for storing at least one of the first image and the second image, an operation result of the control module, and the skin diagnosis result;
a printing module located in the self-service terminal, for printing the operation result of the control module and the skin diagnosis result; and
a communication module for transmitting the skin diagnosis result to at least one of a mobile terminal, an e-mail address, a short message service, and a social network service through a wired or wireless network,
wherein the first illumination module and the second illumination module are each light emitting diode modules comprising a normal light module, a polarized light module, and an ultraviolet light module,
wherein the unmanned analysis device further comprises a third skin measurement module comprising a third illumination module and a third camera module, the third illumination module being located at an upper end or a lower end of the touch screen for illuminating the neck of the user, and the third camera module being located at an upper end or a lower end of the touch screen for photographing the neck of the user,
wherein the control module measures at least one skin index related to a skin condition of the user, measures a neck wrinkle index of the user based on a third image captured by the third skin measurement module, measures a skin age of the user based on the neck wrinkle index, and derives the skin diagnosis result by analyzing the at least one measured skin index and the measured skin age, and
wherein the at least one skin index includes pores, elasticity, oil and moisture content, and oil and moisture balance rate, and the skin diagnosis result includes overall skin health, items requiring focused management, skin age, skin brightness, and recommended cosmetic information.
2. The unmanned analysis device for skin of the face and back of the hand according to claim 1, wherein
the skin diagnosis result further includes a unique user name and a password assigned to the user,
the user name and the password are received through a touch input on the touch screen before the skin diagnosis of the user to whom they are assigned begins, and
the control module compares at least one existing skin diagnosis result corresponding to the user name and the password with a newly acquired skin diagnosis result of the user.
3. The unmanned analysis device for skin of the face and back of the hand according to claim 2, wherein
the communication module synchronizes with at least one other unmanned analysis device for skin of the face and back of the hand through a wireless network, and
the control module collects at least one existing skin diagnosis result corresponding to the user name and the password from the at least one synchronized device, and compares the newly acquired skin diagnosis result of the user with the collected skin diagnosis results.
4. The unmanned analysis device for skin of the face and back of the hand according to claim 1, further comprising a sample providing module for providing cosmetic samples, wherein
the sample providing module holds at least one cosmetic included in the recommended cosmetic information, and a sample based on the recommended cosmetic information is dispensed from the self-service terminal.
CN201980035090.9A 2018-03-05 2019-04-26 Unmanned analysis device for skin of face and back of hand Pending CN112165895A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20180025572 2018-03-05
KR10-2018-0059629 2018-05-25
KR1020180059629A KR101927286B1 (en) 2018-03-05 2018-05-25 Unmanned skin diagnosis apparatus for face and back of hand
PCT/KR2019/005043 WO2019225870A1 (en) 2018-03-05 2019-04-26 Unmanned apparatus for analyzing skin of face and back of hand

Publications (1)

Publication Number Publication Date
CN112165895A true CN112165895A (en) 2021-01-01

Family

ID=64670840

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980035090.9A Pending CN112165895A (en) 2018-03-05 2019-04-26 Unmanned analysis device for skin of face and back of hand

Country Status (4)

Country Link
JP (1) JP2021524367A (en)
KR (1) KR101927286B1 (en)
CN (1) CN112165895A (en)
WO (1) WO2019225870A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950858A (en) * 2021-01-29 2021-06-11 泉州市威互科技有限公司 Intelligent pharmacy based on internet and system thereof

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
KR102226114B1 (en) 2019-10-17 2021-03-10 양경호 Photographing Device for Unmanned Automation Equipment
KR20210079090A (en) 2019-12-19 2021-06-29 (주)웰니스라이프연구소 Mobile Skin Photography Device and Image Analysis System
KR20220077277A (en) 2020-12-01 2022-06-09 (주)웰니스라이프연구소 Skin analysis and automatic prescription matching system
KR102517027B1 (en) * 2020-12-14 2023-04-03 주식회사 브이티피엘 Cosmetic marketing system through cosmetic sample providers operated based on SNS cosmetic trend analysis
KR102354086B1 (en) * 2021-09-01 2022-01-20 안효진 A method of selling custom cosmetics
WO2023078823A1 (en) 2021-11-02 2023-05-11 Trinamix Gmbh Measuring the human skin age through near-infrared spectroscopy
WO2024090218A1 (en) * 2022-10-26 2024-05-02 株式会社資生堂 Diagnosis system, diagnosis device, program, diagnosis method, method for diagnosing skin, and method for diagnosing stresses

Citations (5)

Publication number Priority date Publication date Assignee Title
US20060095297A1 (en) * 2004-10-29 2006-05-04 Virik Ravinder S Skin assessment kiosk
KR101509991B1 (en) * 2013-11-28 2015-04-08 고려대학교 산학협력단 Skin texture measurement method and apparatus
CN106983493A (en) * 2017-03-04 2017-07-28 武汉嫦娥医学抗衰机器人股份有限公司 A kind of skin image processing method based on three spectrum
US20170270593A1 (en) * 2016-03-21 2017-09-21 The Procter & Gamble Company Systems and Methods For Providing Customized Product Recommendations
US20170340267A1 (en) * 2016-05-24 2017-11-30 Cal-Comp Big Data, Inc. Personalized skin diagnosis and skincare

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP2004280395A (en) * 2003-03-14 2004-10-07 Fuji Electric Retail Systems Co Ltd Cosmetic vending machine
KR101633345B1 (en) * 2009-09-07 2016-06-28 (주)아모레퍼시픽 Method and Kit for assessment of the neck skin age
US8526573B2 (en) * 2009-11-25 2013-09-03 Merge Healthcare Incorporated Systems and methods for remote diagnostic imaging
KR101615434B1 (en) * 2014-06-03 2016-04-26 고려대학교 산학협력단 Evaluation, diagnostic devices and it's methods for Hand and Foot Skin Diseases



Also Published As

Publication number Publication date
KR101927286B1 (en) 2018-12-10
WO2019225870A1 (en) 2019-11-28
JP2021524367A (en) 2021-09-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination