US20170069052A1 - Systems and Methods of 3D Scanning and Robotic Application of Cosmetics to Human - Google Patents
- Publication number
- US20170069052A1 (application US14/846,000)
- Authority
- US
- United States
- Prior art keywords
- face
- cosmetics
- robot
- control device
- profile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0014—Image feed-back for automatic industrial control, e.g. robot with camera
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D33/00—Containers or accessories specially adapted for handling powdery toiletry or cosmetic substances
- A45D33/34—Powder-puffs, e.g. with installed container
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D34/00—Containers or accessories specially adapted for handling liquid toiletry or cosmetic substances, e.g. perfumes
- A45D34/04—Appliances specially adapted for applying liquid, e.g. using roller or ball
- A45D34/042—Appliances specially adapted for applying liquid, e.g. using roller or ball using a brush or the like
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D40/00—Casings or accessories specially adapted for storing or handling solid or pasty toiletry or cosmetic substances, e.g. shaving soaps or lipsticks
- A45D40/26—Appliances specially adapted for applying pasty paint, e.g. using roller, using a ball
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/0019—End effectors other than grippers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1684—Tracking a line or surface by means of sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G06T7/0055—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/507—Depth or shape recovery from shading
-
- H04N13/0271—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H04N5/23229—
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D2044/007—Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/02—Arm motion controller
- Y10S901/09—Closed loop, sensor feedback controls arm movement
Abstract
Systems and methods for applying cosmetics are provided. An incoherent light projector shines spatially varying light on the face, at least one camera captures the reflected light, and a structured light depth processor communicates with the camera and the projector to generate a depth image output. A control device communicates with the structured light depth sensor to receive the output, to receive the face profiles, and to generate motion trajectory commands, and a robot communicates with the control device to receive the commands and apply the cosmetics to the face in accordance with the face profiles. Methods for applying the cosmetics include receiving a face profile, receiving a depth sensor input representing a face, extracting face features, matching the face profile to the face features, and generating a guide or outputting a robot trajectory to apply the cosmetics.
Description
- The invention relates to systems and methods for three-dimensional (3D) scanning and robotic application of cosmetics to humans.
- Today, many women and men use decorative and care cosmetics. Cosmetics can be liquids, creams, emulsions, pressed and loose powders, and dispersions. The FDA, which regulates the use of cosmetics in the United States, says that cosmetics are substances applied to the human body “for cleansing, beautifying, promoting attractiveness, or altering the appearance without affecting the body's structure or functions.” Some examples of cosmetics include skin creams, lotions, powders, lipsticks, perfumes, fingernail and toe polish, eye and facial makeup, and hair color or spray. For additional details, see Wikipedia Cosmetics (2015), which is incorporated by reference herein.
- Decorative cosmetics used to improve the user's appearance are often referred to as makeup or make-up. Typically, the wearer of the makeup will apply it to their facial features (e.g., eyebrows, cheeks, or nose) in an elaborate, painstaking fashion. Because cosmetics must be applied safely on the face and around the eye area, they are usually applied by hand, either with a makeup tool such as a brush or sponge, or with the fingertips.
- Many cosmetics are applied by a finger, a hand, or a tool such as a sponge held by a hand. Primers, applied using a finger, are used to reduce pore size, extend wear, and permit smooth application of makeup. Lipstick is usually applied with a brush to add color and texture to the lips. Lip gloss adds shine and color. Lip balm moisturizes and protects the lips. Concealer is often applied by finger; it is thicker and more solid than foundation and can be used to cover imperfections and contour the nose, cheeks, and jaw. Foundations, again applied with a finger in moist powder, liquid, or spray form, smooth imperfections and help make-up last longer. Rouge or blush is often brushed on; its moist powder, cream, and liquid forms can color and define the cheeks. Highlights are applied using a finger and can be used to accent facial features. Bronzer can add tan color, glow, and shimmer. Mascara can darken, lengthen, thicken, and draw attention to eyelashes. Eyeliner and eyebrow pencils can enlarge the appearance of eyes and eyebrows.
- Although people refer to beauty advisers at retailers and in magazines for the latest styles, they apply cosmetics by hand daily, or at intervals will splurge and hire a makeup professional or cosmetician to advise them and provide facial and body treatments. Daily application by a professional is usually expensive.
- In a feature of the invention, a system is used to apply cosmetics to a face in accordance with face profiles, including an incoherent light projector for shining spatially varying light onto the face, at least one camera, spaced from the face and the incoherent light projector, to capture reflected light from the face, a structured light depth processor to communicate with the camera and the projector and to generate a depth image output, a control device that communicates with the structured light depth processor to receive the depth image output, to receive the face profiles, and to generate a plurality of motion trajectory commands, and a robot with an applicator, wherein the robot communicates with the control device to receive the motion trajectory commands to apply the cosmetics with the applicator to the face in accordance with the face profiles.
- In another feature of the invention, a method is performed in a control device of controlling a robot that applies cosmetics to a face, comprising: (a) receiving a face profile; (b) receiving a depth processor input representing a face; (c) extracting a plurality of features of the face; (d) extracting a robot position with respect to the face from the depth sensor input; (e) matching the face profile of step (a) to the features of step (c); (f) generating robot trajectory based on steps (d) and (e); (g) outputting the robot trajectory to the robot to apply the cosmetics; and (h) repeating steps (b) to (g).
- In still another feature of the invention, a method performed in a control device of displaying guides that show how to apply cosmetics to a face by hand, comprising the steps of: (a) receiving a face profile; (b) receiving a depth sensor input representing a face; (c) extracting a plurality of features of the face; (d) matching the face profile of step (a) to the features of step (c); (e) generating guides to apply cosmetics; and (f) outputting the guides on the face to a display.
- FIG. 1 illustrates a user and an embodiment of the system where the user has no head rest.
- FIG. 2 illustrates hardware architecture of an embodiment of the system.
- FIG. 3 illustrates a method of control for a system using a feedback loop.
- FIG. 4 illustrates a user and an embodiment of the system without showing the frame.
- FIG. 5 illustrates a user and an embodiment of the system where the user has a head rest.
- FIG. 6 illustrates a method of control for a system displaying guides to apply cosmetics.
- FIG. 7 illustrates a user interface that displays and allows selection of face profiles for application of cosmetics.
- FIG. 8 illustrates the operation of the depth sensing using cameras and incoherent light projectors.
- FIG. 9 illustrates a user interface that displays makeup guides and allows selection of cosmetic applicators.
- The following description includes the best mode of carrying out the invention. The detailed description illustrates the principles of the invention and should not be taken in a limiting sense. The scope of the invention is determined by reference to the claims. Each part (or step) is assigned its own part (or step) number throughout the specification and drawings. The method drawings illustrate a specific sequence of steps, but the steps can be performed in parallel and/or in a different sequence to achieve the same result.
- FIG. 1 illustrates an embodiment of the system that employs 3D scanning to register the location of facial features. A user's head 10 is oriented so the face is in front of an incoherent light projector 14 that bathes the user's face in spatially varying light. Some of that light reflects from the user's face to a camera 12 and a camera 16 that provide information describing the facial features in 3D. In operation, ambient light is reduced (e.g., blocked or lights turned off) to a level that prevents saturation of the camera 16.
- It is not essential to the invention what type of physical structure is used to support the cameras and incoherent light projectors. In the illustrated embodiment, a frame 18 with three panels is used. As shown, the frame 18 has a middle panel between left and right side panels. Each side panel forms an obtuse angle with the middle panel. The middle panel of the frame 18 supports cameras 16, 26 and a user interface 28. The left panel supports the cameras 12 and 32 and the projectors 14 and 30. The right panel supports another set of cameras 76 and 78 (see FIG. 8). Alternatively, the frame could be a convex-shaped structure (not shown) that supports and orients the cameras and the projectors toward the user's face.
- A robotic arm 20 communicating with a control device 42 (FIG. 2) will controllably apply the decorative or care cosmetics (for brevity I refer to both types of cosmetics as makeup in this application) to the user's face based on face profiles stored in the control device and on the information from the 3D scanning that will be described in detail below.
- FIG. 2 illustrates a hardware architecture that implements an embodiment of the system. A control device 42 includes a computer that can communicate with other hardware, computers and data storage. Hennessy and Patterson, Computer Architecture: A Quantitative Approach (2012), and Patterson and Hennessy, Computer Organization and Design: The Hardware/Software Interface (2013), which are incorporated by reference herein, describe computer hardware and software, storage systems, caching, and networks.
- The processors used in the control device 42 are not essential to the invention and could be any general-purpose processor (e.g., an Intel Xeon) or a graphics processor (GPU). Each processor can read and write data to memory and/or through a link, e.g., Fibre Channel (FC), Serial ATA (SATA), Serial Attached SCSI (SAS), Ethernet, or Wi-Fi, to a local storage device 48 or to a network accessible storage device 50. The storage devices can be a hard disk drive, a disk array, and/or a solid state disk. In an embodiment, the network storage device 50 permits the face profiles to be accessed and/or edited by a face profile editor 51 over the Internet. The control device 42 controls and communicates with a robot 20, a user interface 28, and a structured light depth sensor 41 through USB, Wi-Fi, or Ethernet. The control device 42 executes an operating system such as Linux or Windows.
- In an embodiment, a suitable part for the control device 42 and the storage device 48 is the Centaurus Rogue2 Gaming Computer manufactured by Centaurus Computer at 6450 Matlea Court, Charlotte, N.C. 28215. For additional details see www.centauruscomputers.com.
- The structured light depth sensor 41 includes a number of light projectors 36, a number of cameras 38, and a structured light depth processor 40. FIG. 8 further describes the structured light depth sensor 41. The structured light depth processor communicates with at least one incoherent light projector 36 through USB or VGA and with at least one camera 38 through the Mobile Industry Processor Interface (MIPI) specification described in MIPI Alliance Standard for Camera Serial Interface CSI-2, which is incorporated by reference herein, or through other camera interfaces such as Camera Link® or USB. The structured light depth processor 40 may execute a real-time operating system (RTOS) such as Wind River, Nucleus, μCOS or another suitable operating system.
- In an embodiment, a suitable part for the structured light depth processor 40 is the Zynq-7000 FPGA manufactured by Xilinx, Inc. at 2100 Logic Drive, San Jose, Calif. 95124.
- In an embodiment, a suitable part for the incoherent light projector 36 is the HW8G3 Pico Engine, manufactured by Imagine Optix at 10030 Green Level Church Road, Suite 802-1260, Cary, N.C. 27519. For additional details see www.imagineoptix.com and http://www.picoprojector-info.com.
- In an embodiment, a suitable part for the camera 38 is the LI-VM01CM manufactured by Leopard Imaging Inc. at 1130 Cadillac Court, Milpitas, Calif. 95035.
- In an embodiment, a suitable part for the robot 20 is the Lynxmotion model AL5D 4DOF robotic arm manufactured by Lynxmotion, a RobotShop Inc. company, at 555 VT Route 78, Suite 367, Swanton, Vt. 05488. RobotShop Inc. is in Mirabel, Quebec, Canada. For additional details see www.robotshop.com.
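Purely as an editorial illustration, not part of the specification: arms like the AL5D are commonly driven through Lynxmotion's SSC-32 servo controller over a serial link, in which case the control device could send grouped pulse-width commands. The sketch below uses pyserial; the port name, baud rate, channel assignments, and pulse widths are assumptions rather than values given in this document.

```python
# Hypothetical sketch of commanding an AL5D-style arm through an SSC-32 servo
# controller over a serial port; port, baud rate, channels, and pulse widths
# are placeholder assumptions.
import serial

def send_servo_positions(positions, move_time_ms=1000, port="/dev/ttyUSB0", baud=115200):
    """positions: dict mapping servo channel -> pulse width in microseconds (~500-2500)."""
    # Build a grouped move command, e.g. "#0P1500#1P1600T1000\r"
    cmd = "".join(f"#{ch}P{pw}" for ch, pw in positions.items()) + f"T{move_time_ms}\r"
    with serial.Serial(port, baudrate=baud, timeout=1) as link:
        link.write(cmd.encode("ascii"))

# Example: move channels 0-2 to mid-range positions over one second.
# send_servo_positions({0: 1500, 1: 1600, 2: 1400})
```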
- In an embodiment, a suitable part for the user interface 28 is a touch display such as the ME176CX manufactured by ASUS at 800 Corporate Way, Fremont, Calif. 94539. Additional details about implementation of touch screens are described in Wikipedia Touchscreen (2015), incorporated by reference herein.
- FIG. 3 illustrates a method that the control device 42 of FIG. 2 is configured to perform using a feedback loop. At step 52, the control device 42 loads at least one face profile from the local storage device 48 or the network storage device 50. At step 54, the control device 42 will get depth sensor input from the structured light depth sensor 41. At step 58, the control device 42 will extract the position of the robot 20. At step 56, in parallel or serially with respect to step 58, the control device 42 extracts the face features obtained by the structured light depth sensor 41. At step 60, the control device 42 will match the face features obtained at step 56 with the face profile obtained at step 52. The OpenCV Reference Manual—Release 2.4.10.0 at pages 426-427 (2014) describes a suitable extraction and tracking (i.e., matching) algorithm known as the Fast Retina Keypoint (FREAK) algorithm (hereinafter "the OpenCV manual"), which is incorporated by reference. At step 62, the control device 42 will generate the trajectory of the robot 20. The algorithm for step 62 can be based on a painting algorithm described in World Academy of Science, Engineering and Technology, Vol. 5, 2011-11-27, "Development of Roller-Based Interior Wall Painting Robot," or constructed from the Open Motion Planning Library as discussed in Open Motion Planning Library: A Primer, Dec. 22, 2014, which are incorporated by reference herein. At step 64, the control device 42 will output the trajectory of step 62 to the robot 20, and return (i.e., the feedback loop) to step 54 to repeat the method of FIG. 3.
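As an editorial illustration only, the following Python sketch shows the general shape of the FIG. 3 feedback loop. The feature extraction and matching use the FREAK descriptor referenced above (available in opencv-contrib as cv2.xfeatures2d); the depth-sensor, robot, and trajectory-planner interfaces (get_frame, locate_robot, plan_trajectory, send_trajectory) are hypothetical placeholders, not APIs defined by the specification.

```python
# Illustrative sketch of the FIG. 3 loop; sensor/robot/planner interfaces are
# hypothetical placeholders. FREAK requires the opencv-contrib-python package.
import cv2

detector = cv2.FastFeatureDetector_create()      # keypoint detector
freak = cv2.xfeatures2d.FREAK_create()           # FREAK descriptor extractor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract_features(gray):
    """Step 56: detect keypoints and compute FREAK descriptors on a grayscale image."""
    keypoints = detector.detect(gray, None)
    keypoints, descriptors = freak.compute(gray, keypoints)
    return keypoints, descriptors

def control_loop(face_profile, depth_sensor, robot):
    """Steps 54-64 of FIG. 3, repeated as a feedback loop."""
    while True:
        gray, depth = depth_sensor.get_frame()                        # step 54
        robot_pose = depth_sensor.locate_robot(depth)                 # step 58
        keypoints, descriptors = extract_features(gray)               # step 56
        matches = matcher.match(face_profile.descriptors, descriptors)  # step 60
        trajectory = plan_trajectory(face_profile, matches, keypoints,
                                     depth, robot_pose)               # step 62 (placeholder)
        robot.send_trajectory(trajectory)                             # step 64
```

In the head-rest embodiment of FIG. 5, the robot-localization call (step 58) could simply be skipped, as described below.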
- FIG. 4 illustrates the front view of the face while the robotic arm 20 is applying the makeup to the user's face. Using the input from the structured light depth sensor 41, the control device 42 will extract a set of feature points (e.g., feature points 66, 68, 70, 72, and 74) for positioning. The feature points are small areas of the face that are unique in terms of shape and color. For example, the end of an eyebrow (e.g., feature point 68), the corner of the nose (e.g., feature point 70), or the ends of the lips (e.g., feature points 72 and 74) are feature points that can be uniquely identified and located in a face profile. In an embodiment, the control device 42 (FIG. 2) will use feature points to align the loaded face profile to the user's face for accurate application of makeup, either at an initial position or, preferably, frame by frame. In an embodiment, additional feature points beyond those illustrated in FIG. 4 are extracted to form a 3D representation of the face.
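The specification does not prescribe how the loaded profile is aligned to the observed face. One common choice, assuming matched 3D correspondences between profile feature points and observed feature points are already available, is a rigid rotation-plus-translation fit via SVD (the Kabsch algorithm), sketched below for illustration only.

```python
# Illustrative rigid alignment of profile feature points to observed 3D points
# (Kabsch algorithm); assumes point correspondences have already been matched.
import numpy as np

def align_profile_to_face(profile_pts, observed_pts):
    """Return rotation R and translation t so that R @ profile + t ~= observed.

    profile_pts, observed_pts: (N, 3) arrays of corresponding 3D feature points.
    """
    p_mean = profile_pts.mean(axis=0)
    o_mean = observed_pts.mean(axis=0)
    P = profile_pts - p_mean
    O = observed_pts - o_mean
    H = P.T @ O                                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = o_mean - R @ p_mean
    return R, t
```

The resulting rotation and translation could then be used to map makeup regions defined in the stored profile onto the currently observed face, frame by frame.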
- FIG. 5 illustrates a system that implements a second embodiment of the invention, which has a head rest. This is the same setup as in FIG. 1 with the exception of the head rest. When using the head rest, the control device can use the algorithm of FIG. 3 without step 58. The face profile is captured at the beginning of the operation. Once the application starts, the robot completes the operation without further commands from the control device.
- FIG. 6 illustrates a method performed in the control device 42 of FIG. 2 for displaying guides to apply makeup by hand. At step 52, the control device 42 loads a face profile from the local storage device 48 or the network storage device 50. At step 54, the control device 42 then captures the face profile from the depth sensor 41. Using the input obtained at step 54, the control device 42 extracts the facial feature points. The algorithm for feature extraction and matching is the same as described for FIG. 3 and in the OpenCV manual, which is incorporated by reference herein. At step 60, the control device 42 matches the feature points obtained in step 52 with the feature points obtained in step 54. At step 61, the control device 42 generates and displays a makeup guide 122 (FIG. 9) on the face to show where to apply the makeup. When the user brings an applicator within the field of view of at least one camera, an applicator location 120 (e.g., a circle or fingertip) is displayed on the face as shown in FIG. 9. The control device 42 then continues at step 54.
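As an illustration of step 61 (not taken from the specification), a guide stored as 3D points on the face model could be projected into the camera image and drawn together with the detected applicator location. The sketch below uses OpenCV's projectPoints; the pose (rvec, tvec) and the camera matrix are assumed to come from the alignment step and a prior camera calibration.

```python
# Illustrative overlay of a makeup guide and applicator marker on the camera image;
# guide_points_3d, camera_matrix, rvec, and tvec are hypothetical inputs produced by
# the profile-alignment step and camera calibration.
import cv2
import numpy as np

def draw_guides(frame, guide_points_3d, applicator_xy, camera_matrix, rvec, tvec):
    """Project 3D guide points onto the image and draw them with the applicator location."""
    pts_2d, _ = cv2.projectPoints(guide_points_3d.astype(np.float32),
                                  rvec, tvec, camera_matrix, None)
    pts_2d = pts_2d.reshape(-1, 2).astype(np.int32)
    cv2.polylines(frame, [pts_2d], isClosed=False, color=(0, 255, 0), thickness=2)  # guide 122
    if applicator_xy is not None:
        cv2.circle(frame, tuple(int(v) for v in applicator_xy), 8, (0, 0, 255), 2)  # location 120
    return frame
```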
- FIG. 7 illustrates the user interface 28 of FIG. 2, which is a computer-implemented touch screen or GUI that displays face profiles with selectable buttons to permit the user to select which face profile to use for application of makeup. Face profile names 90 are shown. Selection buttons 92 are adjacent to the profile names. Once a user selects a profile 94, that profile name 98 and the face profile 100 are shown. The user then starts the application by clicking on the makeup button 96. When the application is complete, the done message 102 is displayed.
- FIG. 8 illustrates the operation of the depth sensing using cameras and incoherent light projectors. The high resolution depth profile of the face can be acquired by using the incoherent light projectors 14 and 76, with projected areas represented by dotted lines 106 and 107, and using the cameras 12, 16 and 78, with fields of view represented by the solid lines 104 and 105. Multiple lights from the projectors 14 and 76 can shine on the face, and multiple cameras 12, 16, and 78 acquire images of the face simultaneously. The structured light pattern from the incoherent light projector varies in space as well as in time; for example, a black and white stripe pattern may shift (e.g., left to right) across the field of view over time. The reflection from the face of the head 10 will show varying light intensity. In an embodiment, the light intensity on the face is captured by at least two cameras, and correlations between the pixels of the cameras are obtained. In another embodiment, the light intensity on the face is captured by at least one camera, and correlations between the pixels of the camera and the projector are obtained. In either embodiment, from the correlation the distance of the face from the camera can be calculated through triangulation. For additional details, see Wikipedia Structured Light 3D Scanner (2015), which is incorporated by reference herein.
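As a purely illustrative example of the triangulation mentioned above: for a rectified pair of cameras, once a face pixel has been correlated between the two images, its distance follows from the disparity, the focal length, and the camera baseline. The calibration values below are assumptions for illustration.

```python
# Illustrative stereo triangulation; the focal length (in pixels) and baseline (in
# meters) are assumed calibration values, and the disparity comes from the
# correlation step described above.
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return the distance of a correlated face point from the camera pair, in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid correspondence")
    return focal_length_px * baseline_m / disparity_px

# Example: with a 1400 px focal length and an 80 mm baseline, a 224 px disparity
# corresponds to a point about 0.5 m from the cameras.
# depth_from_disparity(224, 1400, 0.08)  -> 0.5
```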
- FIG. 9 illustrates an embodiment in which the user interface 28 (FIG. 2) displays face profiles with selectable buttons 110 to permit the user to select a makeup applicator 114. Once the user selects the applicator button 112 as shown, the interface 28 displays an applicator name 116 (e.g., #2 Blue Pen), a face profile 118, and a makeup guide 122. The user starts applying makeup following the makeup guide 122. When the applicator is within the field of view of at least one of the cameras, the applicator location 120 is shown on the face in the display. The user moves the applicator to follow the makeup guide 122 to apply the makeup. The user can then select the next makeup applicator (e.g., #4 Brush) to continue.
Claims (20)
1. A system for applying cosmetics to a face in accordance with face profiles, comprising:
a structured light depth sensor that captures a 3D representation of the face frame by frame comprising:
an incoherent light projector configured to bathe the face in spatially varying light;
a camera spaced from the face and the incoherent light projector configured to capture some of the reflected light from the face, and
a structured light depth processor configured to execute a real time operating system and communicate with the camera and the projector and to generate a depth image output;
a control device configured to communicate with the structured light depth processor to receive the depth image output, to receive a face profile, and to generate a plurality of motion trajectory commands; and
a robot configured to communicate with the control device to receive the motion trajectory commands to apply the cosmetics to the face in accordance with the face profile.
2. The system of claim 1 , wherein the control device is configured to construct a 3D model of the face from the depth image output, extract features of the face, match the features to the face profile, generate robot motion planning from the face profile, and output the motion planning to the robot to apply cosmetics on the face.
3. The system of claim 2 , wherein the control device is further configured to operate at time intervals that will permit motion of the face without misapplication of the cosmetics on the face.
4. The system of claim 2 , wherein the control device is further configured to extract the robot position at time intervals that will permit motion of the face without misapplication of the cosmetics on the face.
5. The system of claim 1 , further comprising a storage device configured to communicate with the control device to store the face profiles and a user interface configured to display face profile(s) that can be selected by the user as the face profile.
6. The system of claim 1 , further comprising a network storage device configured to permit the face profiles to be accessed and/or edited by third parties and users over the Internet.
7. The system of claim 1 , further comprising a head rest for keeping the face stationary during the time the robot is applying the cosmetics to the face.
8. The system of claim 1 , wherein each of the face profiles includes a 3D model that represents the face and a cosmetics design that assigns at least one color value to the 3D model.
9. The system of claim 1 , further comprising enclosures and lighting to control the illumination on the face from ambient light to limit saturation of the camera.
10. The system of claim 1 , wherein the robot includes an applicator comprising a nozzle, a brush, or a pen.
11. The system of claim 1 , wherein the robot is configured to change the type of applicator to meet the requirements of the face profile.
12. The system of claim 1 , further comprising displaying the cosmetics designs and selecting one of the cosmetics designs to apply the cosmetics.
13. A method performed in a control device of controlling a robot that applies cosmetics to a face, comprising the steps of:
(a) receiving a face profile;
(b) receiving a depth sensor input representing spatially varying incoherent light reflected from a face;
(c) extracting a plurality of features of the face from the depth sensor input;
(d) matching the face profile of step (a) to the features of step (c);
(e) generating robot trajectory based on step (d); and
(f) outputting the robot trajectory to the robot to apply the cosmetics.
14. The method of claim 13 , wherein each of the face profiles includes a 3D model that represents the face and a cosmetics design that assigns at least one color value to the 3D model.
15. The method of claim 13 , further comprising displaying user selectable cosmetics designs to apply the cosmetics.
16. The method of claim 13 , further comprising extracting a robot position with respect to the face from the depth sensor input.
17. The method of claim 16 , further comprising repeating steps (b) to (f).
18. A method performed in a control device of displaying guides that show how to apply cosmetics to a face by hand, comprising the steps of:
(a) receiving a face profile;
(b) receiving a depth sensor input representing incoherent light reflected from a face;
(c) extracting a plurality of features of the face;
(d) matching the face profile of step (a) to the features of step (c);
(e) generating guides to apply cosmetics; and
(f) outputting the guides on the face to a display.
19. The method of claim 18 , wherein each of the face profiles includes a 3D model that represents the face and a cosmetics design that assigns at least one color value to the 3D model.
20. The method of claim 18 , further comprising displaying user selectable cosmetics applicators.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/846,000 US9607347B1 (en) | 2015-09-04 | 2015-09-04 | Systems and methods of 3D scanning and robotic application of cosmetics to human |
US15/463,206 US9811717B2 (en) | 2015-09-04 | 2017-03-20 | Systems and methods of robotic application of cosmetics |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/846,000 US9607347B1 (en) | 2015-09-04 | 2015-09-04 | Systems and methods of 3D scanning and robotic application of cosmetics to human |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/463,206 Continuation-In-Part US9811717B2 (en) | 2015-09-04 | 2017-03-20 | Systems and methods of robotic application of cosmetics |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170069052A1 (en) | 2017-03-09 |
US9607347B1 US9607347B1 (en) | 2017-03-28 |
Family
ID=58190107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/846,000 Active US9607347B1 (en) | 2015-09-04 | 2015-09-04 | Systems and methods of 3D scanning and robotic application of cosmetics to human |
Country Status (1)
Country | Link |
---|---|
US (1) | US9607347B1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9814297B1 (en) * | 2017-04-06 | 2017-11-14 | Newtonoid Technologies, L.L.C. | Cosmetic applicator |
CN108185624A (en) * | 2018-02-09 | 2018-06-22 | 武汉技兴科技有限公司 | The method and device that a kind of human body hair style is intelligently trimmed |
CN108466265A (en) * | 2018-03-12 | 2018-08-31 | 珠海市俊凯机械科技有限公司 | Mechanical arm path planning and operational method, device and computer equipment |
CN108673526A (en) * | 2018-06-03 | 2018-10-19 | 迈博知识产权代理秦皇岛有限公司 | A kind of automatic pedicure robot |
US20190053608A1 (en) * | 2017-04-06 | 2019-02-21 | Newtonoid Technologies, L.L.C. | Cosmetic applicator |
CN109381165A (en) * | 2018-09-12 | 2019-02-26 | 维沃移动通信有限公司 | A kind of skin detecting method and mobile terminal |
CN109807881A (en) * | 2017-11-20 | 2019-05-28 | 广明光电股份有限公司 | The introduction system and method for robotic arm operation track |
CN109976148A (en) * | 2017-12-28 | 2019-07-05 | 深圳市优必选科技有限公司 | Robot motion path planning method and device, storage medium and terminal equipment |
US10404967B2 (en) * | 2016-02-26 | 2019-09-03 | Florian Willomitzer | Optical 3-D sensor for fast and dense shape capture |
JP2019536651A (en) * | 2016-11-16 | 2019-12-19 | ウィンク・ロボティクス | Machines for beauty salons |
CN110605714A (en) * | 2019-08-06 | 2019-12-24 | 华中科技大学 | Hand-eye coordination grabbing method based on human eye fixation point |
CN111152232A (en) * | 2018-11-08 | 2020-05-15 | 现代自动车株式会社 | Service robot and method for operating the same |
US10757394B1 (en) | 2015-11-09 | 2020-08-25 | Cognex Corporation | System and method for calibrating a plurality of 3D sensors with respect to a motion conveyance |
CN111571611A (en) * | 2020-05-26 | 2020-08-25 | 广州纳丽生物科技有限公司 | Facial operation robot track planning method based on facial and skin features |
US10812778B1 (en) * | 2015-11-09 | 2020-10-20 | Cognex Corporation | System and method for calibrating one or more 3D sensors mounted on a moving manipulator |
CN112402819A (en) * | 2020-11-18 | 2021-02-26 | 深圳市萨米医疗中心 | Ultrasonic treatment device based on facial recognition and use method thereof |
US11040452B2 (en) | 2018-05-29 | 2021-06-22 | Abb Schweiz Ag | Depth sensing robotic hand-eye camera using structured light |
US11116303B2 (en) * | 2016-12-06 | 2021-09-14 | Koninklijke Philips N.V. | Displaying a guidance indicator to a user |
WO2021223735A1 (en) * | 2020-05-07 | 2021-11-11 | 上海振华港机重工有限公司 | Control system and control method for automatic spraying robot |
CN114947345A (en) * | 2022-05-31 | 2022-08-30 | 南昌远彡戴创新研发有限公司 | Automatic make-up machine of 3D |
WO2023278679A1 (en) * | 2021-06-30 | 2023-01-05 | L'oreal | Cosmetic design applicator system |
US11562502B2 (en) | 2015-11-09 | 2023-01-24 | Cognex Corporation | System and method for calibrating a plurality of 3D sensors with respect to a motion conveyance |
FR3126289A1 (en) * | 2021-08-25 | 2023-03-03 | L'oreal | Addressable Electroactive Polymer Sets for Cosmetic Drawing Application |
US11793290B2 (en) | 2021-06-30 | 2023-10-24 | L'oreal | Addressable electroactive polymer arrays for cosmetic design application |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240215705A1 (en) * | 2022-12-30 | 2024-07-04 | L'oreal | Assistive stand for cosmetic applicator configured for users with limited mobility |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8162125B1 (en) * | 1996-05-29 | 2012-04-24 | Cummins-Allison Corp. | Apparatus and system for imaging currency bills and financial documents and method for using the same |
FR2810761B1 (en) | 2000-06-26 | 2003-09-05 | Oreal | COSMETIC TREATMENT PROCESS AND DEVICE, IN PARTICULAR FOR CARE, MAKE-UP OR COLORING |
AU2006279800B2 (en) | 2005-08-12 | 2012-03-29 | Tcms Transparent Beauty Llc | System and method for medical monitoring and treatment through cosmetic monitoring and treatment |
TW201212852A (en) * | 2010-09-21 | 2012-04-01 | Zong Jing Investment Inc | Facial cosmetic machine |
KR20120087256A (en) * | 2010-12-17 | 2012-08-07 | 한국전자통신연구원 | Method For Operating Makeup Robot Based On Expert Knowledge And System Thereof |
TWI463955B (en) | 2012-02-20 | 2014-12-11 | Zong Jing Investment Inc | Eye makeup device |
TWI543726B (en) | 2012-12-07 | 2016-08-01 | 宗經投資股份有限公司 | Automatic coloring system and method thereof |
TW201424624A (en) | 2012-12-21 | 2014-07-01 | Zong Jing Investment Inc | Method for moving cosmetic tool of auto-makeup apparatus |
TW201425871A (en) | 2012-12-21 | 2014-07-01 | Zong Jing Investment Inc | Distance detecting method and computer program product |
KR20150039019A (en) * | 2013-10-01 | 2015-04-09 | LG Electronics Inc. | Mobile terminal and control method thereof |
- 2015
- 2015-09-04 US US14/846,000 patent/US9607347B1/en active Active
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11562502B2 (en) | 2015-11-09 | 2023-01-24 | Cognex Corporation | System and method for calibrating a plurality of 3D sensors with respect to a motion conveyance |
US10812778B1 (en) * | 2015-11-09 | 2020-10-20 | Cognex Corporation | System and method for calibrating one or more 3D sensors mounted on a moving manipulator |
US10757394B1 (en) | 2015-11-09 | 2020-08-25 | Cognex Corporation | System and method for calibrating a plurality of 3D sensors with respect to a motion conveyance |
US10404967B2 (en) * | 2016-02-26 | 2019-09-03 | Florian Willomitzer | Optical 3-D sensor for fast and dense shape capture |
JP2019536651A (en) * | 2016-11-16 | 2019-12-19 | Wink Robotics | Machines for beauty salons |
US11116303B2 (en) * | 2016-12-06 | 2021-09-14 | Koninklijke Philips N.V. | Displaying a guidance indicator to a user |
US20190053608A1 (en) * | 2017-04-06 | 2019-02-21 | Newtonoid Technologies, L.L.C. | Cosmetic applicator |
US9814297B1 (en) * | 2017-04-06 | 2017-11-14 | Newtonoid Technologies, L.L.C. | Cosmetic applicator |
CN109807881A (en) * | 2017-11-20 | 2019-05-28 | Quanta Storage Inc. | Teaching system and method for robotic arm operation trajectory |
CN109976148A (en) * | 2017-12-28 | 2019-07-05 | Shenzhen UBTECH Technology Co., Ltd. | Robot motion path planning method and device, storage medium, and terminal equipment |
CN108185624A (en) * | 2018-02-09 | 2018-06-22 | 武汉技兴科技有限公司 | Method and device for intelligent trimming of human hair styles |
CN108466265A (en) * | 2018-03-12 | 2018-08-31 | 珠海市俊凯机械科技有限公司 | Robotic arm path planning and operation method, device, and computer equipment |
US11040452B2 (en) | 2018-05-29 | 2021-06-22 | Abb Schweiz Ag | Depth sensing robotic hand-eye camera using structured light |
CN108673526A (en) * | 2018-06-03 | 2018-10-19 | 迈博知识产权代理秦皇岛有限公司 | Automatic pedicure robot |
CN109381165A (en) * | 2018-09-12 | 2019-02-26 | Vivo Mobile Communication Co., Ltd. | Skin detection method and mobile terminal |
CN111152232A (en) * | 2018-11-08 | 2020-05-15 | Hyundai Motor Company | Service robot and method for operating the same |
CN110605714B (en) * | 2019-08-06 | 2021-08-03 | Huazhong University of Science and Technology | Hand-eye coordination grasping method based on human eye fixation point |
CN110605714A (en) * | 2019-08-06 | 2019-12-24 | Huazhong University of Science and Technology | Hand-eye coordination grasping method based on human eye fixation point |
WO2021223735A1 (en) * | 2020-05-07 | 2021-11-11 | 上海振华港机重工有限公司 | Control system and control method for automatic spraying robot |
CN111571611A (en) * | 2020-05-26 | 2020-08-25 | 广州纳丽生物科技有限公司 | Facial operation robot trajectory planning method based on facial and skin features |
CN112402819A (en) * | 2020-11-18 | 2021-02-26 | 深圳市萨米医疗中心 | Ultrasonic treatment device based on facial recognition and use method thereof |
WO2023278679A1 (en) * | 2021-06-30 | 2023-01-05 | L'oreal | Cosmetic design applicator system |
US11793290B2 (en) | 2021-06-30 | 2023-10-24 | L'oreal | Addressable electroactive polymer arrays for cosmetic design application |
FR3126289A1 (en) * | 2021-08-25 | 2023-03-03 | L'oreal | Addressable electroactive polymer arrays for cosmetic design application |
CN114947345A (en) * | 2022-05-31 | 2022-08-30 | 南昌远彡戴创新研发有限公司 | 3D automatic makeup machine |
Also Published As
Publication number | Publication date |
---|---|
US9607347B1 (en) | 2017-03-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9607347B1 (en) | Systems and methods of 3D scanning and robotic application of cosmetics to human | |
US9811717B2 (en) | Systems and methods of robotic application of cosmetics | |
US20200167983A1 (en) | Precise application of cosmetic looks from over a network environment | |
JP6985403B2 (en) | Methods for age appearance simulation | |
US9846803B2 (en) | Makeup supporting device, makeup supporting system, makeup supporting method, and non-transitory computer-readable recording medium | |
JP6435516B2 (en) | Makeup support device, makeup support method, and makeup support program | |
JP5324031B2 (en) | Beauty simulation system | |
US20120016231A1 (en) | System and method for three dimensional cosmetology imaging with structured light | |
US20140210814A1 (en) | Apparatus and method for virtual makeup | |
BR102012033722B1 (en) | system and method for makeup simulation on portable devices equipped with digital camera | |
US11222452B2 (en) | System and method of augmenting images of a user | |
US20100142755A1 (en) | Method, System, and Computer Program Product for Providing Cosmetic Application Instructions Using Arc Lines | |
CN106447739A (en) | Method for generating makeup region dynamic image and beauty makeup assisting method and device | |
CN116830073A (en) | Digital color palette | |
JP2009039523A (en) | Terminal device to be applied for makeup simulation | |
CN110738620A (en) | Intelligent makeup method, cosmetic mirror and storage medium | |
KR101719927B1 (en) | Real-time make up mirror simulation apparatus using leap motion | |
JP2013178789A (en) | Beauty simulation system | |
WO2021070698A1 (en) | Automatic makeup machine, method, program, and control device | |
JP6132248B2 (en) | Makeup support device | |
KR102372524B1 (en) | System for buying service of cosmetic object and applying selective makeup effect | |
US20210154091A1 (en) | System, devices, and methods for long lasting lip plumping | |
CN103885461A (en) | Movement method for makeup tool of automatic makeup machine | |
US20230101374A1 (en) | Augmented reality cosmetic design filters | |
Song et al. | Development of virtual makeup tool based on mobile augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
 | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY; Year of fee payment: 4 |