WO2023056333A1 - Augmented reality cosmetic design filters - Google Patents

Augmented reality cosmetic design filters

Info

Publication number
WO2023056333A1
Authority
WO
WIPO (PCT)
Prior art keywords
cosmetic
biological surface
generating
formulation
cosmetic formulation
Prior art date
Application number
PCT/US2022/077230
Other languages
French (fr)
Other versions
WO2023056333A4 (en)
Inventor
David B. Kosecoff
Original Assignee
L'oreal
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/491,061 external-priority patent/US20230101374A1/en
Priority claimed from FR2113547A external-priority patent/FR3130423B1/en
Application filed by L'oreal filed Critical L'oreal
Priority to KR1020247005743A priority Critical patent/KR20240037304A/en
Priority to CN202280053466.0A priority patent/CN117769723A/en
Publication of WO2023056333A1 publication Critical patent/WO2023056333A1/en
Publication of WO2023056333A4 publication Critical patent/WO2023056333A4/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • AHUMAN NECESSITIES
    • A45HAND OR TRAVELLING ARTICLES
    • A45DHAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes

Definitions

  • One or more non-transitory computer memory devices can store computer-readable instructions that, when executed by computing circuitry, cause the computing circuitry to perform operations for generating augmented reality cosmetic design filters.
  • the operations can include defining a baseline description of a biological surface using one or more radiation sensors, the radiation sensors being electronically coupled with the computing circuitry.
  • the operations can include identifying an application of a cosmetic formulation to the biological surface using one or more of the radiation sensors.
  • the operations can include generating a trace describing the application in reference to the baseline description.
  • the operations can also include outputting the trace.
  • identifying the application of the cosmetic formulation includes detecting the cosmetic formulation relative to the biological surface using one or more of the radiation sensors and the baseline description.
  • Generating the trace can include receiving information describing a plurality of shape types, attributing a first shape type of the shape types to at least a portion of the biological surface using the baseline description, generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description, and transforming the numerical representation from the first shape type to a second shape type of the shape types.
  • the biological surface can be a face.
  • the first shape type and the second shape type can describe a facial feature.
  • Defining the baseline description of the biological surface can include projecting invisible electromagnetic radiation onto the biological surface using a radiation source electronically coupled with the computing circuitry.
  • the one or more radiation sensors can include a camera configured to detect electromagnetic radiation from ultraviolet, visible, or infrared spectral ranges, or a combination thereof.
  • generating the trace includes tracking a motion of the application relative to the biological surface and generating a numerical representation of the motion relative to the baseline description.
  • the operations can further include identifying the cosmetic formulation and defining a color of the cosmetic formulation using identifier information of the cosmetic formulation.
  • the operations can further include estimating a surface tone of the biological surface, estimating a formulation tone of the cosmetic formulation, and determining a color of the cosmetic formulation using the surface tone and the formulation tone.
  • the one or more non-transitory computer memory devices are electronically coupled with a smart phone. Outputting the trace can include communicating the trace to a remote computing system.
  • a method for generating an augmented reality cosmetic design filter can include defining a baseline description of the biological surface using one or more radiation sources and one or more radiation sensors.
  • the method can include identifying an application of a cosmetic formulation to the biological surface using the radiation sensors.
  • the method can include generating a trace describing the application in reference to the baseline description.
  • the method can also include outputting the trace.
  • generating the trace includes receiving shape information describing a plurality of shape types, attributing a first shape type of the shape types to at least a portion of the biological surface using the shape information, generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description, and transforming the numerical representation from the first shape type to a second shape type of the shape types.
  • identifying the application of the cosmetic formulation includes detecting an applicator of the cosmetic formulation, estimating a position of the applicator of the cosmetic formulation relative to the biological surface, tracking a motion of the applicator relative to the biological surface, and generating a numerical representation of the motion relative to the baseline description.
  • the method can further include estimating a surface tone of the biological surface, estimating a formulation tone of the cosmetic formulation, and determining a color of the cosmetic formulation using the surface tone and the formulation tone.
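As an illustration of the claimed flow (define a baseline, identify an application, generate a trace referenced to the baseline, output the trace), the following is a minimal sketch in Python. The color-shift test, the tolerance value, the toy frames, and all function names are assumptions made for illustration; the patent does not prescribe a particular implementation or API.

```python
# Minimal sketch of the claimed pipeline; all names and values are illustrative.
import numpy as np

def define_baseline(depth_map: np.ndarray, rgb: np.ndarray) -> dict:
    """Baseline description: surface contours plus per-pixel color."""
    return {"depth": depth_map.copy(), "color": rgb.copy()}

def identify_application(rgb: np.ndarray, baseline: dict, tol: float = 12.0) -> np.ndarray:
    """Pixels whose color shifted from the baseline are treated as applied formulation."""
    shift = np.linalg.norm(rgb.astype(float) - baseline["color"].astype(float), axis=-1)
    return shift > tol                                  # boolean mask of the application

def generate_trace(mask: np.ndarray, rgb: np.ndarray, baseline: dict) -> dict:
    """Trace: positions referenced to the baseline and the mean color of the application."""
    ys, xs = np.nonzero(mask)
    color = rgb[mask].mean(axis=0) if mask.any() else np.zeros(3)
    return {"positions": np.stack([ys, xs], axis=1), "color": color,
            "baseline_shape": baseline["depth"].shape}

# Toy frames standing in for radiation-sensor output.
depth = np.zeros((64, 64))
before = np.full((64, 64, 3), 180, np.uint8)
after = before.copy(); after[20:30, 10:40] = (150, 60, 80)   # simulated blush patch
baseline = define_baseline(depth, before)
trace = generate_trace(identify_application(after, baseline), after, baseline)
print(trace["color"], len(trace["positions"]))               # "outputting the trace"
```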
  • FIGURE 1 is a schematic diagram illustrating an embodiment of a system for extrapolating a cosmetic design, in accordance with various embodiments.
  • FIGURE 2 is a schematic diagram illustrating an example technique for preparing an augmented reality cosmetic design filter, in accordance with various embodiments.
  • FIGURE 3 is a schematic diagram illustrating an example technique for deploying an augmented reality cosmetic design filter, in accordance with various embodiments.
  • FIGURE 4 is a schematic diagram illustrating an example technique for transforming an augmented reality cosmetic design filter from one shape type to another shape type, in accordance with various embodiments.
  • FIGURE 5 is a schematic diagram illustrating an example technique for generating an augmented reality cosmetic design filter in multiple layers, in accordance with various embodiments.
  • FIGURE 6 is a schematic diagram illustrating an example technique for determining a formulation color using surface tones and formulation tones, in accordance with various embodiments.
  • FIGURE 7 is a block diagram that illustrates an example system, including components of the system of FIG. 1, in accordance with various embodiments.
  • FIGURE 8 is a block diagram that illustrates aspects of an example computing device, in accordance with various embodiments.
  • Described embodiments employ image sensors to define one or more 3D contour mappings of the target body surface.
  • described embodiments provide improved precision and greater ease of use over complex manual routines. The techniques, therefore, improve the manual application of cosmetics through distribution of dynamic augmented reality cosmetic design filters that can be automatically (e.g., without human intervention) adapted for shape type and skin tone.
  • the filters can reproduce step-by-step cosmetic routines, including augmented reality renderings of specific tools, application traces, and finished effects in a manner that is sensitive to shape type and skin tone of the end user of the augmented reality filters.
  • a user of a smartphone creates a video of their makeup routine that produces a particular aesthetic effect.
  • the smartphone captures images and a surface mapping of the user’s face. Initial images are used to generate a baseline with which subsequent applications of cosmetics, such as foundation, highlight, filler, eyeshadow, blush, mascara, or the like, can be detected.
  • the smartphone iteratively redefines the baseline as part of defining each new application as a new layer of a cosmetic design and recognizes new applications by detecting motion of the user’s hand, identifying a new tool or formulation, and/or detecting a color shift on the user’s face in reference to the updated baseline.
  • a multi-layer dynamic filter is generated including motion, texture, and/or transparency information.
  • the dynamic filter is referenced to a face shape of the creator of the makeup video and is stored on a distributed storage network.
  • the filter is available to different users for virtual projection through an augmented reality interface.
  • the filter can be used to reproduce layer effects, to realistically map a cosmetic design onto a different face shape, and to guide a viewer through a cosmetic routine in a way that is specific to the viewer’s face shape and skin tone, as described in the paragraphs, below.
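One plausible way to serialize such a multi-layer dynamic filter is sketched below; the field names, shape-type label, and JSON layout are hypothetical and only illustrate how motion, color, and transparency information might be grouped by layer for distribution.

```python
# Hypothetical serialization of a multi-layer dynamic filter referenced to the
# creator's face shape; every field name here is illustrative, not from the patent.
import json

dynamic_filter = {
    "creator_shape_type": "heart",
    "layers": [
        {"order": 0, "traces": [{"formulation": "foundation",
                                 "color": [224, 189, 165],
                                 "alpha": 0.6,
                                 "motion": [[0.10, 0.20], [0.12, 0.22], [0.15, 0.25]]}]},
        {"order": 1, "traces": [{"formulation": "eyeshadow",
                                 "color": [120, 80, 140],
                                 "alpha": 0.8,
                                 "motion": [[0.30, 0.15], [0.31, 0.16]]}]},
    ],
}
print(json.dumps(dynamic_filter, indent=2)[:200])
```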
  • FIG. 1 is a schematic diagram illustrating an embodiment of an example system 100 for generating an augmented reality cosmetic design filter 110, in accordance with various embodiments.
  • the example system 100 includes one or more client computing devices 130, a camera 140, one or more remote computer systems 160, also referred to as server(s), and one or more user devices 190.
  • the constituent system components can be operably coupled through wireless communication and/or through wired communication.
  • constituent components can communicate directly through wireless pairing (e.g., Bluetooth) and/or via a local wireless network (e.g., through a wireless router).
  • constituent components can communicate over one or more networks 170, which can be or include a public network (e.g., the internet) or a private network (e.g., an intranet).
  • the example system 100 can include multiple distinct devices configured to communicate electronic information through wireless connections. Additionally or alternatively, some of the constituent components can be integrated into a single device.
  • the augmented reality cosmetic design filter 110 is also referred to as the cosmetic design 110.
  • the cosmetic design 110 is a numerical representation of a cosmetic routine including a set of textures, mapping data, surface information, and metainformation that is stored in memory of the client computing device(s) 130, the server(s) 160 and/or the user device(s) 190.
  • the cosmetic design 110 includes one or more traces 115 that are referenced to contour maps of relevant facial features 120 and/or to baseline features 125 of the target region.
  • the cosmetic design 110 can also include data describing dynamic elements 117 of the traces 115. In this way, the cosmetic design 110 can be deployed to a user device 190 as an animated augmented reality filter reproducing, step-by-step, the layer-wise application of the cosmetic design 110.
  • the client computing device 130 can incorporate the camera 140.
  • the example system 100 can include multiple client computing devices 130, where a first client computing device 130 is a mobile electronic device (e.g., a tablet, smartphone, or laptop) that is configured to host user interface elements and to connect to the server(s) 160 over the network(s) 170, while a second client computing device 130 is integrated with the camera 140 and is configured to coordinate operation of the camera 140 with the first client computing device 130.
  • the constituent components of the example system 100 can be provided with computer-executable instructions (e.g., software, firmware, etc.) to implement and coordinate the operation of one or more features of the example system 100. In this way, the operation of the example system 100 can be coordinated via a user interface, accessed via one or more of the constituent components.
  • the client computing device(s) 130 can be or include a purpose-built mobile computing device including the camera 140, one or more electromagnetic (EM) radiation sources (illustrated as part of the camera 140), and one or more user interface elements 131 to prompt a biological surface 180 with visual and/or auditory prompts.
  • the interface elements 131 can be or include a display 133 to generate a visual representation of the cosmetic design 110.
  • the interface elements 131 can also include user input components including, but not limited to, touch screen, keyboard, trackpad, or mouse.
  • the components of the client computing device 130 can be integrated into a housing resembling a consumer cosmetic product such as, for example, a bathroom or vanity mirror.
  • the housing can conceal power sources, heat management systems, and other components.
  • the system 100 can include a smartphone or tablet computer in communication with the client computing device 130, such that one or more computer-executable operations are undertaken by the smartphone or tablet computer.
  • image generation operations can be executed by a smartphone incorporating the camera 140 and image processing operations can be executed at least in part on a separate device, such as the server(s) 160.
  • the camera 140 can be or include multiple sensors and/or sources including, but not limited to, a visible light image sensor 141, a depth sensor 143, and/or a source of invisible EM radiation 145, including but not limited to infrared or near-infrared radiation.
  • the camera 140 can include communication circuitry 147 to enable wireless communication and/or pairing with the other constituent components of the example system 100. While the camera 140 is illustrated as a separate component of the example system 100, the camera 140 can also be integrated into one of the other constituent components of the example system 100.
  • the client computing device 130 can incorporate the camera 140 (e.g., as a smartphone or tablet)
  • the depth sensor 143 can capture one or more images of the biological surface 180, including, but not limited to, images of a target body surface 181 of the biological surface 180.
  • the biological surface 180 is a human using the example system 100 to create the cosmetic design 110 and the target body surface 181 is the face of the human in the region around the eye and eyebrow.
  • the depth sensor 143 can generate a surface mapping 183 of the target body surface 181. Contours and depth information for the target body surface 181 can vary over time or between users, and the camera can generate the surface mapping 183 as part of operations for extrapolating, modifying and/or generating the cosmetic design 110 by the client computing device(s) 130.
  • the depth sensor 143 can be an image sensor and can capture images within a field of view 185 including the target body surface 181.
  • the depth sensor 143 can be or include, but is not limited to, a laser-based sensor (e.g., LiDAR), a time-of-flight camera, a structured light generator, a visual simultaneous localization and mapping (vSLAM) sensor assembly including motion sensors and visible image cameras, or an ultrasound-based sensor assembly, such that the camera 140 can generate the surface mapping 183.
  • the source of invisible EM radiation 145 can be or include an infrared source that exposes the biological surface 180 including the target body surface 181 to structured invisible infrared radiation.
  • the source of invisible EM radiation 145 can be or include an infrared diode laser. In this way, the EM radiation generated by the source of invisible EM radiation 145 can be scanned or otherwise directed toward the target body surface 181 over an angular spread 187, such that the target body surface 181 is exposed.
  • detection of the target body surface 181 is facilitated and/or enabled by feature and/or edge detection applied to visible spectrum (e.g., RGB) images captured by the visible light sensor 141 (e.g., by vSLAM techniques).
  • the surface mapping 183 can provide contour information and/or position information for features in the target body surface 181, for example, precise information about the relative position of the eyebrow ridge and the bridge of the nose, where the eyebrow begins and ends relative to the eye, etc.
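A rough sketch of how a depth frame from such a sensor could be converted into a surface mapping (camera-space points plus a crude contour map) is shown below; the camera intrinsics and the synthetic depth data are assumptions made for illustration, not parameters from the patent.

```python
# Sketch: turn a depth frame into a point cloud and a slope map that stands in
# for contour information. Intrinsics and the toy depth frame are assumed.
import numpy as np

def surface_mapping(depth: np.ndarray, fx=500.0, fy=500.0, cx=32.0, cy=32.0):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1)      # camera-space positions
    gy, gx = np.gradient(depth)                    # crude contour / slope map
    return points, np.hypot(gx, gy)

depth = 0.6 + 0.05 * np.random.rand(64, 64)        # toy depth data
points, contours = surface_mapping(depth)
print(points.shape, contours.mean())
```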
  • the surface mapping 183 can be used to generate, and/or extrapolate the cosmetic design 110 for deploying to a social media platform or other service to distribute the cosmetic design 110 to users via user device(s) 190.
  • the cosmetic design 110 can be represented as a data file that is communicated to the server(s) 160, for example, as part of an online platform and/or database of cosmetic designs.
  • the surface mapping 183 can be used to modify the cosmetic design 110 by determining a shape type of a feature of the biological surface 180 and transforming the cosmetic design 110 to another, different shape type. In some embodiments, multiple layers are defined as part of generating or modifying the cosmetic design 110.
  • the example system 100 can implement the example techniques described in reference to FIGs. 2-6 to detect an application of multiple cosmetic formulations to the target body surface 181 and can generate one or more traces 115 describing the applications spatially and temporally that together define the cosmetic design 110.
  • the cosmetic design 110 can be distributed to the user device(s) 190 as an augmented reality filter, for example, through integration with social media, with improved accuracy and precision by mapping both the target body surface 181 and a corresponding region of a user of the user device 190, and by extrapolating the cosmetic design 110, or each constituent layer of the cosmetic design 110, to fit the corresponding region.
  • the cosmetic design 110 can be presented dynamically, in the form of a tutorial visualization, whereby both layers and traces are animated to demonstrate layers, motions, colors, and other cosmetic aspects of the cosmetic design 110.
  • the cosmetic design 110 represents an improvement to user experience by adapting the cosmetic design 110 from the creator to the user, and also represents an improvement to system performance, as the complete cosmetic design 110 can be transferred as mask data rather than as image data.
  • data volumes and computational resources involved in distributing the cosmetic design 110 over the network(s) 170 and storing it on the server(s) 160 are reduced.
  • FIG. 2 is a schematic diagram illustrating an example process 200 for preparing the augmented reality cosmetic design filter 110, in accordance with various embodiments.
  • the example process 200 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1.
  • the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system.
  • the example process 200 can include more or fewer operations, and the order of operations can vary.
  • some operations are performed by different physical computers, as described in more detail in reference to FIG. 1.
  • some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
  • the computer system defines a baseline description 210 of the biological surface 180.
  • the baseline description 210 is defined as part of generating the surface mapping 183, for example, by referencing visible color information to surface contour mapping information describing the biological surface 180.
  • the baseline description 210 can include numerical representations (e.g., digital data) of the biological surface 180 including both the surface mapping 183 and color information.
  • the example process 200 is illustrated in the context of the target body surface 181 of FIG. 1, but it is understood that the biological surface 180 can be or include a face, a hand, a portion of a face, or other surface to which a cosmetic formulation is applied.
  • defining the baseline description 210 includes generating one or more images of the biological surface 180, using the camera 140 or other radiation sensors that are configured to detect visible light or invisible EM radiation.
  • Invisible EM radiation can be projected onto the biological surface 180 using a radiation source of the camera 140, such as a structured light generator, LiDAR source, or ToF camera source.
  • the images can describe facial features 120 of the biological surface 180 as well as baseline features 125.
  • Baseline features 125 can be or include materials applied to the biological surface 180 prior to example process 200, such that at least a portion of the biological surface 180 includes a pigment, cosmetic formulation, or other feature that is detectable by the camera 140.
  • baseline features 125 are illustrated as eyelashes and eyeliner.
  • baseline features 125 can include, but are not limited to, blemishes, wrinkles, scars, or other aesthetic aspects of the biological surface 180 other than facial features 120.
  • Facial features 120 refer to organs, hair, and other topographical features of the biological surface 180 that define the surface mapping 183 and influence the baseline description 210 through physical effects including, but not limited to, shadows, perspective, or the like.
  • the computer system identifies one or more applications 215 of cosmetic formulation(s) 220 to the biological surface 180.
  • the cosmetic formulation(s) 220 can include, but are not limited to, makeup, mascara, lipstick, lipliner, eyeliner, glitter, aesthetic creams, ointments, or other materials that can be applied topically to the biological surface 180.
  • the application(s) 215 of the cosmetic formulation(s) 220 can be detected by the camera 140 using radiation sensors, such as visible light sensors, invisible EM radiation sensors, or the like. In this way, the application(s) 215 can be detected as a shift in the biological surface relative to the baseline description, for example, in terms of coloration, surface reflectivity, or absorbance/reflectance of invisible EM radiation.
  • a glossy lip coating can be detected through an increase in specular reflection of the biological surface 180 relative to the baseline description that includes negligible specular reflection.
  • a matte foundation can be detected through a decrease in specular reflection and an increase in diffuse reflection relative to the baseline description.
  • specular reflection can be detected by defining a luminance channel in visible light images and thresholding the luminance channel, where specular reflection is defined as a luminance value exceeding the threshold, on a pixel-wise basis.
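A minimal sketch of that luminance-thresholding idea follows; the Rec. 709 luminance weights and the threshold value are assumptions, not values taken from the patent.

```python
# Sketch: build a luminance channel from an RGB frame and flag pixels whose
# luminance exceeds a threshold as specular, on a pixel-wise basis.
import numpy as np

def specular_mask(rgb: np.ndarray, threshold: float = 0.92) -> np.ndarray:
    rgb = rgb.astype(float) / 255.0
    luminance = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    return luminance > threshold                   # True where reflection is specular

frame = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
frame[10:14, 10:14] = 250                          # simulated glossy highlight
print(specular_mask(frame).sum(), "specular pixels")
```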
  • the computer system detects an applicator 225, such as a brush, sponge, pencil, finger, or other applicator 225 in proximity to the biological surface 180.
  • the computer system recognizes an identifier of the applicator 225, for example, where the applicator 225 includes a registration mark such as a barcode, QR code, RFID tag, or other representation of the identifier information.
  • the computer system can identify the application 215 spatially and temporally and can identify the cosmetic formulation(s) 220.
  • operation 203 can optionally include identifying the cosmetic formulation 220 and defining a color of the cosmetic formulation 220 using the identifier information of the cosmetic formulation 220.
  • While the color of an unidentified cosmetic formulation 220 can be estimated using the camera 140, as described in more detail in reference to FIG. 6, identifying the applicator 225 can permit the computer system to directly determine the color of the cosmetic formulation 220 where the applicator 225 is loaded with a particular cosmetic formulation 220, rather than being a multi-purpose applicator 225.
  • applicators 225 for which the cosmetic formulation 220 can be identified in such a manner include, but are not limited to, pencils, lipsticks, gel applicators, or other applicators 225 that are available with a prepared cosmetic formulation 220.
  • the application 215 of cosmetic formulation 220 can include a motion 230 of the applicator 225 relative to the biological surface 180.
  • the motion 230 can include meaningful information with respect to the aesthetic effect of the overall cosmetic design 110, such as in blending and smudging motions. As such, motions 230 of applicators 225 can be detected and used during operations for generating traces.
  • Applications 215 can also include modifications or manipulations of formulations 220 already disposed on the biological surface 180.
  • the extent, coloration, and reflectance of a cosmetic formulation 220 can be modified by smoothing, smudging, blending, spreading, or other techniques.
  • Such applications 215, which do not add a new formulation 220 to the biological surface 180 but rather modify an already disposed formulation 220, can be identified by detecting a tool or a finger and detecting a change in the biological surface 180 relative to the baseline description 210.
  • the baseline description 210 can be relative to each respective application 215, where applications 215 are layered and/or blended.
  • the baseline description 210 for a particular region of the biological surface 180, to which a formulation 220 is already applied can be redefined to include the formulation 220. In this way, subsequent applications 215 can be more precisely and accurately detected.
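The layer-wise redefinition of the baseline might be implemented as sketched below, where detected application pixels are folded back into the baseline color image before the next application is measured; the array layout and function name are illustrative assumptions.

```python
# Sketch: update the baseline so that an already-applied formulation becomes
# part of the reference against which the next application is detected.
import numpy as np

def update_baseline(baseline_color: np.ndarray, frame: np.ndarray,
                    application_mask: np.ndarray) -> np.ndarray:
    updated = baseline_color.copy()
    updated[application_mask] = frame[application_mask]
    return updated

baseline_color = np.full((32, 32, 3), 190, np.uint8)
frame = baseline_color.copy(); frame[5:10, 5:20] = (160, 70, 90)   # first application
mask = np.zeros((32, 32), bool); mask[5:10, 5:20] = True
baseline_color = update_baseline(baseline_color, frame, mask)
print(np.array_equal(baseline_color[7, 10], np.array([160, 70, 90], np.uint8)))
```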
  • the computer system generates one or more traces 115 describing the application 215 identified at operation 203.
  • Generating the traces 115 can include determining an extent 217, layer sequence (see FIG. 5), and coloration (see FIG. 6) of the applications 215, such that the cosmetic design 110 can realistically reproduce the aesthetic effect of the applications 215 and can also reproduce the sequence of the applications 215.
  • the extent 217, coloration, and layer sequence information, as well as other information relevant to realistically reproducing the traces 115 on user device(s) 190 can be encoded as numerical representations, such as intensity datasets referenced to the baseline description 210.
  • the extent 217 of the applications 215, illustrated as dashed lines in FIG. 2, can be detected as a shift in coloration, or as a shift in absorbance and/or reflectivity of EM radiation, relative to the baseline description. In this way, the traces 115 can reproduce the aesthetic effect of individual applications 215.
  • the computer system tracks the motion 230 of the applicator 225 using images and/or contour data generated by the camera 140.
  • Generating the traces 115 can include generating numerical representations of the various motions 230 relative to the biological surface 180, as described in the baseline description, as well as the color information describing the cosmetic formulations 220 used in the applications 215.
  • the traces 115, once generated, can be transformed to reproduce the aesthetic effect of the cosmetic design 110 on differently shaped biological surfaces 180, as viewed through user device(s) 190, as described in more detail in reference to FIG. 3.
  • Traces 115 can encode layer-blending and layering information, such that the combination of multiple traces 115 can reproduce interactions between traces 115.
  • a set of applications 215 to a region of the biological surface 180 between the eye and eyebrow of multiple eyeshadow colors can be blended to impart a color gradient in an eyeshadow region.
  • the example process 200 includes multiple iterations of operations 201-205, such that the region between the eyebrow and the eye is described by multiple traces 115 and also by interaction terms that describe how the multiple traces relate.
  • traces 115 include transparency information, such as alpha channel information, referenced to dynamic elements 117 that describe localized transparency of coloration or other aesthetic effects of the applications 215.
  • traces 115 can reproduce manipulation of the cosmetic formulations 220, such as blending, smudging, smoothing, or other techniques that are typical of cosmetic designs.
  • traces 115 and/or dynamic elements 117 can include a vector of transparency values referenced to a motion 230 and a formulation 220, such that the trace 115 reproduces a smoothing or blending of the formulation 220 based on the motion 230.
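A trace carrying a transparency vector along a recorded motion could be represented as in the following sketch; the linear alpha falloff standing in for a smudging or blending effect is an assumption made for illustration.

```python
# Sketch: a trace whose per-sample transparency fades along the recorded
# applicator motion, approximating a blended or smudged stroke.
import numpy as np

def blend_trace(motion_xy: np.ndarray, color: np.ndarray) -> dict:
    n = len(motion_xy)
    alpha = np.linspace(1.0, 0.2, n)               # fades toward the end of the stroke
    return {"path": motion_xy, "color": color, "alpha": alpha}

stroke = np.column_stack([np.linspace(10, 40, 16), np.linspace(20, 25, 16)])
trace = blend_trace(stroke, np.array([150.0, 60.0, 80.0]))
print(trace["alpha"][:3], trace["alpha"][-1])
```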
  • the computer system outputs the trace(s) 115 detected as part of operations 201-205.
  • Outputting the trace(s) 115 can include electronic operations including, but not limited to, storing the trace(s) 115 in a local data store of the client computing device 130, transferring the trace(s) 115 to server(s) 160 via the network 170, and/or sharing the trace(s) 115 with the user device(s) 190 directly (e.g., through electronic pairing) or via the network 170.
  • the operations can similarly be applied to other surfaces.
  • the cosmetic design 110 can describe applications 215 of cosmetic formulations 220 to additional/alternative surfaces including, but not limited to, lips, nose, cheeks, forehead, or hands.
  • cosmetic designs 110 can modify the appearance of facial features 120, including but not limited to eyebrows, eyes, lips, cheekbones, jawline, or hands.
  • a cosmetic design 110 can include a sequence of traces 115 for emphasizing the appearance of cheekbones through application of one or more cosmetic formulations 220.
  • FIG. 3 is a schematic diagram illustrating an example process 300 for deploying the augmented reality cosmetic design filter 110, in accordance with various embodiments.
  • the example process 300 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1.
  • the operations can be or include operations performed by one or more processors of a computer system, such as user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system.
  • the example process 300 can include more or fewer operations, and the order of operations can vary.
  • some operations are performed by different physical computers, as described in more detail in reference to FIG. 1.
  • some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components, such as user device(s) 190 and server(s) 160.
  • a user device 190 receives data 310 describing the cosmetic design 110.
  • Receiving the data 310 can include electronic communication over the network 170 (e.g., the internet, a cellular network, or the like), but can also include direct communication through pairing or other approaches.
  • the user device 190 is illustrated receiving the data 310 from server(s) 160, but it is understood that the data 310 can be received from the client computing device(s) 130.
  • the data 310 can include traces 115 generated during one or more iterations of the example process 200, as described in more detail in reference to FIG. 2, and can also include metainformation describing the biological surface 180, such as a shape-type of the biological surface 180, as described in more detail in reference to FIG. 4.
  • Operations for receiving the data 310 can include one or more data transfer techniques including, but not limited to, wireless communication or wired communication.
  • the user device 190 can communicate wirelessly with the client computing device 130 to receive the data 310 as a wireless transmission.
  • receiving the data 310 can include a browser environment, such as a recommendation engine, whereby a user of the user device 190 can view a set of cosmetic designs 110 generated by one or more creators, rather than directly communicating with the creators to request the design. Additionally or alternatively, the data 310 can be pushed to the user device 190, for example, as part of a social media service to which the user of the user device 190 can register, follow, and/or subscribe. In some embodiments, the data 310 is recommended based at least in part on identifier information describing the user of the user device 190. Where the user of the user device 190, through historical traffic data, demonstrates an aesthetic preference, cosmetic designs 110 reflecting that preference can be preferentially recommended.
  • Demographic, socio-economic, or biometric data can also be used to recommend cosmetic designs 110.
  • cosmetic designs 110 reflecting the trends for the geographic region can be recommended.
  • cosmetic designs reflecting trends corresponding to each respective category can be identified and recommended.
  • the user device 190 maps a user surface 315 corresponding to the biological surface 180, to which the cosmetic design 110 will be virtually applied.
  • Mapping the user surface 315 can include operations similar to those described in reference to generating the surface mapping 183 of FIG. 1, including but not limited to structured light methods, LiDAR or ToF methods, vSLAM methods, ultrasonic methods, or the like.
  • the surface 315 can be described by a mesh of polygons 317 that can each define multiple properties for the corresponding region of the surface 315 including, but not limited to, tone, reflectance, temperature, or the like.
  • operation 303 can include executing one or more applications stored in memory of the user device 190 to map the surface using the depth sensors.
  • the user surface 315 can be represented as a numerical representation in data, describing surface contours and facial features 320 of the user surface 315.
  • the example process 300 includes integrating the cosmetic design 110 with an existing aesthetic look of the user surface 315. To that end, aesthetic features 321 such as makeup, mascara, eyelashes, or the like, are included during surface mapping.
  • Mapping the user surface 315 can also include estimating a surface tone, which can be a local surface tone, as described in more detail in reference to FIG. 6.
  • the example process 300 can account for a difference in skin tones between the biological surface 180 and the user surface 315.
  • metadata provided as part of the data 310, or alternatively as stored in memory of the server(s) 160 and/or the user device 190 can include reference or lookup tables cross-referencing colors to skin tones, such that the cosmetic design 110 can be adapted to reproduce the aesthetic effect of the cosmetic design 110, rather than the literal transposition of exact colors.
  • the traces 115 received as part of the data 310 are used to generate filters 325, referenced to the mapping of the user surface 315 generated at operation 303.
  • the filters 325 can be or include numerical representations, such as texture objects, that are formatted to be combined with the mapping of the user surface 315.
  • the user surface 315 is described by a mesh of polygons 317 and a filter 325 is referenced to the polygons 317, such that each polygon 317 defines a vector of values describing the appearance of the filter 325.
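Referencing a filter to the polygons 317 might look like the following sketch, where each polygon index maps to a vector of appearance values and untouched polygons fall back to the underlying surface color; the data layout and blending rule are hypothetical.

```python
# Sketch: per-polygon filter values looked up by polygon index at render time.
import numpy as np

polygons = np.array([[0, 1, 2], [1, 2, 3], [2, 3, 4]])        # vertex indices (toy mesh)
filter_values = {
    0: {"color": (224, 189, 165), "alpha": 0.5, "gloss": 0.1},
    2: {"color": (150, 60, 80),   "alpha": 0.8, "gloss": 0.4},
}

def shade(polygon_index: int, base_color):
    entry = filter_values.get(polygon_index)
    if entry is None:
        return base_color                          # polygon untouched by the filter
    a = entry["alpha"]
    return tuple(round((1 - a) * b + a * c) for b, c in zip(base_color, entry["color"]))

print([shade(i, (200, 170, 150)) for i in range(len(polygons))])
```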
  • the filters 325 include dynamic elements 330, such as animation effects or interlayer-effects, that can be used to demonstrate the sequence of applications 215 used to create the cosmetic design 110.
  • Dynamic elements 330 can reproduce at least a part of the dynamic elements 117 described in reference to the traces 115 in FIGs. 1-2.
  • the computer system presents the filters 325 integrated with images 335 of the user surface 315.
  • Presenting the filters 325 can include displaying the filters 325 through a screen 337.
  • the user device 190 can be a smartphone or a tablet including a touchscreen display.
  • the filters 325 can be mapped to the user surface 315 in real time or near real time using feature tracking or other augmented reality techniques, where “near real time” describes a qualitative experience by a user of the user device 190 whereby latency introduced by the operations of the example process 300 is effectively imperceptible.
  • the filters 325 can be presented as interactive graphical elements that a user of the user device 190 can manipulate through the screen 337, for example, by selecting or highlighting.
  • selecting a filter 325 can initiate animation of dynamic elements 330 at optional operation 309.
  • animating the filters 325 can include initiating a virtual tutorial whereby filters 325 are presented in sequence by layer-order, such that the sequence of applications 215 is reproduced on the user surface 315. The sequence can be paused, reversed, advanced, or otherwise manipulated via the user device 190, and specific dynamic elements can be activated by selecting a filter 325, for example, with a user action 340 on the screen 337.
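The layer-ordered tutorial playback could be driven by a small state object such as the hypothetical Playback class sketched below, which exposes the advance and reverse stepping described above (pausing simply means not advancing).

```python
# Sketch: a cursor over the layer order that the touch interface can advance or
# reverse; class and method names are hypothetical.
class Playback:
    def __init__(self, layers):
        self.layers = layers
        self.index = 0

    def visible_layers(self):
        return self.layers[: self.index + 1]       # layers applied so far

    def advance(self):
        self.index = min(self.index + 1, len(self.layers) - 1)

    def reverse(self):
        self.index = max(self.index - 1, 0)

steps = ["foundation", "contour", "highlight", "eyeshadow"]
player = Playback(steps)
player.advance(); player.advance()
print(player.visible_layers())                     # ['foundation', 'contour', 'highlight']
```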
  • interaction with the screen 337 while the filters 325 are being presented permits the user of the user device 190 to edit, add, or remove one or more filters 325.
  • Different user actions 340, instead of activating dynamic elements, can initiate one or more alternative processes. For example, a double finger tap can open a menu 341 that permits the user of the user device 190 to view information describing the applicator 225, the cosmetic formulation 220, or other information describing the filter 325.
  • a press-and-hold on the filter 325 can cause the user device 190 to access an online marketplace to permit the user to purchase the cosmetic formulation 220, the applicator 225, or other consumer products.
  • a menu of social media controls 343 can be generated, such that the user of the user device can share the image 335 with the filters 325, can communicate with other users, or can provide feedback or otherwise communicate with the creator of the cosmetic design 110. It is understood that the types of user actions described are exemplary. User interface elements, such as the menu 341 and/or the social media menu 343, can be summoned by alternative and/or additional interactions. In some embodiments, menu elements are presented on the screen 337 by default.
  • FIG. 4 is a schematic diagram illustrating an example process 400 for transforming the augmented reality cosmetic design filter 110 from one shape type to another shape type, in accordance with various embodiments.
  • the example process 400 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1.
  • the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130, server(s) 160, and/or user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system.
  • the example process 400 can include more or fewer operations, and the order of operations can vary.
  • some operations are performed by different physical computers, as described in more detail in reference to FIG. 1.
  • some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
  • the traces 115 are generated in reference to the baseline description 210, as described in more detail in reference to FIG. 2, and as such are specific to the biological surface 180.
  • the example process 400 therefore, describes operations for extrapolating the filters 325 from a first face shape to a second face shape, as part of generating the traces 115 and/or the filters 325. While the description in reference to FIG. 4 focuses on face shapes, it is understood that the operations can similarly be applied to shapes of individual facial features, such as eye shapes, mouth shapes, or other feature shapes, such that the traces 115 are transformed to realistically reproduce the aesthetic effect of the traces 115, rather than simply applying the traces 115 as a texture to the user surface 315.
  • the computer system generates mask data 410 using one or more traces 115.
  • generating the mask data 410 includes projecting the data describing the applications 215 onto the topographical data describing the biological surface 180, thereby permitting the modification of the traces 115 from one shape type 420 to another shape type 420.
  • the computer system attributes a first shape type 420 to at least a portion of the biological surface 180.
  • the computer system can reference shape information describing multiple shape types 420.
  • the shape information can be stored on, accessed, or otherwise received from server(s) 160 or client computing device(s) 130.
  • example process 400 can include receiving information describing the multiple shape types 420.
  • Attributing the shape type 420 to the biological surface 180 can include referencing the baseline description 210 to determine characteristic dimensions, spacings, or other aspects of the biological surface 180 that indicate the shape type 420.
  • characteristic aspects can be defined by projecting a grid or other form of guide 421 onto the baseline description 210, such that the shape type 420 can be identified, either by defining an outline 423 of the baseline description 210, by identifying a characteristic curvature 425 at one or more points of the baseline description 210, or by identifying characteristic spacings 427 between features. It is understood that approaches involving edge detection and feature detection using computer image processing techniques can be employed to attribute the shape type 420.
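Attributing a shape type from characteristic dimensions and spacings could follow a heuristic like the sketch below; the measurements and ratio thresholds are illustrative assumptions rather than values from the patent.

```python
# Sketch: classify a face shape from a few characteristic spacings measured on
# the baseline description; thresholds and labels are illustrative.
def attribute_shape_type(face_length: float, cheekbone_width: float,
                         jaw_width: float, forehead_width: float) -> str:
    ratio = face_length / cheekbone_width
    if ratio > 1.5:
        return "oblong"
    if abs(jaw_width - forehead_width) / forehead_width < 0.1:
        return "rectangular" if ratio > 1.25 else "round"
    return "heart" if forehead_width > jaw_width else "pear"

print(attribute_shape_type(face_length=19.0, cheekbone_width=14.0,
                           jaw_width=11.0, forehead_width=13.5))   # -> heart
```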
  • the computer system generates one or more shape filters 430 for different shape types 420.
  • the mask data 410 generated at operation 401 is transformed from the first shape type 420-1 to a second shape type 420-2, a third shape type 420-3, or additional or alternative shape types 420. Transformation of the mask data 410 can include but is not limited to projecting the mask data 410 of the first shape type 420-1 onto the topographical data of the second shape type 420-2, and applying the corresponding transformation onto the traces 115. In this way, the shape filters 430 will reproduce the aesthetic shape and coloration of the traces 115 onto shape types 420 different from the first shape type 420-1.
  • generating shape filters 430 for different shape types 420 can include additional modifications to the mask data 410, beyond spatial projection to account for different physical shapes.
  • some shape types 420 are characterized by more or less pronounced facial features, such as brow ridges, forehead curvature, chin shape, cheekbones, jaw shape, or the like.
  • shape filters 430 can reposition, divide, omit, duplicate, mirror, or otherwise transform mask data 410 as part of generating shape filters 430.
  • generating the shape filters 430 for a second shape type 420-2 corresponding to a rectangular face shape from the first shape type 420-1 corresponding to a heart face shape includes repositioning a forehead trace upward toward the hairline and widening the forehead trace to either edge of the forehead, as well as dividing and mirroring a chin trace along the jawline.
  • Corresponding transformations can be applied to the mask data 410 for each of the traces 115 as part of generating the shape filters 430 for each shape type 420.
  • the constituent operations of the example process 400 can be applied based on similar principles.
  • generating shape filters 430 at the operation 405 can include transforming mask data 410 from the rounded eye to an almond eye shape type 420 or other eye-specific shape types 420. It is understood that corresponding operations can be applied to shapes of lips, hands, ears, or other biological surfaces.
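One way to transform mask data between shape types is to fit a mapping between corresponding landmarks of the two shapes and re-project the trace positions through it, as in the least-squares affine sketch below; the landmark sets are toy values, and a production system would likely use a richer, region-wise warp.

```python
# Sketch: fit an affine mapping from one shape type's landmarks to another's
# and re-project mask-data points through it.
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Return a 2x3 affine matrix mapping src landmarks onto dst landmarks."""
    ones = np.ones((len(src), 1))
    A = np.hstack([src, ones])                     # N x 3 homogeneous coordinates
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs.T                                # 2 x 3

heart_landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 14.0], [5.0, 7.0]])
rect_landmarks  = np.array([[0.0, 0.0], [11.0, 0.0], [5.5, 16.0], [5.5, 8.0]])
M = fit_affine(heart_landmarks, rect_landmarks)

trace_points = np.array([[2.0, 3.0], [4.0, 6.0], [6.0, 9.0]])  # mask data on the first shape
warped = np.hstack([trace_points, np.ones((3, 1))]) @ M.T      # positions on the second shape
print(np.round(warped, 2))
```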
  • FIG. 5 is a schematic diagram illustrating an example process 500 for generating an augmented reality cosmetic design filter 510 in multiple layers, in accordance with various embodiments.
  • the example process 500 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1.
  • the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130, server(s) 160, and/or user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system.
  • While the operations are illustrated in a particular order, the example process 500 can include more or fewer operations, and the order of operations can vary. In some embodiments, some operations are performed by different physical computers, as described in more detail in reference to FIG. 1. For example, some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
  • the traces 115 can be generated by recording the sequential application of multiple cosmetic formulations 220 to the biological surface 180, such that the augmented reality cosmetic design filter 510 can be segmented into discrete layers.
  • the presentation of the filter 510, as described in more detail in reference to FIG. 3, can reproduce the layer-wise sequence of applications.
  • the example process 400 can also incorporate the layer-wise aspects of the example process 500 where different shape types 420 implicate different layering processes, interlayer blending, as well as more complex light and shadow effects resulting from differences in shape type 420.
  • a sequence of layers are defined, where each layer includes one or more traces 115.
  • a first layer is defined that includes forehead traces 515 and cheekbone traces 517.
  • a second layer is defined that includes forehead trace(s) 520, cheek traces 521, and chin trace(s) 523.
  • highlight layer(s) are defined that include highlight trace(s) 525 to accentuate or emphasize one or more facial features, overlying the first and second layers.
  • eye layer(s) are defined including eyeshadow traces 530 and under-eye traces 531, overlying the highlight layers, the first layers, and the second layers.
  • the augmented reality cosmetic design filter 510 can dynamically add or remove layers during presentation on the user device 190.
  • the client computing device 130 can associate one or more traces 115 with each of a number of layers. Defining the layers as described in the example process 500 permits inter-layer blending rules to be defined. For example, higher-order layers can overlie lower-order layers. Higher-order layers can completely occlude lower-order layers. Transparency or partial transparency, defined for example through an alpha channel of the mask data 410, can permit multiple layers to be overlaid in the images 335 to produce a net aesthetic effect including co-localized contributions from multiple layers.
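The inter-layer blending rule described above can be illustrated with simple ordered alpha compositing, as in the following sketch; the layer colors and alpha values are toy data.

```python
# Sketch: composite higher-order layers over lower-order layers with alpha blending.
import numpy as np

def composite(base: np.ndarray, layers) -> np.ndarray:
    out = base.astype(float)
    for color, alpha in layers:                    # lowest-order layer first
        out = (1.0 - alpha) * out + alpha * np.asarray(color, float)
    return out.round().astype(np.uint8)

skin = np.array([205, 175, 155], np.uint8)
layers = [((224, 189, 165), 0.6),                  # foundation
          ((180, 120, 110), 0.4),                  # blush
          ((240, 230, 220), 0.3)]                  # highlight
print(composite(skin, layers))
```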
  • FIG. 6 is a schematic diagram illustrating an example process 600 for determining a formulation color using surface tones and formulation tones, in accordance with various embodiments.
  • detecting the applications 215 of cosmetic formulations 220 can include determining the color or the cosmetic formulations 220 and/or identifying the cosmetic formulations 220.
  • while techniques for determining the color include image processing operations based on images generated by the camera 140, the colors of the various cosmetic formulations 220 can be influenced by the tone of the biological surface 180 in proximity to the applications 215. In this way, direct reading of image pixel values may produce an inaccurate color rendering when used in filters 325.
  • the computer system meters a number of regions 610 corresponding to the applications 215 and estimates a surface tone 615 of the biological surface 180 for each of the regions 610.
  • the operations of the example process 600 are executed in parallel with the example process 200, but can also be executed at a different time, using the baseline description 210, images recorded during the generation of the traces 115, or a calibration set of images generated before commencing the example process 200.
  • surface tones 615 may differ locally between the regions 610, for example, where a facial feature casts a shadow or where the surface defines one or more contours.
  • the operation 601 can include defining multiple surface tones 615 for a single application 215.
  • the computer system meters the regions 610 in images including the applications 215 and estimates one or more local formulation tones 625 of the cosmetic formulations 220.
  • Estimating the surface tones 615 and the formulation tones 625 can include sampling one or more pixels in each of the regions 610 and determining an average coloration of each of the regions 610.
  • the formulation tones 625 will be influenced by the underlying surface tones 615, such that the surface tones 615 can serve as reference baseline values for estimating formulation tones in a way that is sensitive to skin tone.
  • implementing computer image processing approaches that include estimating the surface tones 615 in this way permits the cosmetic designs 110 to be responsive to skin tones of the user, rather than presenting a static augmented reality mask.
  • the computer system determines a color 630 of the cosmetic formulation using the surface tones 615 and the formulation tones 625.
  • each cosmetic formulation 220 includes one or more pigments that are uniformly dispersed or distributed, such that the formulation expresses a single visible color, for example, through diffuse reflection of visible light.
  • the formulation color 630 can be or include a vector of color components, such as an RGB triad or another color representation approach.
  • corresponding surface tones 615 and formulation tones 625 can be compared, and the overall effect of the surface tones 615 on the formulation tones 625 can be accounted for, as part of determining the color 630 of the cosmetic formulation 220.
  • comparing the surface tones 615 and the formulation tones 625 for example, by using the surface tones 615 to correct for luminance variation in the formulation tones 625, can control for the effect of illumination on color rendering.
  • determining the formulation color 630 can improve the reproducibility of the aesthetic effect of the cosmetic design 110 on different biological surfaces, in different ambient conditions, and with different formulations.
  • the formulation color 630 can be modified to account for differences in skin tone between creator and user.
  • the formulation color 630 can be modified to improve realistic color fidelity based on ambient conditions. For example, in bright conditions, the color intensity can be increased, while in dim conditions, the color intensity can be decreased.
  • tinting can be modified dynamically to account for light and shade.
  • the formulation color 630 can be modified to include a larger blue component to reflect that the trace 115 to which it corresponds is in shade. Conversely, the blue component can be decreased when the corresponding trace 115 is moved into brighter illumination.
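Determining the formulation color 630 from a surface tone 615 and a formulation tone 625 might look like the sketch below, which divides out a luminance correction derived from the surface tone; the correction model and reference value are assumptions, since the patent does not fix a particular formula.

```python
# Sketch: estimate a formulation color by correcting the metered formulation
# tone against the luminance of the local surface tone.
import numpy as np

REFERENCE_LUMA = 0.55                              # assumed neutral reference luminance

def luma(rgb):
    rgb = np.asarray(rgb, float) / 255.0
    return 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]

def formulation_color(surface_tone, formulation_tone):
    correction = REFERENCE_LUMA / max(luma(surface_tone), 1e-6)
    corrected = np.clip(np.asarray(formulation_tone, float) * correction, 0, 255)
    return corrected.round().astype(int)

surface_tone = (120, 95, 85)                       # shaded region of the face
formulation_tone = (140, 60, 70)                   # blush as metered in that region
print(formulation_color(surface_tone, formulation_tone))
```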
  • augmented reality filters can be generated and presented in near real time to users of user devices 190, integrated into social media platforms.
  • the augmented reality filters can realistically reproduce the aesthetic effects of cosmetic designs, accounting for differences in face shape, skin tone, and ambient conditions, and can be presented dynamically to improve the sharing of cosmetic routine sequences beyond what is currently possible with augmented reality or conventional social media.
  • FIG. 7 is a block diagram that illustrates an example system 700, including components of the system of FIG. 1, in accordance with various embodiments.
  • the example system 700 includes the client computing device 130 in electronic communication (e.g., over network 170) with the remote computer system 160.
  • Example system 700 illustrates an example of the system 100 of FIG. 1, in a context of associated system elements, and, as such, describes electronics and software executing operations as described in reference to FIGs. 2-6.
  • FIG. 7 depicts a non-limiting example of system elements, features and configurations; many other features and configurations are contemplated.
  • the client computing device 130 of FIG. 1 includes a computer system 710, multiple components 720 for interacting with the biological surface 180, a computer-readable medium 730, and a client application 740 that can be stored as computer-executable instructions on the computer-readable medium 730 and, when executed by the computer system 710, can implement the operations described in reference to the system 100 of FIG. 1 and the operations of the example techniques of FIGs. 2-3.
  • the client computing device 130 incorporates subcomponents including, but not limited to, a power source 711, a human-machine interface 713, one or more processors 715, a network interface 717, and can include the computer-readable medium 730.
  • the power source 711 is a direct-current power source, for example, a rechargeable battery or a rectified power supply configured to connect to line-power (e.g., 110 VAC, 220 VAC, etc.).
  • the human-machine interface (HMI) 713 can include any type of device capable of receiving user input or generating output for presentation to a user, such as a speaker for audio output, a microphone for receiving audio commands, a push-button switch, a toggle switch, a capacitive switch, a rotary switch, a slide switch, a rocker switch, or a touch screen.
  • the one or more processors 715 are configured to execute computer-executable instructions stored on the computer-readable medium 730.
  • the processor(s) 715 are configured to receive and transmit signals to and/or from the components 720 via a communication bus or other circuitry, for example, as part of executing the client application 740.
  • the network interface 717 is configured to transmit and receive signals to and from other computing devices on behalf of the processors 715.
  • the network interface 717 can implement any suitable communication technology, including but not limited to short-range wireless technologies such as Bluetooth, infrared, near-field communication, and Wi-Fi; long-range wireless technologies such as WiMAX, 2G, 3G, 4G, LTE, and 10G; and wired technologies such as USB, FireWire, Thunderbolt, and/or Ethernet.
  • the computer-readable medium 730 is any type of computer-readable medium on which computer-executable instructions can be stored, including but not limited to a flash memory (SSD), a ROM, an EPROM, an EEPROM, and an FPGA.
  • the computer-readable medium 730 and the processor(s) 715 can be combined into a single device, such as an ASIC, or the computer-readable medium 730 can include a cache memory, a register, or another component of the processor 715.
  • the computer-readable medium 730 stores computer-executable instructions that, in response to execution by one or more processors 715, cause the client computing device 130 to implement a control engine 731.
  • the control engine 731 controls one or more aspects of the client computing device 130, as described above.
  • the computer-executable instructions are configured to cause the client computing device 130 to perform one or more operations such as generating a surface mapping of the target surface, generating a trace, or outputting the trace to server(s) 160.
  • the control engine 731 controls basic functions by facilitating interaction between the computer system 710 and the components 720 according to the client application 740.
  • the control engine 731 detects input from HMI 713 indicating that a cosmetic routine is to be initiated (e.g., in response to activation of a power switch or “start” button, or detection of a face in front of the camera 140 of FIG. 1), or receives signals from client computing device(s) 130, the remote computer system 160, or user device(s) 190 (e.g., over a Bluetooth paired connection).
  • the components of the client computing device 130 can be adapted to the application or can be specific to the application (e.g., ASICs).
  • the components 720 can include one or more cameras 721, a display 723, one or more radiation sources 725, and/or one or more radiation sensors 727, as described in more detail in reference to FIG. 1.
  • the components 720 are integrated into a single device.
  • the client computing device 130 can be a specialized computing device, configured to execute the client application 740 in coordination with the components 720.
  • the client computing device 130 is a general purpose mobile electronic device, such as a tablet or smartphone, storing the client application 740.
  • the client application 740 also includes an image capture/3D scanning engine 741 configured to capture and process digital images (e.g., color images, infrared images, depth images, etc.) obtained from one or more of the components 720 including but not limited to stereoscopic images, LiDAR data, or other forms of surface/depth sensing information. In some embodiments, such data are used to obtain a clean and precise 3D contour mapping of the target body surface (e.g., target surface 181 of FIG. 1). In some embodiments, the digital images or scans are processed by the client computing device 130 and/or transmitted to the remote computer system 160 for processing in a 3D model engine 781.
  • captured image data is used in position tracking engine 743 for determining the position of features, key-points, or edges on the target body surface.
  • the position tracking engine 743 tracks the contours of the target body surface in a 3D space, for example, by implementing v-SLAM techniques.
  • position information from the position tracking engine 743 is used to generate signals to be transmitted to the control engine 731, which are used to control one or more components 720 or elements of the computer system 710 including, for example, the sources 725 or the HMI 713, according to techniques described herein.
  • digital 3D models described herein are generated based on sensor data obtained by the client computing device 130.
  • the digital 3D models are generated by the client computing device 130 or some other computing device, such as a cloud computing system, or a combination thereof.
  • the digital 3D models include 3D topology and texture information, which can be used for reproducing an accurate representation of a body surface, such as facial structure and skin features, as described in more detail in reference to FIGs. 1-6.
  • the client application 740 includes a user interface 745.
  • the user interface 745 includes interactive functionality including but not limited to graphical guides or prompts, presented via the display to assist a user in selecting cosmetic designs, tutorial videos, or animations.
  • the user interface 745 provides guidance (e.g., visual guides such as arrows or targets, progress indicators, audio/haptic feedback, synthesized speech, etc.) to guide a user under particular lighting conditions, angles, etc., in order to ensure that sufficient data is collected for use by mapping and projection engines.
  • the client application 740 can include a mapping module 747.
  • the mapping module 747 can be or include computer-readable instructions (e.g., software, drivers, etc.) for translating a numerical representation of a cosmetic design into an augmented reality cosmetic design filter.
  • the client application 740 can receive real-time data from the camera(s) 721 and sensors 727, which can be processed by the 3D scanning engine 741, the position tracking engine 743, and can be used to progressively update the mapping and the cosmetic design as it is developed during a sequence of applications to the target body surface.
  • mapping module 747 can respond to motion of the target body surface, thereby increasing the tolerance of the client computing device 130 for motion on the part of the user without loss of fidelity to the cosmetic design filter.
  • the computational resource demand for such real time scanning/tracking can be spread across multiple devices, such as the remote computer system 160, through parallelization or distribution routines.
  • a communication module 749 of the client application 740 can be used to prepare information for transmission to, or to receive and interpret information from, other devices or systems, such as the remote computer system 160 or the user device(s) 190, as described in more detail in reference to FIG. 1.
  • Such information can include captured digital images, scans, or video, personal care device settings, custom care routines, user preferences, user identifiers, device identifiers, or the like.
  • the client computing device 130 collects data describing execution of care routines, image data of body surfaces, or other data.
  • data is transmitted via the network interface 717 to the remote computer system 160 for further processing or storage (e.g., in a product data store 783 or user profile data store 785).
  • the client computing device 130 can be used by a consumer, personal care professional, or some other entity to interact with other components of the system 700, such as the remote computer system 160 or user device(s) 190.
  • the client computing device 130 is a mobile computing device such as a smartphone or a tablet computing device equipped with the components 720 and the client application 740 or provided with the components through electronic coupling with a peripheral device.
  • the remote computer system 160 includes one or more server computers that implement one or more of the illustrated components, e.g., in a cloud computing arrangement.
  • the remote computer system 160 includes a projection engine 787, the 3D model engine 781, the product data store 783, and the user profile data store 785.
  • the 3D model engine 781 uses image data (e.g., color image data, infrared image data) and depth data to generate a 3D model of the target body surface.
  • the image data is obtained from the client computing device 130, for example, from the camera(s) 721 or the sensor(s) 727 that are integrated with or otherwise electronically coupled with client computing device 130.
  • image data and depth data associated with a user is stored in the user profile data store 785.
  • user consent is obtained prior to storing any information that is private to a user or can be used to identify a user.
  • the mapping/projection engine 787 performs processing of data relating to a cosmetic routine, such as generating mappings of target surfaces using image/sensor data and/or generating a projection of the cosmetic routine as an augmented reality filter, which can then be transmitted to the user device(s) 190.
  • the projection engine 787 generates cosmetic design data using user information from the user profile data store 785, the product data store 783, the 3D model engine 781, or some other source or combination of sources.
  • the 3D model engine 781 can employ machine learning or artificial intelligence techniques (e.g., template matching, feature extraction and matching, classification, artificial neural networks, deep learning architectures, genetic algorithms, or the like). For example, to generate the cosmetic design in accordance with a surface mapping of a face, the 3D model engine 781 can analyze a facial mapping generated by the 3D model engine 781 to measure or map contours, wrinkles, skin texture, etc., of the target body surface. The 3D model engine 781 can receive data describing a cosmetic design based on an identifier code provided by the user through the user device(s) 190. In such a scenario, the 3D model engine 781 can use such information to generate a projection of the cosmetic design (e.g., cosmetic design 110 of FIG. 1) onto an image of the corresponding body surface of a user, i.e., the augmented reality cosmetic design filter.
  • the devices shown in FIG. 7 can communicate with each other via the network 170, which can include any suitable communication technology including but not limited to wired technologies such as DSL, Ethernet, fiber optic, USB, FireWire, Thunderbolt; wireless technologies such as WiFi, WiMAX, 3G, 4G, LTE, 5G, 10G, and Bluetooth; and private networks (e.g., an intranet) or public networks (e.g., the Internet).
  • functionality described as being implemented in multiple components can instead be consolidated into a single component, or functionality described as being implemented in a single component can be implemented in multiple illustrated components, or in other components that are not shown in FIGs. 1 or 7.
  • devices in FIGs. 1 and 7 that are illustrated as including particular components can instead include more components, fewer components, or different components without departing from the scope of described embodiments.
  • functionality that is described as being performed by a particular device or subcomponent can instead be performed by one or more other devices within a system.
  • the 3D model engine 781 can be implemented in client computing device 130 or in some other device or combination of devices.
  • system 700 allows some aspects of the process to be conducted independently by personal care devices or client computing devices, while moving other processing burdens to the remote computer system 160 (which can be a relatively high-powered and reliable computing system), thus improving performance and preserving battery life for functionality provided by personal care devices or client computing devices.
  • engine refers to logic embodied in hardware or software instructions written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™, and/or the like.
  • An engine can be compiled into executable programs or written in interpreted programming languages.
  • Software engines can be callable from other engines or from themselves.
  • the engines described herein refer to logical modules that can be merged with other engines or divided into sub-engines.
  • the engines can be stored in any type of computer-readable medium or computer storage device and be stored on and executed by one or more general purpose computers, thus creating a special purpose computer configured to provide the engine or the functionality thereof.
  • a “data store” as described herein can be any suitable device configured to store data for access by a computing device.
  • a data store is a highly reliable, high-speed relational database management system (DBMS) executing on one or more computing devices and accessible over a high-speed network.
  • Another example of a data store is a key-value store.
  • any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries can be used, and the data store can be accessible locally instead of over a network, or can be provided as a cloud-based service.
  • a data store can also include data stored in an organized manner on a computer-readable storage medium, as described further below.
  • FIG. 8 is a block diagram that illustrates aspects of an example computing device 800, in accordance with various embodiments. While multiple different types of computing devices are described in reference to the various embodiments, the example computing device 800 describes various elements that are common to many different types of computing devices. While FIG. 8 is described with reference to a computing device that is implemented as a device on a network, the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other devices that can be used to implement portions of embodiments of the present disclosure. Moreover, those of ordinary skill in the art and others will recognize that the computing device 800 can be any one of any number of currently available or yet to be developed devices.
  • the example computing device 800 includes at least one processor 802 and a system memory 804 connected by a communication bus 806.
  • the system memory 804 can be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or similar memory technology.
  • system memory 804 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 802.
  • the processor 802 can serve as a computational center of the computing device 800 by supporting the execution of instructions.
  • the computing device 800 can include a network interface 810 comprising one or more components for communicating with other devices over a network.
  • Embodiments of the present disclosure can access basic services that utilize the network interface 810 to perform communications using common network protocols.
  • the network interface 810 can also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, LTE, WiMAX, Bluetooth, Bluetooth low energy, and/or the like.
  • the network interface 810 illustrated in FIG. 8 can represent one or more wireless interfaces or physical communication interfaces described and illustrated above with respect to particular components of the system 100 of FIG. 1.
  • the computing device 800 also includes a storage medium 808.
  • the storage medium 808 depicted in FIG. 8 is represented with a dashed line to indicate that the storage medium 808 is optional.
  • the storage medium 808 can be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information including, but not limited to, a hard disk drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, and/or the like.
  • computer-readable medium includes volatile and non-volatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer readable instructions, data structures, program modules, or other data.
  • system memory 804 and storage medium 808 depicted in FIG. 8 are merely examples of computer-readable media.
  • FIG. 8 does not show some of the typical components of many computing devices.
  • the example computing device 800 can include input devices, such as a keyboard, keypad, mouse, microphone, touch input device, touch screen, and/or the like. Such input devices can be coupled to the example computing device 800 by wired or wireless connections including RF, infrared, serial, parallel, Bluetooth, Bluetooth low energy, USB, or other suitable connection protocols using wireless or physical connections.
  • the example computing device 800 can also include output devices such as a display, speakers, printer, etc. Since these devices are well known in the art, they are not illustrated or described further herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Architecture (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems, devices, and methods for generating, sharing, and presenting augmented reality cosmetic design filters. One or more non-transitory computer memory devices can store computer-readable instructions that, when executed by computing circuitry, cause the computing circuitry to perform operations for generating augmented reality cosmetic design filters. The operations can include defining a baseline description of a biological surface using one or more radiation sensors, the radiation sensors being electronically coupled with the computing circuitry. The operations can include identifying an application of a cosmetic formulation to the biological surface using one or more of the radiation sensors. The operations can include generating a trace describing the application in reference to the baseline description. The operations can also include outputting the trace.

Description

AUGMENTED REALITY COSMETIC DESIGN FILTERS
CROSS-REFERENCE(S) TO RELATED APPLICATION(S)
This application claims the benefit of U.S. Patent Application No. 17/491,061, filed on September 30, 2021, and French Patent Application No. FR 2113547, filed on December 15, 2021; the contents of which are hereby incorporated by reference in their entirety.
SUMMARY
Systems, devices, and methods for generating, sharing, and presenting augmented reality cosmetic design filters. One or more non-transitory computer memory devices can store computer-readable instructions that, when executed by computing circuitry, cause the computing circuitry to perform operations for generating augmented reality cosmetic design filters. The operations can include defining a baseline description of a biological surface using one or more radiation sensors, the radiation sensors being electronically coupled with the computing circuitry. The operations can include identifying an application of a cosmetic formulation to the biological surface using one or more of the radiation sensors. The operations can include generating a trace describing the application in reference to the baseline description. The operations can also include outputting the trace.
In some embodiments, identifying the application of the cosmetic formulation includes detecting the cosmetic formulation relative to the biological surface using one or more of the radiation sensors and the baseline description. Generating the trace can include receiving information describing a plurality of shape types, attributing a first shape type of the shape types to at least a portion of the biological surface using the baseline description, generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description, and transforming the numerical representation from the first shape type to a second shape type of the shape types. The biological surface can be a face. The first shape type and the second shape type can describe a facial feature. Defining the baseline description of the biological surface can include projecting invisible electromagnetic radiation onto the biological surface using a radiation source electronically coupled with the computing circuitry. The one or more radiation sensors can include a camera configured to detect electromagnetic radiation from ultraviolet, visible, or infrared spectral ranges, or a combination thereof.
In some embodiments, generating the trace includes tracking a motion of the application relative to the biological surface and generating a numerical representation of the motion relative to the baseline description. The operations can further include identifying the cosmetic formulation and defining a color of the cosmetic formulation using identifier information of the cosmetic formulation. The operations can further include estimating a surface tone of the biological surface, estimating a formulation tone of the cosmetic formulation, and determining a color of the cosmetic formulation using the surface tone and the formulation tone.
In some embodiments, the one or more non-transitory computer memory devices are electronically coupled with a smart phone. Outputting the trace can include communicating the trace to a remote computing system.
A method for generating an augmented reality cosmetic design filter can include defining a baseline description of the biological surface using one or more radiation sources and one or more radiation sensors. The method can include identifying an application of a cosmetic formulation to the biological surface using the radiation sensors. The method can include generating a trace describing the application in reference to the baseline description. The method can also include outputting the trace.
In some embodiments, generating the trace includes receiving shape information describing a plurality of shape types, attributing a first shape type of the shape types to at least a portion of the biological surface using the shape information, generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description, and transforming the numerical representation from the first shape type to a second shape type of the shape types. In some embodiments, identifying the application of the cosmetic formulation includes detecting an applicator of the cosmetic formulation, estimating a position of the applicator of the cosmetic formulation relative to the biological surface, tracking a motion of the applicator relative to the biological surface, and generating a numerical representation of the motion relative to the baseline description. The method can further include estimating a surface tone of the biological surface, estimating a formulation tone of the cosmetic formulation, and determining a color of the cosmetic formulation using the surface tone and the formulation tone. This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
DESCRIPTION OF THE DRAWINGS
FIGURE 1 is a schematic diagram illustrating an embodiment of a system for extrapolating a cosmetic design, in accordance with various embodiments.
FIGURE 2 is a schematic diagram illustrating an example technique for preparing an augmented reality cosmetic design filter, in accordance with various embodiments.
FIGURE 3 is a schematic diagram illustrating an example technique for deploying an augmented reality cosmetic design filter, in accordance with various embodiments.
FIGURE 4 is a schematic diagram illustrating an example technique for transforming an augmented reality cosmetic design filter from one shape type to another shape type, in accordance with various embodiments.
FIGURE 5 is a schematic diagram illustrating an example technique for generating an augmented reality cosmetic design filter in multiple layers, in accordance with various embodiments.
FIGURE 6 is a schematic diagram illustrating an example technique for determining a formulation color using surface tones and formulation tones, in accordance with various embodiments.
FIGURE 7 is a block diagram that illustrates an example system, including components of the system of FIG. 1, in accordance with various embodiments.
FIGURE 8 is a block diagram that illustrates aspects of an example computing device, in accordance with various embodiments.
In the above-referenced drawings, like reference numerals refer to like parts throughout the various views unless otherwise specified. Not all instances of an element are necessarily labeled to simplify the drawings where appropriate. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles being described. The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings. DETAILED DESCRIPTION
Application of cosmetics and makeup in patterns and shapes can be difficult by hand. For example, intricate designs and theatrical makeup are typically applied by certified makeup professionals. Additionally, self-application can be a challenge generally for those with limited mobility or vision. Even relatively simple cosmetic designs can include layering of multiple cosmetic products, blending, and application of cosmetics that are specific to a shape type of a face or facial feature. For example, makeup or other cosmetics are typically applied to an oval face in a different configuration than an application to a rectangular or oblong face. In this way, simply mapping a static texture object, such as a texture mask, onto a surface fails to reproduce the aesthetic effect of a cosmetic design as it would be applied to the surface. Additionally, the precise order and extent of operations making up a cosmetic routine typically cannot be determined from an image of the finished design. To that end, makeup tutorials are typically provided as videos or step-by-step guides with images showing the design at multiple intermediate states, leading to the final effect. Even then, face shape-specific tutorials and skin tone specific tutorials are more effective than general tutorials at demonstrating a particular cosmetic design. Such approaches, however, can be time consuming to view and difficult to reproduce without proper tools and experience with each of the specific techniques and cosmetic formulations used.
Techniques are described for generating an augmented reality cosmetic design filter from a cosmetic routine applied to a target body surface of a biological subject, such as a subject’s face or other region of interest. Described embodiments employ image sensors to define one or more 3D contour mappings of the target body surface. In the context of such applications, described embodiments provide improved precision and greater ease of use over complex manual routines. The techniques, therefore, improve the manual application of cosmetics through distribution of dynamic augmented reality cosmetic design filters that can be automatically (e.g., without human intervention) adapted for shape type and skin tone. The filters can reproduce step-by-step cosmetic routines, including augmented reality renderings of specific tools, application traces, and finished effects in a manner that is sensitive to shape type and skin tone of the end user of the augmented reality filters. In an illustrative example, a user of a smartphone creates a video of their makeup routine that produces a particular aesthetic effect. As part of making the video, the smartphone captures images and a surface mapping of the user’s face. Initial images are used to generate a baseline with which subsequent applications of cosmetics, such as foundation, highlight, filler, eyeshadow, blush, mascara, or the like, can be detected. While the video is being made, the smartphone iteratively redefines the baseline as part of defining each new application as a new layer of a cosmetic design and recognizes new applications by detecting motion of the user’s hand, identifying a new tool or formulation, and/or detecting a color shift on the user’s face in reference to the updated baseline. Over the course of the routine, a multi-layer dynamic filter is generated including motion, texture, and/or transparency information. The dynamic filter is referenced to a face shape of the creator of the makeup video and is stored on a distributed storage network. The filter is available to different users for virtual projection through an augmented reality interface. The filter can be used to reproduce layer effects, to realistically map a cosmetic design onto a different face shape, and to guide a viewer through a cosmetic routine in a way that is specific to the viewer’s face shape and skin tone, as described in the paragraphs, below.
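For illustration only, the following minimal Python sketch shows one way the iterative baseline redefinition described above could be organized, in which each detected application is recorded as a new layer and then folded back into the baseline so that the next application is detected against the updated surface. The array layout, threshold value, and function names are assumptions made for this sketch and are not prescribed by this disclosure.

```python
# Minimal sketch of the layer-wise capture loop described above.
# Assumes frames arrive as H x W x 3 float arrays in [0, 1]; all names
# and thresholds here are illustrative, not part of the disclosure.
import numpy as np

SHIFT_THRESHOLD = 0.08  # assumed per-pixel color-difference threshold


def detect_color_shift(frame: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels that differ noticeably from the baseline."""
    diff = np.linalg.norm(frame - baseline, axis=-1)
    return diff > SHIFT_THRESHOLD


def capture_layers(frames: list[np.ndarray]) -> list[dict]:
    """Accumulate one layer per detected application, redefining the baseline each time."""
    baseline = frames[0].copy()
    layers = []
    for frame in frames[1:]:
        mask = detect_color_shift(frame, baseline)
        if mask.any():
            layers.append({
                "mask": mask,                       # spatial extent of the application
                "color": frame[mask].mean(axis=0),  # average applied color
            })
            # Fold the new application into the baseline so the next
            # application is detected relative to the updated surface.
            baseline[mask] = frame[mask]
    return layers
```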
FIG. 1 is a schematic diagram illustrating an embodiment of an example system 100 for generating an augmented reality cosmetic design filter 110, in accordance with various embodiments. The example system 100 includes one or more client computing devices 130, a camera 140, one or more remote computer systems 160, also referred to as server(s), and one or more user devices 190. As part of the example system 100, the constituent system components can be operably coupled through wireless communication and/or through wired communication. In some embodiments, constituent components can communicate directly through wireless pairing (e.g., Bluetooth) and/or via a local wireless network (e.g., through a wireless router). In some embodiments, constituent components can communicate over one or more networks 170, which can be or include a public network (e.g., the internet) or a private network (e.g., an intranet). In this way, the example system 100 can include multiple distinct devices configured to communicate electronic information through wireless connections. Additionally or alternatively, some of the constituent components can be integrated into a single device.
For clarity of explanation, the augmented reality cosmetic design filter 110 is also referred to as the cosmetic design 110. The cosmetic design 110 is a numerical representation of a cosmetic routine including a set of textures, mapping data, surface information, and metainformation that is stored in memory of the client computing device(s) 130, the server(s) 160 and/or the user device(s) 190. Rather than a single texture reproducing a final result of a makeup routine on a particular face that is projected onto a flat plane and used subsequently as a texture mask, the cosmetic design 110 includes one or more traces 115 that are referenced to contour maps of relevant facial features 120 and/or to baseline features 125 of the target region. The cosmetic design 110 can also include data describing dynamic elements 117 of the traces 115. In this way, the cosmetic design 110 can be deployed to a user device 190 as an animated augmented reality filter reproducing, step-by-step, the layer-wise application of the cosmetic design 110.
As an illustrative example, the client computing device 130 can incorporate the camera 140. Similarly, the example system 100 can include multiple client computing devices 130, where a first client computing device 130 is a mobile electronic device (e.g., a tablet, smartphone, or laptop) that is configured to host user interface elements and to connect to the server(s) 160 over the network(s) 170, while a second client computing device 130 is integrated with the camera 140 and is configured to coordinate operation of the camera 140 with the first client computing device 130. The constituent components of the example system 100 can be provided with computer-executable instructions (e.g., software, firmware, etc.) to implement and coordinate the operation of one or more features of the example system 100. In this way, the operation of the example system 100 can be coordinated via a user interface, accessed via one or more of the constituent components.
The client computing device(s) 130 can be or include a purpose-built mobile computing device including the camera 140, one or more electromagnetic (EM) radiation sources (illustrated as part of the camera 140), and one or more user interface elements 131 to prompt a biological surface 180 with visual and/or auditory prompts. For example, the interface elements 131 can be or include a display 133 to generate a visual representation of the cosmetic design 110. The interface elements 131 can also include user input components including, but not limited to, touch screen, keyboard, trackpad, or mouse. In this example, the components of the client computing device 130 can be integrated into a housing resembling a consumer cosmetic product such as, for example, a bathroom or vanity mirror. In this example, the housing can conceal power sources, heat management systems, and other components.
While the client computing device 130 and camera 140 are illustrated in a particular configuration, additional and/or alternative form factors are contemplated. For example, the system 100 can include a smartphone or tablet computer in communication with the client computing device 130, such that one or more computer-executable operations are undertaken by the smartphone or tablet computer. In this way, image generation operations can be executed by a smartphone incorporating the camera 140 and image processing operations can be executed at least in part on a separate device, such as the server(s) 160.
In some embodiments, the camera 140 can be or include multiple sensors and/or sources including, but not limited to, a visible light image sensor 141, a depth sensor 143 and/or a source of invisible EM radiation 145, including but not limited to infrared or near-infrared radiation. As with the client computing device(s) 130, the camera 140 can include communication circuitry 147 to enable wireless communication and/or pairing with the other constituent components of the example system 100. While the camera 140 is illustrated as a separate component of the example system 100, the camera 140 can also be integrated into one of the other constituent components of the example system 100. For example, the client computing device 130 can incorporate the camera 140 (e.g., as a smartphone or tablet).
The depth sensor 143 can capture one or more images of the biological surface 180, including, but not limited to, images of a target body surface 181 of the biological surface 180. In the illustration provided in FIG. 1, the biological surface 180 is a human using the example system 100 to create the cosmetic design 110 and the target body surface 181 is the face of the human in the region around the eye and eyebrow. The depth sensor 143 can generate a surface mapping 183 of the target body surface 181. Contours and depth information for the target body surface 181 can vary over time or between users, and the camera can generate the surface mapping 183 as part of operations for extrapolating, modifying and/or generating the cosmetic design 110 by the client computing device(s) 130.
The depth sensor 143 can be an image sensor and can capture images within a field of view 185 including the target body surface 181. The depth sensor 143 can be or include, but is not limited to, a laser-based sensor (e.g., LiDAR), a time-of-flight camera, a structured light generator, a visual simultaneous localization and mapping (vSLAM) sensor assembly including motion sensors and visible image cameras, or an ultrasound-based sensor assembly, such that the camera 140 can generate the surface mapping 183.
For example, where the depth sensor 143 is an infrared depth sensing camera, the source of invisible EM radiation 145 can be or include an infrared source that exposes the biological surface 180 including the target body surface 181 to structured invisible infrared radiation. In another example, where the depth sensor is a LiDAR system or a time-of-flight camera, the source of invisible EM radiation 145 can be or include an infrared diode laser. In this way, the EM radiation generated by the source of invisible EM radiation 145 can be scanned or otherwise directed toward the target body surface 181 over an angular spread 187, such that the target body surface 181 is exposed. In some embodiments, detection of the target body surface 181 is facilitated and/or enabled by feature and/or edge detection applied to visible spectrum (e.g., RGB) images captured by the visible light sensor 141 (e.g., by vSLAM techniques).
The surface mapping 183 can provide contour information and/or position information for features in the target body surface 181, for example, precise information about the relative position of the eyebrow ridge and the bridge of the nose, where the eyebrow begins and ends relative to the eye, etc. In this way, the surface mapping 183 can be used to generate, and/or extrapolate the cosmetic design 110 for deploying to a social media platform or other service to distribute the cosmetic design 110 to users via user device(s) 190. In this way, the cosmetic design 110 can be represented as a data file that is communicated to the server(s) 160, for example, as part of an online platform and/or database of cosmetic designs.
The surface mapping 183 can be used to modify the cosmetic design 110 by determining a shape type of a feature of the biological surface 180 and transforming the cosmetic design 110 to another, different shape type. In some embodiments, multiple layers are defined as part of generating or modifying the cosmetic design 110. The example system 100 can implement the example techniques described in reference to FIGs. 2-6 to detect an application of multiple cosmetic formulations to the target body surface 181 and can generate one or more traces 115 describing the applications spatially and temporally that together define the cosmetic design 110.
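As a non-limiting illustration of transforming a design between shape types, the sketch below re-expresses trace coordinates captured on one face shape onto another using a least-squares affine fit between corresponding facial landmarks (e.g., eye corners, brow peak). The landmark correspondence and the affine model are assumptions made for this sketch, not the required transformation.

```python
# Hedged sketch of re-mapping trace coordinates from one shape type to
# another using corresponding facial landmarks. The landmark choice and
# least-squares affine fit are assumptions for illustration only.
import numpy as np


def fit_affine(src_pts: np.ndarray, dst_pts: np.ndarray) -> np.ndarray:
    """Least-squares 2D affine (2x3) mapping src landmarks onto dst landmarks."""
    ones = np.ones((src_pts.shape[0], 1))
    A = np.hstack([src_pts, ones])  # N x 3 design matrix
    # Solve A @ M.T ~= dst for M (2 x 3), one column of unknowns per output axis.
    M_t, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)
    return M_t.T


def transform_trace(trace_xy: np.ndarray, src_landmarks: np.ndarray,
                    dst_landmarks: np.ndarray) -> np.ndarray:
    """Re-express trace points captured on the creator's face shape on the user's."""
    M = fit_affine(src_landmarks, dst_landmarks)
    ones = np.ones((trace_xy.shape[0], 1))
    return np.hstack([trace_xy, ones]) @ M.T
```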
Advantageously, the cosmetic design 110 can be distributed to the user device(s) 190 as an augmented reality filter, for example, through integration with social media, with improved accuracy and precision by mapping both the target body surface 181 and a corresponding region of a user of the user device 190, and by extrapolating the cosmetic design 110, or each constituent layer of the cosmetic design 110, to fit the corresponding region. Additionally, the cosmetic design 110 can be presented dynamically, in the form of a tutorial visualization, whereby both layers and traces are animated to demonstrate layers, motions, colors, and other cosmetic aspects of the cosmetic design 110. In this way, the cosmetic design 110 represents an improvement to user experience by adapting the cosmetic design 110 from the creator to the user, and also represents an improvement to system performance, as the complete cosmetic design 110 can be transferred as mask data rather than as image data. As such, data volumes and computational resources involved in distributing the cosmetic design 110 over the network(s) 170 and storing on the server(s) 160, are reduced.
FIG. 2 is a schematic diagram illustrating an example process 200 for preparing the augmented reality cosmetic design filter 110, in accordance with various embodiments. The example process 200 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1. In this way, the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system. While the operations are illustrated in a particular order, the example process 200 can include more or fewer operations, and the order of operations can vary. In some embodiments, some operations are performed by different physical computers, as described in more detail in reference to FIG. 1. For example, some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
At operation 201, the computer system defines a baseline description 210 of the biological surface 180. In some embodiments, the baseline description 210 is defined as part of generating the surface mapping 183, for example, by referencing visible color information to surface contour mapping information describing the biological surface 180. In this way, the baseline description 210 can include numerical representations (e.g., digital data) of the biological surface 180 including both the surface mapping 183 and color information. The example process 200 is illustrated in the context of the target body surface 181 of FIG. 1, but it is understood that the biological surface 180 can be or include a face, a hand, a portion of a face, or other surface to which a cosmetic formulation is applied.
In some embodiments, defining the baseline description 210 includes generating one or more images of the biological surface 180, using the camera 140 or other radiation sensors that are configured to detect visible light or invisible EM radiation. Invisible EM radiation can be projected onto the biological surface 180 using a radiation source of the camera 140, such as a structured light generator, LiDAR source, or ToF camera source. For example, the images can describe facial features 120 of the biological surface 180 as well as baseline features 125. Baseline features 125 can be or include materials applied to the biological surface 180 prior to example process 200, such that at least a portion of the biological surface 180 includes a pigment, cosmetic formulation, or other feature that is detectable by the camera 140. For example, baseline features 125 are illustrated as eyelashes and eyeliner. Similarly, baseline features 125 can include, but are not limited to, blemishes, wrinkles, scars, or other aesthetic aspects of the biological surface 180 other than facial features 120. Facial features 120, by contrast, refer to organs, hair, and other topographical features of the biological surface 180 that define the surface mapping 183 and influence the baseline description 210 through physical effects including, but not limited to, shadows, perspective, or the like.
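The following non-limiting sketch illustrates one possible numerical representation of the baseline description 210, pairing the surface mapping with per-pixel color and a small set of named landmarks. The field names and layout are assumptions for illustration and do not describe the disclosed data format.

```python
# Minimal data-structure sketch of a baseline description that pairs the
# surface mapping with per-pixel color. Field names are illustrative.
from dataclasses import dataclass, field
import numpy as np


@dataclass
class BaselineDescription:
    depth_map: np.ndarray    # H x W depth values from the depth sensor
    color_image: np.ndarray  # H x W x 3 visible-light image
    landmarks: dict[str, tuple[float, float]] = field(default_factory=dict)
    # e.g., {"brow_inner": (412.0, 233.5), "eye_outer": (530.2, 301.0)}

    def sample(self, y: int, x: int) -> tuple[float, np.ndarray]:
        """Return (depth, color) at a pixel, referencing color to contour data."""
        return float(self.depth_map[y, x]), self.color_image[y, x]
```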
At operation 203, the computer system identifies one or more applications 215 of cosmetic formulation(s) 220 to the biological surface 180. The cosmetic formulation(s) 220 can include, but are not limited to, makeup, mascara, lipstick, lipliner, eyeliner, glitter, aesthetic creams, ointments, or other materials that can be applied topically to the biological surface 180. The application(s) 215 of the cosmetic formulation(s) 220 can be detected by the camera 140 using radiation sensors, such as visible light sensors, invisible EM radiation sensors, or the like. In this way, the application(s) 215 can be detected as a shift in the biological surface relative to the baseline description, for example, in terms of coloration, surface reflectivity, or absorbance/reflectance of invisible EM radiation. In an illustrative example, a glossy lip coating can be detected through an increase in specular reflection of the biological surface 180 relative to the baseline description that includes negligible specular reflection. Similarly, a matte foundation can be detected through a decrease in specular reflection and an increase in diffuse reflection relative to the baseline description. In a non-limiting example, specular reflection can be detected by defining a luminance channel in visible light images and thresholding the luminance channel, where specular reflection is defined as a luminance value exceeding the threshold, on a pixel-wise basis.
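As a non-limiting illustration of the luminance-thresholding example above, the sketch below computes a luminance channel from an RGB image and flags pixels exceeding a threshold as specular, then compares specular area against the baseline. The Rec. 709 luma weights and the threshold value are assumptions for this sketch.

```python
# Pixel-wise sketch of the luminance-threshold test for specular
# reflection described above. Weights and threshold are assumed values.
import numpy as np

LUMA_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights
SPECULAR_THRESHOLD = 0.9                           # assumed, on a 0-1 scale


def specular_mask(rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels whose luminance exceeds the specular threshold."""
    luminance = rgb @ LUMA_WEIGHTS  # H x W luminance channel
    return luminance > SPECULAR_THRESHOLD


def detect_gloss_application(rgb: np.ndarray, baseline_rgb: np.ndarray) -> bool:
    """Flag a glossy application as an increase in specular area vs. the baseline."""
    return specular_mask(rgb).mean() > specular_mask(baseline_rgb).mean()
```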
In some embodiments, the computer system detects an applicator 225, such as a brush, sponge, pencil, finger, or other applicator 225 in proximity to the biological surface 180. In some embodiments, the computer system recognizes an identifier of the applicator 225, for example, where the applicator 225 includes a registration mark such as a barcode, QR code, RFID tag, or other representation of the identifier information. In this way, the computer system can identify the application 215 spatially and temporally and can identify the cosmetic formulation(s) 220. To that end, operation 203 can optionally include identifying the cosmetic formulation 220 and defining a color of the cosmetic formulation 220 using the identifier information of the cosmetic formulation 220. For example, while the color of an unidentified cosmetic formulation 220 can be estimated using the camera 140, as described in more detail in reference to FIG. 6, identifying the applicator 225 can permit the computer system to directly determine the color of the cosmetic formulation 220 where the applicator 225 is loaded with a particular cosmetic formulation 220, rather than being a multi-purpose applicator 225. Examples of applicators 225 for which the cosmetic formulation 220 can be identified in such a manner include, but are not limited to, pencils, lipsticks, gel applicators, or other applicators 225 that are available with a prepared cosmetic formulation 220. The application 215 of cosmetic formulation 220 can include a motion 230 of the applicator 225 relative to the biological surface 180. The motion 230 can include meaningful information with respect to the aesthetic effect of the overall cosmetic design 110, such as in blending and smudging motions. As such, motions 230 of applicators 225 can be detected and used during operations for generating traces.
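The following hypothetical sketch shows how an already-decoded applicator identifier could be resolved to a pre-loaded formulation color. The identifier strings and product table are invented for illustration, and decoding of the registration mark itself (barcode, QR code, or RFID read) is assumed to have occurred upstream.

```python
# Hedged sketch of resolving a formulation color from an applicator's
# decoded identifier. The product table and identifiers are hypothetical.
FORMULATION_TABLE = {
    "LIP-ROUGE-042": {"name": "matte lip pencil", "rgb": (176, 48, 62)},
    "EYE-SHDW-007":  {"name": "eyeshadow gel",    "rgb": (94, 72, 133)},
}


def formulation_color(applicator_id: str) -> tuple[int, int, int] | None:
    """Return the pre-loaded formulation color, or None for multi-purpose applicators."""
    entry = FORMULATION_TABLE.get(applicator_id)
    return entry["rgb"] if entry else None
```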
Applications 215 can also include modifications or manipulations of formulations 220 already disposed on the biological surface 180. For example, the extent, coloration, and reflectance of a cosmetic formulation 220 can be modified by smoothing, smudging, blending, spreading, or other techniques. Such applications 215, not adding a new formulation 220 to the biological surface 180, but rather modifying an already disposed formulation 220, can be identified by detecting a tool or a finger, and detecting a change in the biological surface 180 relative to the baseline description 210. As described in reference to FIG. 1, the baseline description 210 can be relative to each respective application 215, where applications 215 are layered and/or blended. For example, the baseline description 210 for a particular region of the biological surface 180, to which a formulation 220 is already applied, can be redefined to include the formulation 220. In this way, subsequent applications 215 can be more precisely and accurately detected.
At operation 205, the computer system generates one or more traces 115 describing the application 215 identified at operation 203. Generating the traces 115 can include determining an extent 217, layer sequence (see FIG. 5), and coloration (see FIG. 6) of the applications 215, such that the cosmetic design 110 can realistically reproduce the aesthetic effect of the applications 215 and can also reproduce the sequence of the applications 215. The extent 217, coloration, and layer sequence information, as well as other information relevant to realistically reproducing the traces 115 on user device(s) 190, can be encoded as numerical representations, such as intensity datasets referenced to the baseline description 210. For example, the extent 217 of the applications 215, illustrated as dashed lines in FIG. 2, can be detected as a shift in coloration, a shift in absorbance and/or reflectivity of EM radiation relative to the baseline description. In this way, the traces 115 can reproduce the aesthetic effect of individual applications 215.
In some embodiments, the computer system tracks the motion 230 of the applicator using images and/or contour data generated by the camera 140. Generating the traces 115, therefore, can include generating numerical representations of the various motions 230 relative to the biological surface 180, as described in the baseline description, as well as the color information describing the cosmetic formulations 220 used in the applications 215. The traces 115, once generated, can be transformed to reproduce the aesthetic effect of the cosmetic design 110 on differently shaped biological surfaces 180, as viewed through user device(s) 190, described in more detail in reference to FIG. 3.
Traces 115 can encode layer-blending and layering information, such that the combination of multiple traces 115 can reproduce interactions between traces 115. For example, a set of applications 215 of multiple eyeshadow colors to a region of the biological surface 180 between the eye and eyebrow can be blended to impart a color gradient in an eyeshadow region. In this example, the example process 200 includes multiple iterations of operations 201-205, such that the region between the eyebrow and the eye is described by multiple traces 115 and also by interaction terms that describe how the multiple traces relate. In some embodiments, traces 115 include transparency information, such as alpha channel information, referenced to dynamic elements 117 that describe localized transparency of coloration or other aesthetic effects of the applications 215. In this way, traces 115 can reproduce manipulation of the cosmetic formulations 220, such as blending, smudging, smoothing, or other techniques that are typical of cosmetic designs. In an illustrative example, traces 115 and/or dynamic elements 117 can include a vector of transparency values referenced to a motion 230 and a formulation 220, such that the trace 115 reproduces a smoothing or blending of the formulation 220 based on the motion 230.
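A non-limiting sketch of one possible trace data structure follows, combining an extent mask, a formulation color, a motion path, and a transparency vector referenced to that motion. The field names and the simple alpha compositing shown are illustrative assumptions rather than the disclosed encoding.

```python
# Sketch of one possible numerical representation of a trace. Assumes
# float RGB canvases in [0, 1]; field names and compositing are assumed.
from dataclasses import dataclass
import numpy as np


@dataclass
class Trace:
    extent_mask: np.ndarray  # H x W boolean extent, referenced to the baseline
    color: np.ndarray        # RGB of the formulation as applied
    motion_path: np.ndarray  # K x 2 applicator positions over time
    alpha: np.ndarray        # K transparency values, one per motion sample
    layer_index: int         # position in the layer sequence

    def blended_frame(self, step: int, canvas: np.ndarray) -> np.ndarray:
        """Composite the trace onto a float canvas up to a given motion step."""
        a = float(self.alpha[:step + 1].mean()) if step >= 0 else 0.0
        out = canvas.copy()
        out[self.extent_mask] = (1.0 - a) * canvas[self.extent_mask] + a * self.color
        return out
```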
At operation 207, the computer system outputs the trace(s) 115 detected as part of operations 201-205. Outputting the trace(s) 115 can include electronic operations including, but not limited to, storing the trace(s) 115 in a local data store of the client computing device 130, transferring the trace(s) 115 to server(s) 160 via the network 170, and/or sharing the trace(s) 115 with the user device(s) 190 directly (e.g., through electronic pairing) or via the network 170.
While the description of the example process 200 has focused on cosmetic designs for eyebrow/eye regions, the operations can similarly be applied to other surfaces. For example, the cosmetic design 110 can describe applications 215 of cosmetic formulations 220 to additional/alternative surfaces including, but not limited to, lips, nose, cheeks, forehead, or hands. Similarly, cosmetic designs 110 can modify the appearance of facial features 120, including but not limited to eyebrows, eyes, lips, cheekbones, jawline, or hands. In another illustrative example, a cosmetic design 110 can include a sequence of traces 115 for emphasizing the appearance of cheekbones through application of one or more cosmetic formulations 220.
FIG. 3 is a schematic diagram illustrating an example process 300 for deploying the augmented reality cosmetic design filter 110, in accordance with various embodiments. As with example process 200 of FIG. 2, the example process 300 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1. In this way, the operations can be or include operations performed by one or more processors of a computer system, such as user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system. While the operations are illustrated in a particular order, the example process 300 can include more or fewer operations, and the order of operations can vary. In some embodiments, some operations are performed by different physical computers, as described in more detail in reference to FIG. 1. For example, some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components, such as user device(s) 190 and server(s) 160.
At operation 301, a user device 190 receives data 310 describing the cosmetic design 110. Receiving the data 310 can include electronic communication over the network 170 (e.g., the internet, a cellular network, or the like), but can also include direct communication through pairing or other approaches. In FIG. 3, the user device 190 is illustrated receiving the data 310 from server(s) 160, but it is understood that the data 310 can be received from the client computing device(s) 130. The data 310 can include traces 115 generated during one or more iterations of the example process 200, as described in more detail in reference to FIG. 2, and can also include metainformation describing the biological surface 180, such as a shape-type of the biological surface 180, as described in more detail in reference to FIG. 4.
Operations for receiving the data 310 can include one or more data transfer techniques including, but not limited to, wireless communication or wired communication. For example, the user device 190 can communicate wirelessly with the client computing device 130 to receive the data 310 as a wireless transmission.
In some embodiments, receiving the data 310 can include accessing a browser environment, such as a recommendation engine, whereby a user of the user device 190 can view a set of cosmetic designs 110 generated by one or more creators, rather than directly communicating with the creators to request the design. Additionally or alternatively, the data 310 can be pushed to the user device 190, for example, as part of a social media service to which the user of the user device 190 can register, follow, and/or subscribe. In some embodiments, the data 310 is recommended based at least in part on identifier information describing the user of the user device 190. Where the user of the user device 190, through historical traffic data, demonstrates an aesthetic preference, cosmetic designs 110 reflecting that preference can be preferentially recommended. Demographic, socio-economic, or biometric data can also be used to recommend cosmetic designs 110. For example, where the user of the user device 190 resides in a geographic region, cosmetic designs 110 reflecting the trends for the geographic region can be recommended. Similarly, where the user of the user device 190 is within a given age range, is employed in a given field or sector, or is a part of a given social network, cosmetic designs reflecting trends corresponding to each respective category can be identified and recommended.
At operation 303, the user device 190 maps a user surface 315 corresponding to the biological surface 180, to which the cosmetic design 110 will be virtually applied. Mapping the user surface 315 can include operations similar to those described in reference to generating the surface mapping 183 of FIG. 1, including but not limited to structured light methods, LiDAR or ToF methods, vSLAM methods, ultrasonic methods, or the like. The surface 315 can be described by a mesh of polygons 317 that can each define multiple properties for the corresponding region of the surface 315 including, but not limited to, tone, reflectance, temperature, or the like. Where the user device 190 is a smartphone incorporating depth sensors 319, operation 303 can include executing one or more applications stored in memory of the user device 190 to map the surface using the depth sensors. In this way, the user surface 315 can be represented as a numerical representation in data, describing surface contours and facial features 320 of the user surface 315. In some embodiments, the example process 300 includes integrating the cosmetic design 110 with an existing aesthetic look of the user surface 315. To that end, aesthetic features 321 such as makeup, mascara, eyelashes, or the like, are included during surface mapping.
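By way of a non-limiting illustration, the following Python sketch shows one way the mapped user surface 315 could be represented in data as a mesh of polygons 317, each carrying per-region properties such as tone and reflectance. The class names, field layout, and example values are hypothetical and are not prescribed by this disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional
import numpy as np

@dataclass
class SurfacePolygon:
    vertices: np.ndarray                 # (3, 3) array of x, y, z vertex coordinates
    tone: tuple = (0.0, 0.0, 0.0)        # average RGB tone of the underlying skin
    reflectance: float = 0.0             # diffuse reflectance estimate, 0..1
    temperature: Optional[float] = None  # optional per-region temperature

@dataclass
class UserSurfaceMap:
    polygons: List[SurfacePolygon] = field(default_factory=list)
    landmarks: Dict[str, np.ndarray] = field(default_factory=dict)  # named facial features

    def centroid(self, index: int) -> np.ndarray:
        """Centroid of one polygon, useful for anchoring filter elements."""
        return self.polygons[index].vertices.mean(axis=0)

# Example: a single triangular patch near a cheekbone.
patch = SurfacePolygon(
    vertices=np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1], [0.5, 1.0, 0.2]]),
    tone=(0.82, 0.64, 0.55),
    reflectance=0.35,
)
surface = UserSurfaceMap(polygons=[patch],
                         landmarks={"cheekbone_left": np.array([0.5, 0.4, 0.1])})
print(surface.centroid(0))
```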
Mapping the user surface 315 can also include estimating a surface tone, which can be a local surface tone, as described in more detail in reference to FIG. 6. In this way, the example process 300 can account for a difference in skin tones between the biological surface 180 and the user surface 315. For example, metadata provided as part of the data 310, or alternatively as stored in memory of the server(s) 160 and/or the user device 190, can include reference or lookup tables cross-referencing colors to skin tones, such that the cosmetic design 110 can be adapted to reproduce the aesthetic effect of the cosmetic design 110, rather than the literal transposition of exact colors.
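As a hedged illustration of the skin-tone adaptation described above, the following sketch adapts a creator's formulation color using a lookup table that cross-references tone categories to per-channel adjustments. The table contents, category names, and adjustment values are assumptions made purely for illustration and are not values taken from this disclosure.

```python
# Hypothetical lookup table: tone category -> per-channel gain applied to the
# creator's formulation color so the aesthetic effect, not the literal color,
# is reproduced on the user surface.
SHADE_LOOKUP = {
    "light":  (1.00, 1.00, 1.00),
    "medium": (0.95, 0.90, 0.88),
    "deep":   (0.88, 0.80, 0.78),
}

def adapt_color(creator_rgb: tuple, user_tone: str) -> tuple:
    """Scale each channel by the gain associated with the user's tone category."""
    gains = SHADE_LOOKUP.get(user_tone, (1.0, 1.0, 1.0))
    return tuple(min(1.0, c * g) for c, g in zip(creator_rgb, gains))

print(adapt_color((0.86, 0.45, 0.48), "deep"))
```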
At operation 305, the traces 115 received as part of the data 310 are used to generate filters 325, referenced to the mapping of the user surface 315 generated at operation 303. The filters 325 can be or include numerical representations, such as texture objects, that are formatted to be combined with the mapping of the user surface 315. In an illustrative example, the user surface 315 is described by a mesh of polygons 317 and a filter 325 is referenced to the polygons 317, such that each polygon 317 defines a vector of values describing the appearance of the filter 325.
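The following is a minimal sketch of one way a filter 325 could be referenced to the polygon mesh, with each polygon index mapping to a vector of appearance values (here, RGBA plus a layer order). The function name and the exact vector layout are illustrative assumptions rather than a prescribed format.

```python
from typing import Dict, Tuple
import numpy as np

def build_filter(num_polygons: int,
                 trace_regions: Dict[int, Tuple[float, float, float, float, int]]) -> np.ndarray:
    """Return an (N, 5) array holding R, G, B, alpha, and layer order per polygon."""
    filt = np.zeros((num_polygons, 5), dtype=np.float32)
    for poly_index, values in trace_regions.items():
        filt[poly_index] = values
    return filt

# Polygons 10-12 carry a translucent blush-like tint assigned to layer 1.
blush = {i: (0.86, 0.45, 0.48, 0.4, 1) for i in (10, 11, 12)}
cosmetic_filter = build_filter(num_polygons=500, trace_regions=blush)
```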
In some embodiments, the filters 325 include dynamic elements 330, such as animation effects or interlayer-effects, that can be used to demonstrate the sequence of applications 215 used to create the cosmetic design 110. Dynamic elements 330 can reproduce at least a part of the dynamic elements 117 described in reference to the traces 115 in FIGs. 1-2.
At operation 307, the computer system presents the filters 325 integrated with images 335 of the user surface 315. Presenting the filters 325 can include displaying the filters 325 through a screen 337. For example, the user device 190 can be a smartphone or a tablet including a touchscreen display. The filters 325 can be mapped to the user surface 315 in real time or near real time using feature tracking or other augmented reality techniques, where “near real time” describes a qualitative experience by a user of the user device 190 whereby latency introduced by the operations of the example process 300 is effectively imperceptible.
The filters 325 can be presented as interactive graphical elements that a user of the user device 190 can manipulate through the screen 337, for example, by selecting or highlighting. In some embodiments, selecting a filter 325 can initiate animation of dynamic elements 330 at optional operation 309. Additionally or alternatively, animating the filters 325 can include initiating a virtual tutorial whereby filters 325 are presented in sequence by layer-order, such that the sequence of applications 215 is reproduced on the user surface 315. The sequence can be paused, reversed, advanced, or otherwise manipulated via the user device 190, and specific dynamic elements can be activated by selecting a filter 325, for example, with a user action 340 on the screen 337.
In some embodiments, interaction with the screen 337 while the filters 325 are being presented permits the user of the user device 190 to edit, add, or remove one or more filters 325. Different user actions 340, instead of activating dynamic elements, can initiate one or more alternative processes. For example, a double finger tap can open a menu 341 that permits the user of the user device 190 to view information describing the applicator 225, the cosmetic formulation 220, or other information describing the filter 325. Similarly, a press-and-hold on the filter 325 can cause the user device 190 to access an online marketplace to permit the user to purchase the cosmetic formulation 220, the applicator 225, or other consumer products. In some embodiments, a menu of social media controls 343 can be generated, such that the user of the user device can share the image 335 with the filters 325, can communicate with other users, or can provide feedback or otherwise communicate with the creator of the cosmetic design 110. It is understood that the types of user actions described are exemplary. User interface elements, such as the menu 341 and/or the social media menu 343, can be summoned by alternative and/or additional interactions. In some embodiments, menu elements are presented on the screen 337 by default.
FIG. 4 is a schematic diagram illustrating an example process 400 for transforming the augmented reality cosmetic design filter 110 from one shape type to another shape type, in accordance with various embodiments. As with the example processes 200 and 300, the example process 400 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1. In this way, the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130, server(s) 160, and/or user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system. While the operations are illustrated in a particular order, the example process 400 can include more or fewer operations, and the order of operations can vary. In some embodiments, some operations are performed by different physical computers, as described in more detail in reference to FIG. 1. For example, some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
The traces 115 are generated in reference to the baseline description 210, as described in more detail in reference to FIG. 2, and as such are specific to the biological surface 180. The example process 400, therefore, describes operations for extrapolating the filters 325 from a first face shape to a second face shape, as part of generating the traces 115 and/or the filters 325. While the description in reference to FIG. 4 focuses on face shapes, it is understood that the operations can similarly be applied to shapes of individual facial features, such as eye shapes, mouth shapes, or other feature shapes, such that the traces 115 are transformed to realistically reproduce the aesthetic effect of the traces 115, rather than simply applying the traces 115 as a texture to the user surface 315.
At operation 401, the computer system generates mask data 410 using one or more traces 115. Where traces 115 are relative to topographical data describing the biological surface, generating the mask data 410 includes projecting the data describing the applications 215 onto the topographical data describing the biological surface 180, thereby permitting the modification of the traces 115 from one shape type 420 to another shape type 420.
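As a sketch of the projection step, the following back-projects 2D application coordinates onto depth (topographical) data using a pinhole camera model, so that the mask data is attached to the surface rather than to the image plane. The camera intrinsics and function names are placeholder assumptions, not parameters from the disclosure.

```python
import numpy as np

def project_to_surface(pixels_xy: np.ndarray, depth_map: np.ndarray,
                       fx: float = 600.0, fy: float = 600.0,
                       cx: float = 320.0, cy: float = 240.0) -> np.ndarray:
    """Back-project (N, 2) pixel coordinates to (N, 3) surface points."""
    u, v = pixels_xy[:, 0], pixels_xy[:, 1]
    z = depth_map[v.astype(int), u.astype(int)]   # depth sampled at each pixel
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)

# Demo with a flat synthetic depth map of 0.5 m.
depth = np.full((480, 640), 0.5, dtype=np.float32)
print(project_to_surface(np.array([[320.0, 240.0], [400.0, 260.0]]), depth))
```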
At operation 403, the computer system attributes a first shape type 420 to at least a portion of the biological surface 180. For example, the computer system can reference shape information describing multiple shape types 420. The shape information can be stored on, accessed, or otherwise received from server(s) 160 or client computing device(s) 130. In this way, example process 400 can include receiving information describing the multiple shape types 420.
Attributing the shape type 420 to the biological surface 180 can include referencing the baseline description 210 to determine characteristic dimensions, spacings, or other aspects of the biological surface 180 that indicate the shape type 420. In the illustrative example of a face shape, characteristic aspects can be defined by projecting a grid or other form of guide 421 onto the baseline description 210, such that the shape type 420 can be identified, either by defining an outline 423 of the baseline description 210, by identifying a characteristic curvature 425 at one or more points of the baseline description 210, or by identifying characteristic spacings 427 between features. It is understood that approaches involving edge detection and feature detection using computer image processing techniques can be employed to attribute the shape type 420.
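A minimal sketch of attributing a shape type from characteristic spacings between landmarks follows. The landmark names, ratios, thresholds, and category labels are illustrative assumptions; a deployed system could equally rely on outline fitting, curvature analysis, or learned classifiers as noted above.

```python
import numpy as np

def attribute_shape_type(landmarks: dict) -> str:
    """Classify a face shape type from a few characteristic spacings."""
    face_width = np.linalg.norm(landmarks["cheek_left"] - landmarks["cheek_right"])
    face_height = np.linalg.norm(landmarks["chin"] - landmarks["forehead_top"])
    jaw_width = np.linalg.norm(landmarks["jaw_left"] - landmarks["jaw_right"])

    aspect = face_height / face_width     # elongation of the face
    jaw_ratio = jaw_width / face_width    # relative width of the jawline

    if aspect > 1.4:
        return "oblong"
    if jaw_ratio > 0.9:
        return "rectangular"
    if jaw_ratio < 0.7:
        return "heart"
    return "oval"

demo_landmarks = {
    "cheek_left": np.array([0.0, 0.0]), "cheek_right": np.array([1.0, 0.0]),
    "chin": np.array([0.5, -0.7]), "forehead_top": np.array([0.5, 0.6]),
    "jaw_left": np.array([0.15, -0.35]), "jaw_right": np.array([0.85, -0.35]),
}
print(attribute_shape_type(demo_landmarks))
```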
At operation 405, the computer system generates one or more shape filters 430 for different shape types 420. As part of generating the shape filters 430, the mask data 410 generated at operation 401 is transformed from the first shape type 420-1 to a second shape type 420-2, a third shape type 420-3, or additional or alternative shape types 420. Transformation of the mask data 410 can include but is not limited to projecting the mask data 410 of the first shape type 420-1 onto the topographical data of the second shape type 420-2, and applying the corresponding transformation onto the traces 115. In this way, the shape filters 430 will reproduce the aesthetic shape and coloration of the traces 115 onto shape types 420 different from the first shape type 420-1.
In some embodiments, generating shape filters 430 for different shape types 420 can include additional modifications to the mask data 410, beyond spatial projection to account for different physical shapes. For example, some shape types 420 are characterized by more or less pronounced facial features, such as brow ridges, forehead curvature, chin shape, cheekbones, jaw shape, or the like. To that end, shape filters 430 can reposition, divide, omit, duplicate, mirror, or otherwise transform mask data 410 as part of generating shape filters 430. In an illustrative example, generating the shape filters 430 for a second shape type 420-2 corresponding to a rectangular face shape from the first shape type 420-1 corresponding to a heart face shape includes repositioning a forehead trace upward toward the hairline and widening the forehead trace to either edge of the forehead, as well as dividing and mirroring a chin trace along the jawline. Corresponding transformations can be applied to the mask data 410 for each of the traces 115 as part of generating the shape filters 430 for each shape type 420.
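The sketch below illustrates one simple way mask data could be carried from a first shape type to a second: estimate an affine map between corresponding landmark sets and apply it to the trace coordinates. A production system would likely use a denser, non-rigid warp; this least-squares affine version is an assumption made only to show the transformation step.

```python
import numpy as np

def fit_affine(src_pts: np.ndarray, dst_pts: np.ndarray) -> np.ndarray:
    """Least-squares 2D affine (2x3) mapping src landmarks onto dst landmarks."""
    n = src_pts.shape[0]
    A = np.hstack([src_pts, np.ones((n, 1))])              # (n, 3) homogeneous source
    params, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)   # (3, 2) affine parameters
    return params.T                                        # (2, 3)

def transform_mask(mask_xy: np.ndarray, affine: np.ndarray) -> np.ndarray:
    """Apply the affine map to (N, 2) mask coordinates."""
    homog = np.hstack([mask_xy, np.ones((mask_xy.shape[0], 1))])
    return homog @ affine.T

# Forehead trace repositioned from a heart-shaped to a rectangular template.
heart = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.4]])
rect = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.2]])
warp = fit_affine(heart, rect)
trace_points = np.array([[0.2, 1.0], [0.8, 1.0]])
print(transform_mask(trace_points, warp))
```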
In the context of facial features, where the cosmetic design 110 is addressed at a specific facial feature (e.g., an eyeshadow design between an eyebrow and an eye), the constituent operations of the example process 400 can be applied based on similar principles. For example, where the cosmetic design 110 is created in reference to a rounded eye shape type 420, generating shape filters 430 at the operation 405 can include transforming mask data 410 from the rounded eye to an almond eye shape type 420 or other eye-specific shape types 420. It is understood that corresponding operations can be applied to shapes of lips, hands, ears, or other biological surfaces.
FIG. 5 is a schematic diagram illustrating an example process 500 for generating an augmented reality cosmetic design filter 510 in multiple layers, in accordance with various embodiments. As with the example processes 200, 300, and 400, the example process 500 can be implemented as a number of operations executed or otherwise performed by constituent components of the example system 100 of FIG. 1. In this way, the operations can be or include operations performed by one or more processors of a computer system, such as client computing device(s) 130, server(s) 160, and/or user device(s) 190 of FIG. 1, in response to execution of computer-readable instructions stored on non-transitory memory of the computer system. While the operations are illustrated in a particular order, the example process 500 can include more or fewer operations, and the order of operations can vary. In some embodiments, some operations are performed by different physical computers, as described in more detail in reference to FIG. 1. For example, some operations can be performed by different components interchangeably or can be performed by coordinated operation of two or more components.
As described in more detail in reference to FIGs. 1-3, the traces 115 can be generated by recording the sequential application of multiple cosmetic formulations 220 to the biological surface 180, such that the augmented reality cosmetic design filter 510 can be segmented into discrete layers. The presentation of the filter 510, as described in more detail in reference to FIG. 3, can reproduce the layer-wise sequence of applications. Similarly, the example process 400 can also incorporate the layer-wise aspects of the example process 500 where different shape types 420 implicate different layering processes, interlayer blending, as well as more complex light and shadow effects resulting from differences in shape type 420.
At operations 501-507, a sequence of layers is defined, where each layer includes one or more traces 115. For example, at operation 501, a first layer is defined that includes forehead traces 515 and cheekbone traces 517. At operation 503, a second layer is defined that includes forehead trace(s) 520, cheek traces 521, and chin trace(s) 523. At operation 505, highlight layer(s) are defined that include highlight trace(s) 525 to accentuate or emphasize one or more facial features, overlying the first and second layers. At operation 507, eye layer(s) are defined including eyeshadow traces 530 and under-eye traces 531, overlying the highlight layers, the first layers, and the second layers.
In this way, the augmented reality cosmetic design filter 510 can dynamically add or remove layers during presentation on the user device 190. Similarly, the client computing device 130 can associate one or more traces 115 with each of a number of layers. Defining the layers as described in the example process 500 permits inter-layer blending rules to be defined. For example, higher-order layers can overlie lower-order layers. Higher-order layers can completely occlude lower-order layers. Transparency or partial transparency, defined for example through an alpha channel of the mask data 410, can permit multiple layers to be overlaid in the images 335 to produce a net aesthetic effect including co-localized contributions from multiple layers.
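A brief sketch of inter-layer blending using per-layer alpha (the standard "over" compositing operator) follows, showing how higher-order layers can partially or fully occlude lower-order layers. The channel layout and layer ordering convention are assumptions for illustration.

```python
import numpy as np

def composite_layers(base_rgb: np.ndarray, layers: list) -> np.ndarray:
    """base_rgb: (H, W, 3); each layer: (H, W, 4) RGBA, listed lowest-order first."""
    out = base_rgb.astype(np.float32).copy()
    for layer in layers:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        out = alpha * rgb + (1.0 - alpha) * out   # "over" operator per pixel
    return np.clip(out, 0.0, 1.0)

# Demo: a 2x2 base image with one translucent overlay layer.
base = np.full((2, 2, 3), 0.7, dtype=np.float32)
overlay = np.zeros((2, 2, 4), dtype=np.float32)
overlay[..., :3], overlay[..., 3] = (0.9, 0.4, 0.5), 0.5
print(composite_layers(base, [overlay]))
```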
FIG. 6 is a schematic diagram illustrating an example process 600 for determining a formulation color using surface tones and formulation tones, in accordance with various embodiments.
As described in more detail in reference to FIG. 2, detecting the applications 215 of cosmetic formulations 220 can include determining the color of the cosmetic formulations 220 and/or identifying the cosmetic formulations 220. Where techniques for determining the color include image processing operations based on images generated by the camera 140, the colors of the various cosmetic formulations 220 can be influenced by the tone of the biological surface 180 in proximity to the applications 215. In this way, direct reading of image pixel values may produce an inaccurate color rendering when used in filters 325.
At operation 601, the computer system meters a number of regions 610 corresponding to the applications 215 and estimates a surface tone 615 of the biological surface 180 for each of the regions 610. In some embodiments, the operations of the example process 600 are executed in parallel with the example process 200, but can also be executed at a different time, using the baseline description 210, images recorded during the generation of the traces 115, or a calibration set of images generated before commencing the example process 200. It is contemplated that surface tones 615 may differ locally between the regions 610, for example, where a facial feature casts a shadow or where the surface defines one or more contours. To that end, the operation 601 can include defining multiple surface tones 615 for a single application 215.
At operation 603, the computer system meters the regions 610 in images including the applications 215 and estimates one or more local formulation tones 625 of the cosmetic formulations 220. Estimating the surface tones 615 and the formulation tones 625 can include sampling one or more pixels in each of the regions 610 and determining an average coloration of each of the regions 610. It is understood that the formulation tones 625 will be influenced by the underlying surface tones 615, such that the surface tones 615 can serve as reference baseline values for estimating formulation tones in a way that is sensitive to skin tone. Advantageously, implementing computer image processing approaches that include estimating the surface tones 615 in this way permits the cosmetic designs 110 to be responsive to skin tones of the user, rather than presenting a static augmented reality mask.
At operation 605, the computer system determines a color 630 of the cosmetic formulation using the surface tones 615 and the formulation tones 625. In contrast to the surface tones 615 and the formulation tones 625, it is contemplated that each cosmetic formulation 220 includes one or more pigments that are uniformly dispersed or distributed, such that the formulation expresses a single visible color, for example, through diffuse reflection of visible light. To that end, the formulation color 630 can be or include a vector of color components, such as an RGB triad or another color representation approach.
As part of the operation 605, corresponding surface tones 615 and formulation tones 625 can be compared, and the overall effect of the surface tones 615 on the formulation tones 625 can be accounted for, as part of determining the color 630 of the cosmetic formulation 220. In an illustrative example, where different regions 610 are differentially illuminated, comparing the surface tones 615 and the formulation tones 625, for example, by using the surface tones 615 to correct for luminance variation in the formulation tones 625, can control for the effect of illumination on color rendering.
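The following sketch estimates a formulation color by metering a region before and after application and using the bare-skin tone as a local baseline. The simple luminance-normalization step is an assumption about how illumination differences between regions might be controlled for; the disclosure does not mandate this particular correction.

```python
import numpy as np

def estimate_formulation_color(surface_region: np.ndarray,
                               applied_region: np.ndarray) -> np.ndarray:
    """Each region is an (N, 3) array of sampled RGB pixels in 0..1."""
    surface_tone = surface_region.mean(axis=0)      # local baseline skin tone
    formulation_tone = applied_region.mean(axis=0)  # tone with product applied

    # Normalize out local illumination using the surface tone's luminance.
    luminance = surface_tone @ np.array([0.2126, 0.7152, 0.0722])
    correction = 0.5 / max(luminance, 1e-6)         # rescale toward a mid luminance
    return np.clip(formulation_tone * correction, 0.0, 1.0)

# Demo with synthetic samples of bare skin and skin with a tinted product.
bare = np.random.default_rng(0).uniform(0.55, 0.65, size=(50, 3))
applied = np.clip(bare * 0.6 + np.array([0.35, 0.10, 0.12]), 0.0, 1.0)
print(estimate_formulation_color(bare, applied))
```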
It is understood that determining the formulation color 630 can improve the reproducibility of the aesthetic effect of the cosmetic design 110 on different biological surfaces, in different ambient conditions, and with different formulations. As described in more detail in reference to FIG. 2, the formulation color 630 can be modified to account for differences in skin tone between creator and user. Similarly, the formulation color 630 can be modified to improve realistic color fidelity based on ambient conditions. For example, in bright conditions, the color intensity can be increased, while in dim conditions, the color intensity can be decreased. In some cases, tinting can be modified dynamically to account for light and shadow. For example, as part of the example process 300, the formulation color 630 can be modified to include a larger blue component to reflect that the trace 115 to which it corresponds is in shade. Conversely, the blue component can be decreased when the corresponding trace 115 is moved into brighter illumination.
As explained throughout the preceding description, the operations for generating, modifying, and/or presenting the traces 115 and the filters 325 are contemplated to be executed by one or more electronic devices including computer circuitry, communication modules, and computer-readable instructions stored in memory. Through the coordinated operation of the constituent elements of the example system 100 of FIG. 1, augmented reality filters can be generated and presented in near real time to users of user devices 190, integrated into social media platforms. The augmented reality filters can realistically reproduce the aesthetic effects of cosmetic designs, accounting for differences in face shape, skin tone, and ambient conditions, and can be presented dynamically to improve the sharing of cosmetic routine sequences beyond what is currently possible with augmented reality or conventional social media.
FIG. 7 is a block diagram that illustrates an example system 700, including components of the system of FIG. 1, in accordance with various embodiments. The example system 700 includes the client computing device 130 in electronic communication (e.g., over network 170) with the remote computer system 160. Example system 700 illustrates an example of the system 100 of FIG. 1, in a context of associated system elements, and, as such, describes electronics and software executing operations as described in reference to FIGs. 2-6. FIG. 7 depicts a non-limiting example of system elements, features and configurations; many other features and configurations are contemplated. In the example shown in FIG. 7, the client computing device 130 of FIG. 1 includes a computer system 710, multiple components 720 for interacting with the biological surface 180, a computer-readable medium 730, and a client application 740 that can be stored as computer-executable instructions on the computer-readable medium 730 and, when executed by the computer system 710, can implement the operations described in reference to the system 100 of FIG. 1, and the operations of the example techniques of FIGs. 2-3.
The client computing device 130 incorporates subcomponents including, but not limited to, a power source 711, a human-machine interface 713, one or more processors 715, a network interface 717, and can include the computer-readable medium 730. The power source 711 is a direct-current power source, for example, a rechargeable battery or a rectified power supply configured to connect to line-power (e.g., 110 VAC, 220 VAC, etc.). The human-machine interface (HMI) 713 can include any type of device capable of receiving user input or generating output for presentation to a user, such as a speaker for audio output, a microphone for receiving audio commands, a push-button switch, a toggle switch, a capacitive switch, a rotary switch, a slide switch, a rocker switch, or a touch screen.
The one or more processors 715 are configured to execute computer-executable instructions stored on the computer-readable medium 730. In an embodiment, the processor(s) 715 are configured to receive and transmit signals to and/or from the components 720 via a communication bus or other circuitry, for example, as part of executing the client application 740. The network interface 717 is configured to transmit and receive signals to and from other computing devices (e.g., the remote computer system 160 or the user device(s) 190) on behalf of the processor(s) 715. The network interface 717 can implement any suitable communication technology, including but not limited to short-range wireless technologies such as Bluetooth, infrared, near-field communication, and Wi-Fi; long-range wireless technologies such as WiMAX, 2G, 3G, 4G, LTE, and 10G; and wired technologies such as USB, FireWire, Thunderbolt, and/or Ethernet. The computer-readable medium 730 is any type of computer-readable medium on which computer-executable instructions can be stored, including but not limited to a flash memory (SSD), a ROM, an EPROM, an EEPROM, and an FPGA. The computer-readable medium 730 and the processor(s) 715 can be combined into a single device, such as an ASIC, or the computer-readable medium 730 can include a cache memory, a register, or another component of the processor 715.
In the illustrated embodiment, the computer-readable medium 730 stores computer-executable instructions that, in response to execution by one or more processors 715, cause the client computing device 130 to implement a control engine 731. The control engine 731 controls one or more aspects of the client computing device 130, as described above. In some embodiments, the computer-executable instructions are configured to cause the client computing device 130 to perform one or more operations such as generating a surface mapping of the target surface, generating a trace, or outputting the trace to server(s) 160. In some embodiments, the control engine 731 controls basic functions by facilitating interaction between the computer system 710 and the components 720 according to the client application 740. In some embodiments, the control engine 731 detects input from HMI 713 indicating that a cosmetic routine is to be initiated (e.g., in response to activation of a power switch or “start” button, or detection of a face in front of the camera 140 of FIG. 1), or receives signals from client computing device(s) 130, the remote computer system 160, or user device(s) 190 (e.g., over a Bluetooth paired connection).
The components of the client computing device 130 can be adapted to the application or can be specific to the application (e.g., ASICs). For example, the components 720 can include one or more cameras 721, a display 723, one or more radiation sources 725, and/or one or more radiation sensors 727, as described in more detail in reference to FIG. 1. In some embodiments, the components 720 are integrated into a single device. In this way, the client computing device 130 can be a specialized computing device, configured to execute the client application 740 in coordination with the components 720. In some embodiments, the client computing device 130 is a general purpose mobile electronic device, such as a tablet or smartphone, storing the client application 740.
In some embodiments, the client application 740 also includes an image capture/3D scanning engine 741 configured to capture and process digital images (e.g., color images, infrared images, depth images, etc.) obtained from one or more of the components 720, including but not limited to stereoscopic images, LiDAR data, or other forms of surface/depth sensing information. In some embodiments, such data are used to obtain a clean and precise 3D contour mapping of the target body surface (e.g., target surface 181 of FIG. 1). In some embodiments, the digital images or scans are processed by the client computing device 130 and/or transmitted to the remote computer system 160 for processing in a 3D model engine 781. In an embodiment, captured image data is used in the position tracking engine 743 for determining the position of features, key-points, or edges on the target body surface. In some embodiments, the position tracking engine 743 tracks the contours of the target body surface in a 3D space, for example, by implementing vSLAM techniques. In some embodiments, position information from the position tracking engine 743 is used to generate signals to be transmitted to the control engine 731, which are used to control one or more components 720 or elements of the computer system 710 including, for example, the sources 725 or the HMI 713, according to techniques described herein.
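An illustrative sketch of frame-to-frame key-point tracking, such as the position tracking engine 743 might perform, is shown below using OpenCV's pyramidal Lucas-Kanade optical flow. The disclosure does not tie the engine to any particular library or algorithm; this is simply one common way to track facial key-points between frames.

```python
import cv2
import numpy as np

def track_keypoints(prev_gray: np.ndarray,
                    next_gray: np.ndarray,
                    prev_points: np.ndarray) -> np.ndarray:
    """prev_points: (N, 1, 2) float32 pixel coordinates from the previous frame."""
    next_points, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_points, None,
        winSize=(15, 15), maxLevel=2)
    return next_points[status.flatten() == 1]   # keep only points tracked successfully

# Demo: a single bright patch shifted by two pixels between synthetic frames.
prev = np.zeros((128, 128), dtype=np.uint8); prev[60:63, 60:63] = 255
nxt = np.zeros((128, 128), dtype=np.uint8); nxt[60:63, 62:65] = 255
pts = np.array([[[61.0, 61.0]]], dtype=np.float32)
print(track_keypoints(prev, nxt, pts))
```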
In some embodiments, digital 3D models described herein are generated based on sensor data obtained by the client computing device 130. As such, the digital 3D models are generated by the client computing device 130 or some other computing device, such as a cloud computing system, or a combination thereof. In some embodiments, the digital 3D models include 3D topology and texture information, which can be used for reproducing an accurate representation of a body surface, such as facial structure and skin features, as described in more detail in reference to FIGs. 1-6.
In some embodiments, the client application 740 includes a user interface 745. In an embodiment, the user interface 745 includes interactive functionality including but not limited to graphical guides or prompts, presented via the display to assist a user in selecting cosmetic designs, tutorial videos, or animations. In some embodiments, the user interface 745 provides guidance (e.g., visual guides such as arrows or targets, progress indicators, audio/haptic feedback, synthesized speech, etc.) to guide a user under particular lighting conditions, angles, etc., in order to ensure that sufficient data is collected for use by mapping and projection engines.
The client application 740 can include a mapping module 747. The mapping module 747 can be or include computer-readable instructions (e.g., software, drivers, etc.) for translating a numerical representation of a cosmetic design into an augmented reality cosmetic design filter. As part of the operation of the mapping module 747, the client application 740 can receive real-time data from the camera(s) 721 and sensors 727, which can be processed by the 3D scanning engine 741 and the position tracking engine 743 and used to progressively update the mapping and the cosmetic design as it is developed during a sequence of applications to the target body surface. In this way, the mapping module 747 can respond to motion of the target body surface, thereby increasing the tolerance of the client computing device 130 for motion on the part of the user without loss of fidelity to the cosmetic design filter. In some embodiments, the computational resource demand for such real-time scanning/tracking can be spread across multiple devices, such as the remote computer system 160, through parallelization or distribution routines.
A communication module 749 of the client application 740 can be used to prepare information for transmission to, or to receive and interpret information from, other devices or systems, such as the remote computer system 160 or the user device(s) 190, as described in more detail in reference to FIG. 1. Such information can include captured digital images, scans, or video, personal care device settings, custom care routines, user preferences, user identifiers, device identifiers, or the like. In an embodiment, the client computing device 130 collects data describing execution of care routines, image data of body surfaces, or other data. In an embodiment, such data is transmitted via the network interface 717 to the remote computer system 160 for further processing or storage (e.g., in a product data store 783 or user profile data store 785). The client computing device 130 can be used by a consumer, personal care professional, or some other entity to interact with other components of the system 700, such as the remote computer system 160 or user device(s) 190. In an embodiment, the client computing device 130 is a mobile computing device such as a smartphone or a tablet computing device equipped with the components 720 and the client application 740, or provided with the components through electronic coupling with a peripheral device.
Illustrative components and functionality of the remote computer system 160 will now be described. The remote computer system 160 includes one or more server computers that implement one or more of the illustrated components, e.g., in a cloud computing arrangement. The remote computer system 160 includes a projection engine 787, the 3D model engine 781, the product data store 783, and the user profile data store 785. In an embodiment, the 3D model engine 781 uses image data (e.g., color image data, infrared image data) and depth data to generate a 3D model of the target body surface. The image data is obtained from the client computing device 130, for example, from the camera(s) 721 or the sensor(s) 727 that are integrated with or otherwise electronically coupled with the client computing device 130. In an embodiment, image data and depth data associated with a user is stored in the user profile data store 785. In an embodiment, user consent is obtained prior to storing any information that is private to a user or can be used to identify a user.
In an embodiment, the mapping/projection engine 787 performs processing of data relating to a cosmetic routine, such as generating mappings of target surfaces using image/sensor data and/or generating a projection of the cosmetic routine as an augmented reality filter, which can then be transmitted to the user device(s) 190. In some embodiments, the projection engine 787 generates cosmetic design data using user information from the user profile data store 785, the product data store 783, the 3D model engine 781, or some other source or combination of sources.
The 3D model engine 781 can employ machine learning or artificial intelligence techniques (e.g., template matching, feature extraction and matching, classification, artificial neural networks, deep learning architectures, genetic algorithms, or the like). For example, to generate the cosmetic design in accordance with a surface mapping of a face, the 3D model engine 781 can analyze a facial mapping generated by the 3D model engine 781 to measure or map contours, wrinkles, skin texture, etc., of the target body surface. The 3D model engine 781 can receive data describing a cosmetic design based on an identifier code provided by the user through the user device(s) 190. In such a scenario, the 3D model engine 781 can use such information to generate a projection of the cosmetic design (e.g., cosmetic design 110 of FIG. 1) onto an image of the corresponding body surface of a user, producing the augmented reality cosmetic design filter.
The devices shown in FIG. 7 can communicate with each other via the network 170, which can include any suitable communication technology including but not limited to wired technologies such as DSL, Ethernet, fiber optic, USB, Firewire, Thunderbolt; wireless technologies such as WiFi, WiMAX, 3G, 4G, LTE, 5G, 10G, and Bluetooth; and private networks (e.g., an intranet) or public networks (e.g., the Internet). In general, communication between computing devices or components of FIG. 7, or other components or computing devices used in accordance with described embodiments, occurs directly or through intermediate components or devices.
Alternatives to the arrangements disclosed and described with reference to FIGs. 1 and 7 are possible. For example, functionality described as being implemented in multiple components can instead be consolidated into a single component, or functionality described as being implemented in a single component can be implemented in multiple illustrated components, or in other components that are not shown in FIGs. 1 or 7. As another example, devices in FIGs. 1 and 7 that are illustrated as including particular components can instead include more components, fewer components, or different components without departing from the scope of described embodiments. As another example, functionality that is described as being performed by a particular device or subcomponent can instead be performed by one or more other devices within a system. As an example, the 3D model engine 781 can be implemented in the client computing device 130 or in some other device or combination of devices.
In addition to the technical benefits of described embodiments that are described elsewhere herein, numerous other technical benefits are achieved in some embodiments. For example, the system 700 allows some aspects of the process to be conducted independently by personal care devices or client computing devices, while moving other processing burdens to the remote computer system 160 (which can be a relatively high-powered and reliable computing system), thus improving performance and preserving battery life for functionality provided by personal care devices or client computing devices.
In general, the word “engine,” as used herein, refers to logic embodied in hardware or software instructions written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™, and/or the like. An engine can be compiled into executable programs or written in interpreted programming languages. Software engines can be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines or divided into sub-engines. The engines can be stored in any type of computer-readable medium or computer storage device and be stored on and executed by one or more general purpose computers, thus creating a special purpose computer configured to provide the engine or the functionality thereof.
As understood by one of ordinary skill in the art, a “data store” as described herein can be any suitable device configured to store data for access by a computing device. One example of a data store is a highly reliable, high-speed relational database management system (DBMS) executing on one or more computing devices and accessible over a high-speed network. Another example of a data store is a key-value store. However, any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries can be used, and the data store can be accessible locally instead of over a network, or can be provided as a cloud-based service. A data store can also include data stored in an organized manner on a computer-readable storage medium, as described further below. One of ordinary skill in the art will recognize that separate data stores described herein can be combined into a single data store, and/or a single data store described herein can be separated into multiple data stores, without departing from the scope of the present disclosure.
FIG. 8 is a block diagram that illustrates aspects of an example computing device 800, in accordance with various embodiments. While multiple different types of computing devices are described in reference to the various embodiments, the example computing device 800 describes various elements that are common to many different types of computing devices. While FIG. 8 is described with reference to a computing device that is implemented as a device on a network, the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other devices that can be used to implement portions of embodiments of the present disclosure. Moreover, those of ordinary skill in the art and others will recognize that the computing device 800 can be any one of any number of currently available or yet to be developed devices. In its most basic configuration, the example computing device 800 includes at least one processor 802 and a system memory 804 connected by a communication bus 806. Depending on the exact configuration and type of device, the system memory 804 can be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or similar memory technology. Those of ordinary skill in the art and others will recognize that system memory 804 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 802. In this regard, the processor 802 can serve as a computational center of the computing device 800 by supporting the execution of instructions.
As further illustrated in FIG. 8, the computing device 800 can include a network interface 810 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure can access basic services that utilize the network interface 810 to perform communications using common network protocols. The network interface 810 can also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, LTE, WiMAX, Bluetooth, Bluetooth low energy, and/or the like. As will be appreciated by one of ordinary skill in the art, the network interface 810 illustrated in FIG. 8 can represent one or more wireless interfaces or physical communication interfaces described and illustrated above with respect to particular components of the system 100 of FIG. 1.
In the exemplary embodiment depicted in FIG. 8, the computing device 800 also includes a storage medium 808. However, services can be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 808 depicted in FIG. 8 is represented with a dashed line to indicate that the storage medium 808 is optional. In any event, the storage medium 808 can be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information including, but not limited to, a hard disk drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, and/or the like.
As used herein, the term “computer-readable medium” includes volatile and non-volatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer readable instructions, data structures, program modules, or other data. In this regard, the system memory 804 and storage medium 808 depicted in FIG. 8 are merely examples of computer-readable media.
Suitable implementations of computing devices that include a processor 802, system memory 804, communication bus 806, storage medium 808, and network interface 810 are known and commercially available. For ease of illustration and because it is not important for an understanding of the claimed subject matter, FIG. 8 does not show some of the typical components of many computing devices. In this regard, the example computing device 800 can include input devices, such as a keyboard, keypad, mouse, microphone, touch input device, touch screen, and/or the like. Such input devices can be coupled to the example computing device 800 by wired or wireless connections, including RF, infrared, serial, parallel, Bluetooth, Bluetooth low energy, USB, or other suitable connection protocols. Similarly, the example computing device 800 can also include output devices such as a display, speakers, printer, etc. Since these devices are well known in the art, they are not illustrated or described further herein.
While illustrative embodiments have been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention. It is to be understood that the methods and systems described herein are not limited to specific methods, specific components, or to particular implementations. It is also to be understood that the terminology used herein is for the purpose of describing embodiments only and is not intended to be limiting.
As used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Ranges can be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
“Optional” or “optionally” means that the subsequently described event or circumstance can or cannot occur, and that the description includes instances where said event or circumstance occurs and instances where it does not. Throughout the description and claims of this specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” mean “including but not limited to,” and are not intended to exclude, for example, other components, integers or steps. “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.

CLAIMS

The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. One or more non-transitory computer memory devices storing computer-readable instructions that, when executed by computing circuitry, cause the computing circuitry to perform operations for generating an augmented reality cosmetic design filter, the operations comprising:
defining a baseline description of a biological surface using one or more radiation sensors, the radiation sensors being electronically coupled with the computing circuitry;
identifying an application of a cosmetic formulation to the biological surface using one or more of the radiation sensors;
generating a trace describing the application in reference to the baseline description; and
outputting the trace.
2. The one or more non-transitory computer memory devices of Claim 1, wherein identifying the application of the cosmetic formulation comprises detecting the cosmetic formulation relative to the biological surface using one or more of the radiation sensors and the baseline description.
3. The one or more non-transitory computer memory devices of Claim 1, wherein generating the trace comprises:
receiving information describing a plurality of shape types;
attributing a first shape type of the shape types to at least a portion of the biological surface using the baseline description;
generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description; and
transforming the numerical representation from the first shape type to a second shape type of the shape types.
4. The one or more non-transitory computer memory devices of Claim 3, wherein the biological surface is a face, and wherein the first shape type and the second shape type describe a facial feature.
5. The one or more non-transitory computer memory devices of Claim 1, wherein defining the baseline description of the biological surface comprises projecting invisible electromagnetic radiation onto the biological surface using a radiation source electronically coupled with the computing circuitry.
6. The one or more non-transitory computer memory devices of Claim 1, wherein the one or more radiation sensors comprise a camera configured to detect electromagnetic radiation from ultraviolet, visible, or infrared spectral ranges, or a combination thereof.
7. The one or more non-transitory computer memory devices of Claim 1, wherein generating the trace comprises:
tracking a motion of the application relative to the biological surface; and
generating a numerical representation of the motion relative to the baseline description.
8. The one or more non-transitory computer memory devices of Claim 1, wherein the instructions, when executed by the computing circuitry, cause the computing circuitry to perform further operations comprising:
identifying the cosmetic formulation; and
defining a color of the cosmetic formulation using identifier information of the cosmetic formulation.
9. The one or more non-transitory computer memory devices of Claim 1, wherein the instructions, when executed by the computing circuitry, cause the computing circuitry to perform further operations comprising:
estimating a surface tone of the biological surface;
estimating a formulation tone of the cosmetic formulation; and
determining a color of the cosmetic formulation using the surface tone and the formulation tone.
10. The one or more non-transitory computer memory devices of Claim 1, wherein the one or more non-transitory computer memory devices are electronically coupled with a smart phone.
11. The one or more non-transitory computer memory devices of Claim 1, wherein outputting the trace comprises: communicating the trace to a remote computing system.
12. A method for generating an augmented reality cosmetic design filter, comprising:
defining a baseline description of a biological surface using one or more radiation sources and one or more radiation sensors;
identifying an application of a cosmetic formulation to the biological surface using the radiation sensors;
generating a trace describing the application in reference to the baseline description; and
outputting the trace.
13. The method of Claim 12, wherein generating the trace comprises:
receiving shape information describing a plurality of shape types;
attributing a first shape type of the shape types to at least a portion of the biological surface using the shape information;
generating a numerical representation of the application of the cosmetic formulation, the numerical representation describing position information relative to the baseline description; and
transforming the numerical representation from the first shape type to a second shape type of the shape types.
14. The method of Claim 12, wherein identifying the application of the cosmetic formulation comprises:
detecting an applicator of the cosmetic formulation;
estimating a position of the applicator of the cosmetic formulation relative to the biological surface;
tracking a motion of the applicator relative to the biological surface; and
generating a numerical representation of the motion relative to the baseline description.
15. The method of Claim 12, further comprising:
estimating a surface tone of the biological surface;
estimating a formulation tone of the cosmetic formulation; and
determining a color of the cosmetic formulation using the surface tone and the formulation tone.
PCT/US2022/077230 2021-09-30 2022-09-28 Augmented reality cosmetic design filters WO2023056333A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020247005743A KR20240037304A (en) 2021-09-30 2022-09-28 Augmented reality cosmetic design filter
CN202280053466.0A CN117769723A (en) 2021-09-30 2022-09-28 Augmented reality cosmetic design filter

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US17/491,061 US20230101374A1 (en) 2021-09-30 2021-09-30 Augmented reality cosmetic design filters
US17/491,061 2021-09-30
FRFR2113547 2021-12-15
FR2113547A FR3130423B1 (en) 2021-12-15 2021-12-15 COSMETIC DRAWING FILTERS IN AUGMENTED REALITY

Publications (2)

Publication Number Publication Date
WO2023056333A1 true WO2023056333A1 (en) 2023-04-06
WO2023056333A4 WO2023056333A4 (en) 2023-05-19

Family

ID=83693152

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/077230 WO2023056333A1 (en) 2021-09-30 2022-09-28 Augmented reality cosmetic design filters

Country Status (2)

Country Link
KR (1) KR20240037304A (en)
WO (1) WO2023056333A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100189357A1 (en) * 2006-10-24 2010-07-29 Jean-Marc Robin Method and device for the virtual simulation of a sequence of video images
EP3709637A1 (en) * 2019-03-13 2020-09-16 L'oreal Systems, devices, and methods for projecting digital content including hair color changes onto a user's head, face, or body
JP2021144582A (en) * 2020-03-13 2021-09-24 カシオ計算機株式会社 Makeup simulation device, makeup simulation method and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100189357A1 (en) * 2006-10-24 2010-07-29 Jean-Marc Robin Method and device for the virtual simulation of a sequence of video images
EP3709637A1 (en) * 2019-03-13 2020-09-16 L'oreal Systems, devices, and methods for projecting digital content including hair color changes onto a user's head, face, or body
JP2021144582A (en) * 2020-03-13 2021-09-24 カシオ計算機株式会社 Makeup simulation device, makeup simulation method and program

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PETER MOHR ET AL: "Retargeting Video Tutorials Showing Tools With Surface Contact to Augmented Reality", PROCEEDINGS OF THE 2017 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS , CHI '17, ACM PRESS, NEW YORK, NEW YORK, USA, 2 May 2017 (2017-05-02), pages 6547 - 6558, XP058337758, ISBN: 978-1-4503-4655-9, DOI: 10.1145/3025453.3025688 *
SOARES BORGES ALINE DE FATIMA ET AL: "A Virtual Makeup Augmented Reality System", 2019 21ST SYMPOSIUM ON VIRTUAL AND AUGMENTED REALITY (SVR), IEEE, 28 October 2019 (2019-10-28), pages 34 - 42, XP033645016, DOI: 10.1109/SVR.2019.00022 *
TREEPONG BANTITA ET AL: "The Development of an Augmented Virtuality for Interactive Face Makeup System", 21 February 2018, SAT 2015 18TH INTERNATIONAL CONFERENCE, AUSTIN, TX, USA, SEPTEMBER 24-27, 2015; [LECTURE NOTES IN COMPUTER SCIENCE; LECT.NOTES COMPUTER], SPRINGER, BERLIN, HEIDELBERG, PAGE(S) 614 - 625, ISBN: 978-3-540-74549-5, XP047465001 *

Also Published As

Publication number Publication date
KR20240037304A (en) 2024-03-21
WO2023056333A4 (en) 2023-05-19

Similar Documents

Publication Publication Date Title
US10860838B1 (en) Universal facial expression translation and character rendering system
CN109690617B (en) System and method for digital cosmetic mirror
JP3984191B2 (en) Virtual makeup apparatus and method
CN101779218B (en) Makeup simulation system, makeup simulation apparatus, makeup simulation method, and makeup simulation program
US8550818B2 (en) System and method for providing and modifying a personalized face chart
JP4435809B2 (en) Virtual makeup apparatus and method
WO2018221092A1 (en) Image processing device, image processing system, image processing method, and program
TWI573093B (en) Method of establishing virtual makeup data, electronic device having method of establishing virtual makeup data and non-transitory computer readable storage medium thereof
JP2020526809A (en) Virtual face makeup removal, fast face detection and landmark tracking
US20140160123A1 (en) Generation of a three-dimensional representation of a user
CN113628327B (en) Head three-dimensional reconstruction method and device
Karaoğlu et al. Self-supervised face image manipulation by conditioning GAN on face decomposition
CN116830073A (en) Digital color palette
JP2009039523A (en) Terminal device to be applied for makeup simulation
Rosin et al. Non-photorealistic rendering of portraits
US11544876B2 (en) Integrated cosmetic design applicator
KR101719927B1 (en) Real-time make up mirror simulation apparatus using leap motion
US20200126314A1 (en) Method and system of automated facial morphing for eyebrow hair and face color detection
US20230101374A1 (en) Augmented reality cosmetic design filters
KR102554058B1 (en) A virtual experience device that recommends customized styles to users
WO2023056333A1 (en) Augmented reality cosmetic design filters
EP4388503A1 (en) Augmented reality cosmetic design filters
US11321882B1 (en) Digital makeup palette
Ren et al. Make-a-character: High quality text-to-3d character generation within minutes
Zhao et al. Artistic rendering of portraits

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22790185

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280053466.0

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 20247005743

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020247005743

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2022790185

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022790185

Country of ref document: EP

Effective date: 20240319

NENP Non-entry into the national phase

Ref country code: DE