CN111507907B - System, method and storage medium for executing on computing device


Info

Publication number
CN111507907B
CN111507907B (application CN202010068688.2A)
Authority
CN
China
Prior art keywords
cosmetic effect
digital image
individual
light
incidence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010068688.2A
Other languages
Chinese (zh)
Other versions
CN111507907A
Inventor
郭家祯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Playboy Mobile Co ltd
Original Assignee
Playboy Mobile Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Playboy Mobile Co ltd
Publication of CN111507907A
Application granted
Publication of CN111507907B
Legal status: Active
Anticipated expiration

Classifications

    • G06T5/77
    • G06T5/94
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Abstract

The invention provides a system, a method, and a storage medium executed on a computing device. In the method executed on the computing device, the computing device obtains a digital image containing an individual and determines the lighting conditions of the content in the digital image. The computing device obtains, from a user, a selection of a cosmetic effect and determines the surface characteristics of the selected cosmetic effect. The computing device defines a region of interest on the face of the individual corresponding to the cosmetic effect using a face alignment technique. The computing device captures the lighting conditions in the region of interest and adjusts the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest. The computing device virtually applies the adjusted cosmetic effect to the region of interest in the digital image.

Description

System, method and storage medium for executing on computing device
Technical Field
The present invention relates to a system, a method and a storage medium for executing on a computing device, and more particularly, to a system and a method for virtually applying a cosmetic effect to a digital image according to lighting conditions and surface characteristics of the cosmetic effect.
Background
Because the lighting conditions on individuals differ from one digital image to another, virtually applied cosmetic effects often fail to produce a vivid, natural-looking result. Therefore, there is a need in the art for an improved system and method for virtually applying cosmetic effects under different lighting conditions.
Disclosure of Invention
The invention provides a method executed on a computing device, which includes the following steps. The computing device obtains a digital image containing an individual and determines the lighting conditions of the content in the digital image. The computing device obtains, from a user, a selection of a cosmetic effect and determines the surface characteristics of the selected cosmetic effect. The computing device defines a region of interest on the face of the individual corresponding to the cosmetic effect using a face alignment technique. The computing device captures the lighting conditions in the region of interest and adjusts the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest. The computing device virtually applies the adjusted cosmetic effect to the region of interest in the digital image.
Further, the step of determining the lighting conditions in the digital image further includes: evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the step of evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image further comprises: comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each predefined three-dimensional model having corresponding information related to the lighting conditions; identifying a most conforming three-dimensional model by comparing the shadow effect on the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the step of adjusting the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest further comprises: adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Further, the color of the cosmetic effect is adjusted in units of pixels in the region of interest.
Still further, the surface characteristics of the selected cosmetic effect include at least one of: a diffuse reflectance characteristic of the selected cosmetic effect; a specular reflection characteristic of the selected cosmetic effect; and a transparency of the selected cosmetic effect.
Further, the surface characteristics of the selected cosmetic effect are predefined and stored in a database.
The present invention provides a system executed on a computing device. The system includes a memory storing a plurality of instructions and a processor coupled to the memory and configured with the instructions. The processor obtains a digital image including an individual and determines the lighting conditions of the content in the digital image. The processor obtains, from a user, a selection of a cosmetic effect and determines the surface characteristics of the selected cosmetic effect. The processor defines a region of interest on the face of the individual corresponding to the cosmetic effect using a face alignment technique. The processor captures the lighting conditions in the region of interest and adjusts the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest. The processor virtually applies the adjusted cosmetic effect to the region of interest in the digital image.
Further, the processor determines the lighting conditions of the content in the digital image by evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the instructions for the processor to evaluate at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image further comprise: comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each predefined three-dimensional model having corresponding information related to the lighting conditions; identifying a most conforming three-dimensional model by comparing the shadow effect on the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the instructions for the processor to adjust the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest include: adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the processor adjusts the color of the cosmetic effect in units of pixels in the region of interest.
Still further, the surface characteristics of the selected cosmetic effect include at least one of: a diffuse reflectance characteristic of the selected cosmetic effect; a specular reflection characteristic of the selected cosmetic effect; and a transparency of the selected cosmetic effect.
Further, the surface characteristics of the selected cosmetic effect are predefined and stored in a database.
The present invention provides a non-transitory computer readable storage medium storing a plurality of instructions executable on a computing device having a processor. When the processor executes the instructions, the computing device obtains a digital image including an individual and determines the lighting conditions of the content in the digital image. The computing device obtains, from a user, a selection of a cosmetic effect and determines the surface characteristics of the selected cosmetic effect. The computing device defines a region of interest on the face of the individual corresponding to the cosmetic effect using a face alignment technique. The computing device captures the lighting conditions in the region of interest and adjusts the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest. The computing device virtually applies the adjusted cosmetic effect to the region of interest in the digital image.
Further, the processor determines the lighting conditions in the digital image by evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the step of the processor evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image further comprises: comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each predefined three-dimensional model having corresponding information related to the lighting conditions; identifying a most conforming three-dimensional model by comparing the shadow effect on the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the instructions for the processor to adjust the visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest include: adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
Still further, the processor adjusts the color of the cosmetic effect in units of pixels in the region of interest.
Still further, the surface characteristics of the selected cosmetic effect include at least one of: a diffuse reflectance characteristic of the selected cosmetic effect; a specular reflection characteristic of the selected cosmetic effect; and a transparency of the selected cosmetic effect.
For a further understanding of the nature and the technical aspects of the present invention, reference should be made to the following detailed description of the invention and the accompanying drawings, which are provided for purposes of reference only and are not intended to limit the invention.
Drawings
For a more complete understanding of the present invention, reference is made to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Moreover, in the several figures, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 illustrates a block diagram of a computing device virtually applying a cosmetic effect according to lighting conditions and surface characteristics of the cosmetic effect, in various embodiments of the invention.
FIG. 2 illustrates a schematic diagram of the computing device of FIG. 1 in various embodiments of the present invention.
FIG. 3 illustrates a top-level flowchart of functionality performed by the computing device of FIG. 1 for virtually applying a cosmetic effect according to lighting conditions and surface characteristics of the cosmetic effect, in accordance with various embodiments of the present invention.
FIG. 4 illustrates an example of a digital image obtained by the computing device of FIG. 1, in which a shadow effect is displayed on a portion of the individual's face, in accordance with various embodiments of the present invention.
FIG. 5 illustrates an exemplary embodiment of a user interface displayed by the computing device of FIG. 1, including an effects toolbar for a user to select a favorite cosmetic effect, in accordance with various embodiments of the present invention.
FIG. 6 illustrates an example of a predefined three-dimensional model that includes various shadow effects and is available to the computing device of FIG. 1 in various embodiments of the invention.
FIG. 7 illustrates an example of a three-dimensional model identified by the computing device of FIG. 1 as best conforming to the digital image of FIG. 4 in various embodiments of the invention.
FIG. 8 illustrates an exemplary application of a facial alignment technique to the digital image of FIG. 4 by the computing device of FIG. 1 to define a region of interest corresponding to a cosmetic effect, in various embodiments of the invention.
Fig. 9 illustrates an example of surface characteristics of a selected cosmetic effect in various embodiments of the present invention.
FIG. 10 illustrates an exemplary embodiment of the present invention in which the computing device of FIG. 1 adjusts visual characteristics of a selected cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the area of interest to produce a modified cosmetic effect.
Fig. 11 illustrates an example of adjusting a cosmetic effect in units of pixels in a region of interest in various embodiments of the present invention.
FIG. 12 illustrates an example of how the computing device of FIG. 1 recognizes a most conforming three-dimensional model in various embodiments of the invention.
Detailed Description
The following is a description of how the cosmetic effect is virtually applied according to the lighting conditions and the surface characteristics of the cosmetic effect, by way of specific examples. In various embodiments, the illumination condition around the individual in the digital image and the surface characteristics of the cosmetic effect to be applied to the face of the individual are analyzed, and then the visual characteristics of the cosmetic effect are adjusted according to the analysis result, and the adjusted cosmetic effect is applied to the face of the individual to provide a more realistic and natural representation.
A system for virtually applying a cosmetic effect according to the lighting conditions and the surface characteristics of the cosmetic effect will be described in detail below, together with the operation of the components in the system. FIG. 1 is a block diagram of a computing device 102 in which the disclosed embodiments may be implemented. Computing device 102 may be implemented as, for example but not limited to, a smartphone, a tablet computing device, or a notebook computer.
A virtual cosmetic application device 104 executes on a processor of computing device 102. The virtual cosmetic application device 104 includes a content analyzer 106, a cosmetic effect component 108, a target area analyzer 110, and a virtual effect applicator 112. The content analyzer 106 is configured to obtain a digital image including an individual and determine the lighting conditions of the content in the digital image.
As known to those skilled in the art, the digital image may be encoded in a file format such as, but not limited to: JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), PNG (Portable Network Graphics), GIF (Graphics Interchange Format), BMP (Bitmap), or other types of digital files. In addition, the digital image may be obtained from a frame of a video encoded in a format such as, but not limited to: MPEG-1 (Motion Picture Experts Group-1), MPEG-2, MPEG-4, H.264, 3GPP (Third Generation Partnership Project), 3GPP-2, SD-Video (Standard-Definition Video), HD-Video (High-Definition Video), DVD (Digital Versatile Disc) multimedia, VCD (Video Compact Disc) multimedia, HD-DVD (High-Definition Digital Versatile Disc) multimedia, DTV/HDTV (Digital Television Video/High-Definition Digital Television) multimedia, AVI (Audio Video Interleave), DV (Digital Video), QT (QuickTime) files, WMV (Windows Media Video), ASF (Advanced System Format), RM (Real Media), FLV (Flash Media), MP3 (MPEG Audio Layer III), MP2 (MPEG Audio Layer II), WAV (Waveform Audio Format), WMA (Windows Media Audio), 360-Degree Video, 3D Scan Model, or other types of digital formats.
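By way of a non-limiting illustration of the obtaining step, the short Python sketch below loads either a still image file or a single frame of a video as the digital image. The use of OpenCV and the file names shown are assumptions made purely for illustration; the embodiments described herein do not depend on any particular library.

```python
import cv2

# Load a still digital image (e.g., a JPEG or PNG file); the path is illustrative.
image = cv2.imread("individual_photo.jpg")  # returns an H x W x 3 BGR array, or None on failure

# Alternatively, obtain the digital image from one frame of a video file.
capture = cv2.VideoCapture("individual_clip.mp4")
ok, frame = capture.read()  # ok is False when no frame could be read
capture.release()
```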
In some embodiments, the content analyzer 106 determines the lighting conditions in the digital image by evaluating certain parameters. For example, the parameters may include an angle of incidence of light on the individual in the digital image, a light intensity, a color of incidence of light on the individual in the digital image, and so on. One or more of these parameters (e.g., color, intensity) may then be used to adjust the cosmetic effect before the adjusted cosmetic effect is applied to the face of the individual. In some embodiments, the lighting conditions of the digital image may be evaluated by comparing the shadow effect on the individual in the digital image with the shadow effects in a plurality of predefined three-dimensional models 118, the three-dimensional models 118 being stored in a database 116.
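As a minimal sketch of how the evaluated lighting parameters might be carried through the remaining steps, the following Python container groups the angle of incidence, light intensity, and light color. The class and field names are hypothetical and are not part of the disclosed embodiments.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LightingCondition:
    """Hypothetical container for the lighting parameters described above."""
    incidence_angle_deg: Tuple[float, float]   # (azimuth, elevation) of the incident light
    intensity: float                           # relative light intensity, e.g., in [0.0, 1.0]
    color_rgb: Tuple[float, float, float]      # color of the incident light, each channel in [0.0, 1.0]

# Example: warm light arriving from the upper left at moderate intensity.
estimated = LightingCondition(incidence_angle_deg=(135.0, 45.0),
                              intensity=0.8,
                              color_rgb=(1.0, 0.95, 0.85))
```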
Each three-dimensional model 118 has corresponding information associated with the lighting conditions, such as the angle of incidence of light on the individual, the light intensity, and the light color. The content analyzer 106 may identify a most conforming three-dimensional model 118 and evaluate the lighting conditions using the corresponding information of the most conforming three-dimensional model 118. How the content analyzer 106 identifies the most conforming three-dimensional model 118 will be described later in conjunction with fig. 12.
The cosmetic effect component 108 is configured to obtain, from a user, a selection of a cosmetic effect and to determine the surface characteristics of the selected cosmetic effect. In some embodiments, the surface characteristics 120 of various cosmetic effects are predefined and stored in database 116 for selection by a user. The surface characteristics 120 may include a diffuse reflectance characteristic of the selected cosmetic effect, a specular reflection characteristic of the selected cosmetic effect, and a transparency of the selected cosmetic effect. When the user selects a desired cosmetic effect, the cosmetic effect component 108 retrieves from the database 116 the surface characteristics 120 corresponding to the selected cosmetic effect.
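The database 116 of predefined surface characteristics could be realized, at its simplest, as a keyed lookup. The sketch below assumes a dictionary-backed store with hypothetical effect names and illustrative values; it is not the actual structure of database 116.

```python
from dataclasses import dataclass

@dataclass
class SurfaceCharacteristics:
    """Predefined optical properties of a cosmetic effect (values below are illustrative only)."""
    diffuse: float       # diffuse reflectance, in [0.0, 1.0]
    specular: float      # specular reflectance, in [0.0, 1.0]
    transparency: float  # 0.0 = fully opaque, 1.0 = fully transparent

# Hypothetical stand-in for database 116: effect name -> surface characteristics 120.
SURFACE_DB = {
    "matte_lipstick":  SurfaceCharacteristics(diffuse=0.90, specular=0.05, transparency=0.10),
    "glossy_lipstick": SurfaceCharacteristics(diffuse=0.60, specular=0.60, transparency=0.20),
    "blush":           SurfaceCharacteristics(diffuse=0.80, specular=0.10, transparency=0.50),
}

def get_surface_characteristics(effect_name: str) -> SurfaceCharacteristics:
    """Retrieve the predefined surface characteristics for the selected cosmetic effect."""
    return SURFACE_DB[effect_name]
```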
The target area analyzer 110 is configured to define a region of interest corresponding to the cosmetic effect on the face of the individual by using a face alignment technique. For example, if the selected cosmetic effect includes a lipstick effect, the region of interest will be located on the lips of the individual. The target area analyzer 110 is further configured to capture the lighting conditions in the region of interest, and the cosmetic effect component 108 may then adjust the visual characteristics of the cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest. Since the lighting conditions may differ from pixel to pixel within the region of interest, the cosmetic effect component 108 uses the pixels in the region of interest as the base unit for adjusting the cosmetic effect.
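One way the region of interest could be realized is by rasterizing a polygon around the facial landmarks returned by the face alignment step. The sketch below assumes the landmark coordinates for the relevant facial feature (e.g., the lips or a cheek) are already available from a face-alignment library of the reader's choice; the function name and inputs are assumptions, not the specific implementation of the target area analyzer 110.

```python
import cv2
import numpy as np

def region_of_interest_mask(image_shape, landmarks_xy):
    """Build a binary mask covering the polygon spanned by facial landmarks.

    image_shape  -- (height, width, ...) of the digital image
    landmarks_xy -- N x 2 array of pixel coordinates from any face-alignment library
    """
    mask = np.zeros(image_shape[:2], dtype=np.uint8)
    hull = cv2.convexHull(np.asarray(landmarks_xy, dtype=np.int32))
    cv2.fillConvexPoly(mask, hull, 255)   # 255 inside the region of interest
    return mask
```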
Fig. 2 shows an exemplary embodiment of computing device 102 in fig. 1. The computing device 102 may be implemented as any of a variety of wired or wireless computing devices, such as a desktop computer, a portable computer, a dedicated server computer (Dedicated Server Computer), a multiprocessor computing device (Multiprocessor Computing Device), a smartphone or tablet, and so forth. Referring to FIG. 2, the computing device 102 includes a memory 214, a processing device 202, a plurality of input/output interfaces 204, a network interface 206, a display 208, a peripheral interface 211, and a mass storage device 226, each of which is connected by a local data bus (Local Data Bus) 210.
The processing device 202 may include any custom-made or commercially available processor, a central processing unit (Central Processing Unit, CPU) or an auxiliary processor among several processors associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application-specific integrated circuits (Application Specific Integrated Circuits, ASICs), a plurality of suitably configured digital logic gates, and other well-known electrical configurations comprising discrete elements, both individually and in various combinations, for coordinating the overall operation of the computing system.
The memory 214 may include any one of volatile memory components (Volatile Memory Elements) or nonvolatile memory components (Nonvolatile Memory Elements). For example, the volatile memory components include random access memory (Random Access Memory, RAM), such as dynamic random access memory (Dynamic Random Access Memory, DRAM) or static random access memory (Static Random Access Memory, SRAM). The nonvolatile memory components include read-only memory (Read-Only Memory, ROM), hard disks, magnetic tapes, and compact disc read-only memory (Compact Disc Read-Only Memory, CD-ROM). The memory 214 typically includes a native operating system 216, one or more native applications (Native Application), emulation systems (Emulation System), or emulated applications (Emulated Application) for any of a variety of operating systems and/or emulated hardware platforms or emulated operating systems. For example, the aforementioned applications (i.e., native applications or emulated applications) may include application-specific software comprising some or all of the components of computing device 102 in FIG. 1. In such embodiments, the components are stored in the memory 214 and executed by the processing device 202, thereby enabling the processing device 202 to perform the operations/functions disclosed herein. The components in memory 214 are well known to those skilled in the art and, for brevity, are not described in further detail. In some embodiments, the components of computing device 102 may be implemented in hardware and/or software.
The input/output interfaces 204 provide any number of interfaces for inputting or outputting data. For example, when computing device 102 comprises a personal computer, these components may interface with one or more input/output interfaces 204, such as a keyboard and a mouse, as shown in FIG. 2. The display 208 may include a computer monitor, a plasma screen of a personal computer, a liquid crystal display (LCD) of a handheld device, a touch screen, or other display device.
In the present disclosure, a non-transitory computer readable medium stores a program for use by or in connection with an instruction execution system, apparatus, or device. Specific examples of the computer readable medium include, but are not limited to, a portable computer diskette, a random access memory, a read-only memory, an erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (Portable Compact Disc Read-Only Memory, CD-ROM).
Referring now to FIG. 3, FIG. 3 illustrates a flowchart 300 of the computing device 102 of FIG. 1 virtually applying a cosmetic effect based on lighting conditions and surface characteristics of the cosmetic effect, in accordance with various embodiments of the invention. The flowchart 300 of FIG. 3 merely provides an example of the many different types of functional arrangements that may be employed to implement the operation of the computing device 102 of FIG. 1. In other words, the flowchart 300 of FIG. 3 may be viewed as depicting steps of a method performed in the computing device 102 according to one or more embodiments.
Although a particular order of execution is shown in the flowchart 300 of FIG. 3, this order is provided merely to aid understanding, and the actual order of operation may differ from that depicted. For example, the order of execution of two or more blocks may be rearranged relative to the order shown. Furthermore, blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. All such variations are within the scope of the present disclosure.
At block 310, the computing device 102 obtains a digital image including an individual. At block 320, the computing device 102 determines lighting conditions for content in the digital image. In some embodiments, computing device 102 determines the lighting conditions of the digital image by evaluating an angle of incident light on an individual in the digital image, a light intensity, and/or a color of incident light on an individual in the digital image. In some embodiments, computing device 102 may also evaluate lighting conditions by comparing shadow effects displayed on the digital image with various shadow effects in a predefined three-dimensional model. Each three-dimensional model has corresponding information related to lighting conditions therein.
The computing device 102 may then identify the most conforming three-dimensional model by comparing the shadow effect displayed on the digital image with various shadow effects in the predefined three-dimensional model. The computing device 102 then retrieves corresponding information related to the lighting conditions from the most conforming three-dimensional model. In some embodiments, the corresponding information may include: the angle of incidence of light on the individual in the digital image, the intensity of light, and/or the color of incidence of light on the individual in the digital image.
In block 330, the computing device 102 obtains a selection from the user that includes a cosmetic effect. In block 340, computing device 102 determines a surface characteristic in the selected cosmetic effect. In some embodiments, the surface characteristics of the selected cosmetic effect include: a diffuse reflectance characteristic of the selected cosmetic effect, a specular reflectance characteristic of the selected cosmetic effect, and/or a transparency of the selected cosmetic effect. In some embodiments, the surface characteristics of the selected cosmetic effect may be predefined and stored in database 116 of fig. 1. In block 350, the computing device 102 defines a region of interest on a face of the individual corresponding to the cosmetic effect using a face alignment technique. In block 360, the computing device 102 captures an illumination condition of the region of interest.
In block 370, computing device 102 adjusts visual characteristics of the cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the area of interest. In some embodiments, computing device 102 adjusts the color in the cosmetic effect based on the angle of incident light on the individual in the digital image, the intensity of the light, and/or the color of incident light on the individual in the digital image to achieve the effect of adjusting the visual characteristic. In some embodiments, computing device 102 adjusts the color of the cosmetic effect in pixels in the area of interest. In block 380, the computing device 102 virtually applies the adjusted cosmetic effect to the region of interest in the digital image. Finally, the flowchart of fig. 3 ends.
For a further description of various aspects of the invention, reference is made to the following drawings. The computing device 102 of fig. 1 obtains a digital image 402 in which a portion of the individual's face is displayed with a shadow effect, as shown in fig. 4. As will be described further below, the selected cosmetic effect is adjusted according to the surface characteristics of the selected cosmetic effect and the lighting conditions of the digital image 402 before being applied to the individual's face.
Fig. 5 shows an example of a user interface 502 displayed by the computing device 102 of fig. 1. The user interface 502 displays a cosmetic effects toolbar 504, and the cosmetic effects toolbar 504 allows the user to select favorite cosmetic effects. In fig. 5, the user has selected a blush effect to apply to the digital image 402.
As previously described, the content analyzer 106 of FIG. 1 determines the lighting conditions of the digital image 402 by evaluating the angle of incidence of light on the individual in the digital image, the light intensity, and/or the color of incidence of light on the individual in the digital image. In some embodiments, this information may be evaluated by comparing the shadow effect on the individual in the digital image with the various shadow effects in the predefined three-dimensional models 118, which may be as shown in FIG. 6. For convenience of illustration, six three-dimensional models 118 with different lighting conditions are shown in fig. 6; however, the kinds of lighting conditions are not limited to these, and three-dimensional models 118 with other lighting conditions may also be stored in the database 116. Each three-dimensional model 118 has corresponding information associated with its lighting conditions, and different corresponding information results in different shadow effects. Referring to fig. 7, the content analyzer 106 of fig. 1 can identify the most conforming three-dimensional model 118 after comparing the shadow effect displayed on the individual in the digital image with the various shadow effects in the predefined three-dimensional models. The content analyzer 106 then obtains the corresponding information related to the lighting conditions from the most conforming three-dimensional model 118.
Referring to fig. 8, the target area analyzer 110 of fig. 1 may define regions of interest 802, 804 corresponding to the cosmetic effect on the individual's face in the digital image 402 by using a face alignment technique. In the example of fig. 8, the user has selected a blush effect, and thus the target area analyzer 110 defines the individual's cheeks as the regions of interest 802, 804 based on the selected blush effect.
As described above, the visual characteristics of the selected cosmetic effect are adjusted according to the lighting conditions of the region of interest and the surface characteristics of the selected cosmetic effect, and then the adjusted cosmetic effect is applied to the region of interest. Referring to fig. 9, the surface characteristics of the selected cosmetic effect may include, but are not limited to: diffuse reflectance properties 902 of the selected cosmetic effect, specular reflectance properties 904 of the selected cosmetic effect, and/or transparency of the selected cosmetic effect.
The diffuse reflectance characteristic 902 of the selected cosmetic effect generally describes how light striking a surface is scattered at many different angles. The specular reflection characteristic 904 of the selected cosmetic effect generally describes how light striking a surface is reflected in a mirror-like manner. Transparency generally describes the property of a material that allows light to pass through without scattering. Another possible surface characteristic is subsurface scattering, which generally refers to light penetrating the surface of a translucent object, interacting with the material, and exiting the surface at different angles.
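To illustrate how such surface characteristics could interact with an estimated lighting condition, the sketch below shades the base color of a cosmetic effect with a simple diffuse-plus-specular term, reusing the hypothetical LightingCondition and SurfaceCharacteristics containers sketched earlier. The weighting scheme and the fixed shininess exponent are assumptions for illustration; the embodiments do not prescribe a particular shading model.

```python
import numpy as np

def adjust_effect_color(base_rgb, surface, lighting, n_dot_l, v_dot_r):
    """Adjust the color of a cosmetic effect at one pixel.

    base_rgb -- base color of the effect, floats in [0, 1]
    surface  -- SurfaceCharacteristics (diffuse, specular, transparency)
    lighting -- LightingCondition (intensity, color_rgb)
    n_dot_l  -- cosine between the surface normal and the light direction at this pixel
    v_dot_r  -- cosine between the view direction and the reflected light direction
    """
    base = np.asarray(base_rgb, dtype=np.float32)
    light = np.asarray(lighting.color_rgb, dtype=np.float32) * lighting.intensity

    diffuse = surface.diffuse * max(n_dot_l, 0.0) * base * light       # scattered component
    specular = surface.specular * (max(v_dot_r, 0.0) ** 16) * light    # mirror-like highlight
    return np.clip(diffuse + specular, 0.0, 1.0)
```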
Referring to fig. 10, the cosmetic effect component 108 adjusts the visual characteristics of the selected cosmetic effect according to the surface characteristics 120 of the selected cosmetic effect and the lighting conditions 1004 of the region of interest to produce a modified cosmetic effect 1006. Next, the virtual effect applicator 112 virtually applies the modified cosmetic effect 1006 to the regions of interest 1012, 1014 of the digital image 402. Notably, the cosmetic effect component 108 adjusts the visual characteristics of the selected cosmetic effect in units of pixels in the region of interest. In the digital image 402 shown in fig. 4, a large shadow is cast over one side of the face, and thus the visual characteristics of the selected cosmetic effect (e.g., the blush effect) differ between the two regions of interest 1012, 1014.
Further, referring to fig. 11, fig. 11 illustrates that, in various embodiments of the present invention, the cosmetic effect is modified in units of pixels in the region of interest 1104. In the example shown in fig. 11, the selected cosmetic effect includes a lipstick effect. Based on this selection, the target area analyzer 110 of FIG. 1 defines a region of interest 1104 that includes the lips of the individual. As shown in fig. 11, a portion of the individual's lips lies under shadow 1102; because the visual characteristics of the lipstick effect are adjusted pixel by pixel, those characteristics vary across the region of interest 1104. After the visual characteristics of the lipstick effect are adjusted, the adjusted lipstick effect is applied to the region of interest 1104.
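A per-pixel application of the adjusted lipstick effect could resemble the sketch below, which darkens the effect wherever the local luminance of the region of interest is low (for example, under shadow 1102) and blends it into the digital image through the effect's transparency. The luminance-scaling heuristic is an assumption for illustration and is not the specific formula used by the embodiments.

```python
import cv2
import numpy as np

def apply_effect_per_pixel(image_bgr, roi_mask, effect_rgb, transparency):
    """Blend a cosmetic effect into the region of interest, pixel by pixel.

    image_bgr    -- uint8 BGR digital image
    roi_mask     -- uint8 mask, 255 inside the region of interest and 0 elsewhere
    effect_rgb   -- adjusted effect color, floats in [0, 1]
    transparency -- surface transparency of the effect, 0.0 (opaque) to 1.0
    """
    img = image_bgr.astype(np.float32) / 255.0
    luminance = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0

    # Scale the effect by the local luminance so pixels under shadow stay darker.
    effect_bgr = np.array(effect_rgb[::-1], dtype=np.float32)
    shaded = luminance[..., None] * effect_bgr               # H x W x 3, shaded per pixel

    alpha = (1.0 - transparency) * (roi_mask.astype(np.float32) / 255.0)[..., None]
    blended = (1.0 - alpha) * img + alpha * shaded
    return (blended * 255.0).astype(np.uint8)
```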
FIG. 12 illustrates an example of how the computing device 102 of FIG. 1 recognizes a most conforming three-dimensional model in various embodiments of the invention. In some embodiments, the content analyzer 106 executing on the computing device 102 (FIG. 1) may analyze the individuals in the digital image 402 and identify the three-dimensional model 118 that best conforms to the digital image 402 based on various lighting conditions or shadow effects of the three-dimensional model 118. The predefined three-dimensional models 118 each have different lighting conditions, as shown in fig. 6.
As shown in fig. 12, the content analyzer 106 first converts the digital image 402 of the individual into a luminance image 1202. The content analyzer 106 then constructs a three-dimensional mesh model 1204 from the luminance image 1202. For each predefined three-dimensional model 118, the content analyzer 106 processes the three-dimensional mesh model 1204 associated with the individual and derives a plurality of vertices (1 to n) from corresponding points on the three-dimensional mesh model of the current three-dimensional model 118. In detail, the content analyzer 106 compares the luminance values l_1 to l_n at the image points corresponding to the vertices (1 to n) to determine the luminance correlation between image A and image B, where image A corresponds to the three-dimensional mesh model 1204 associated with the individual and image B corresponds to the three-dimensional mesh model of the three-dimensional model 118 currently being processed. The three-dimensional model 118 whose luminance values l_1 to l_n at each image point have the highest similarity with those of the individual's three-dimensional mesh model 1204 is determined to be the most conforming three-dimensional model 118.
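The vertex-wise comparison described above can be expressed as a correlation between two sets of luminance samples. The sketch below assumes the luminance values l_1 to l_n at corresponding mesh points have already been sampled for the individual and for each candidate three-dimensional model; only the scoring and selection step is shown, and Pearson correlation is used here as one plausible similarity measure.

```python
import numpy as np

def most_conforming_model(individual_luma, model_lumas):
    """Select the predefined 3-D model whose luminance profile best matches the individual.

    individual_luma -- 1-D array of luminance samples l_1..l_n taken at mesh vertices
    model_lumas     -- list of 1-D arrays, one per predefined model, sampled at the
                       corresponding vertices of that model's mesh
    """
    scores = [np.corrcoef(individual_luma, candidate)[0, 1] for candidate in model_lumas]
    best_index = int(np.argmax(scores))
    return best_index, scores   # index of the most similar model and all similarity scores
```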
The foregoing disclosure describes only preferred embodiments of the present invention and is not intended to limit the scope of the claims; all equivalent technical changes made using the specification and the accompanying drawings of the present invention are included within the scope of the claims.

Claims (14)

1. A method performed on a computing device, the method performed on the computing device comprising:
obtaining a digital image including an individual;
determining lighting conditions of content in the digital image, wherein the step of determining the lighting conditions of the content in the digital image comprises:
evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image, wherein the evaluating further comprises:
comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each of the predefined three-dimensional models having corresponding information related to lighting conditions;
identifying a most conforming three-dimensional model by comparing the shadow effect of the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and
obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image;
obtaining, from a user, a selection comprising a cosmetic effect;
determining a surface property of the selected cosmetic effect;
defining a region of interest corresponding to the cosmetic effect on a face of the individual by using a face alignment technique;
capturing lighting conditions in the region of interest;
adjusting visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest; and
virtually applying the adjusted cosmetic effect to the region of interest in the digital image.
2. The method of claim 1, wherein the step of adjusting the visual characteristics of the cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest further comprises:
adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
3. The method of claim 2, wherein the color of the cosmetic effect is adjusted in units of pixels in the region of interest.
4. The method of claim 1, wherein the surface characteristics of the selected cosmetic effect comprise at least one of:
a diffuse reflectance characteristic of the selected cosmetic effect;
a specular reflection characteristic of the selected cosmetic effect; and
a transparency of the selected cosmetic effect.
5. The method of claim 1, wherein the surface characteristics of the selected cosmetic effect are predefined and stored in a database.
6. A system executing on a computing device, the system executing on the computing device comprising:
a memory storing a plurality of instructions; and
a processor coupled to the memory and configured with the plurality of instructions, the plurality of instructions comprising at least:
obtaining a digital image including an individual;
determining lighting conditions of content in the digital image, wherein the step of determining the lighting conditions of the content in the digital image comprises:
evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image, wherein the evaluating further comprises:
comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each of the predefined three-dimensional models having corresponding information related to lighting conditions;
identifying a most conforming three-dimensional model by comparing the shadow effect of the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and
obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image;
obtaining, from a user, a selection comprising a cosmetic effect;
determining a surface property of the selected cosmetic effect;
defining a region of interest corresponding to the cosmetic effect on a face of the individual by using a face alignment technique;
capturing lighting conditions in the region of interest;
adjusting visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest; and
virtually applying the adjusted cosmetic effect to the region of interest in the digital image.
7. The system of claim 6, wherein the instructions for the processor to adjust the visual characteristics of the cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest comprise:
adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
8. The system of claim 6, wherein the processor adjusts the color of the cosmetic effect in units of pixels in the region of interest.
9. The system of claim 6, wherein the surface characteristics of the selected cosmetic effect include at least one of:
a diffuse reflectance characteristic of the selected cosmetic effect;
a specular reflection characteristic of the selected cosmetic effect; and
a transparency of the selected cosmetic effect.
10. The system of claim 6, wherein the surface characteristics of the selected cosmetic effect are predefined and stored in a database.
11. A non-transitory computer readable storage medium storing a plurality of instructions which, when executed on a computing device having a processor, cause the computing device to at least perform:
obtaining a digital image including an individual;
determining lighting conditions of content in the digital image, wherein the step of determining the lighting conditions of the content in the digital image comprises:
evaluating at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image, wherein the evaluating further comprises:
comparing a shadow effect on the individual in the digital image with shadow effects in a plurality of predefined three-dimensional models, each of the predefined three-dimensional models having corresponding information related to lighting conditions;
identifying a most conforming three-dimensional model by comparing the shadow effect of the individual in the digital image with the shadow effects of the plurality of predefined three-dimensional models; and
obtaining, from the most conforming three-dimensional model, the corresponding information related to the lighting conditions, wherein the corresponding information comprises at least one of: an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image;
obtaining, from a user, a selection comprising a cosmetic effect;
determining a surface property of the selected cosmetic effect;
defining a region of interest corresponding to the cosmetic effect on a face of the individual by using a face alignment technique;
capturing lighting conditions in the region of interest;
adjusting visual characteristics of the cosmetic effect according to the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest; and
virtually applying the adjusted cosmetic effect to the region of interest in the digital image.
12. The non-transitory computer readable storage medium of claim 11, wherein the instructions for the processor to adjust the visual characteristics of the cosmetic effect based on the surface characteristics of the cosmetic effect and the lighting conditions of the region of interest comprise:
adjusting a color of the cosmetic effect according to at least one of an angle of incidence of light on the individual in the digital image, a light intensity, and a color of incidence of light on the individual in the digital image.
13. The non-transitory computer readable storage medium of claim 12, wherein the processor adjusts the color of the cosmetic effect in units of pixels in the region of interest.
14. The non-transitory computer readable storage medium of claim 11, wherein the surface characteristics of the selected cosmetic effect include at least one of:
a diffuse reflectance characteristic of the selected cosmetic effect;
a specular reflection characteristic of the selected cosmetic effect; and
a transparency of the selected cosmetic effect.
CN202010068688.2A 2019-01-30 2020-01-21 System, method and storage medium for executing on computing device Active CN111507907B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962798784P 2019-01-30 2019-01-30
US62/798,784 2019-01-30

Publications (2)

Publication Number Publication Date
CN111507907A CN111507907A (en) 2020-08-07
CN111507907B true CN111507907B (en) 2023-05-30

Family

ID=71863898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010068688.2A Active CN111507907B (en) 2019-01-30 2020-01-21 System, method and storage medium for executing on computing device

Country Status (1)

Country Link
CN (1) CN111507907B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ATE416434T1 (en) * 2002-04-12 2008-12-15 Agency Science Tech & Res ROBUST FACE REGISTRATION VIA MULTIPLE FACE PROTOTYPE SYNTHESIS

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101371272A (en) * 2006-01-17 2009-02-18 株式会社资生堂 Makeup simulation system, makeup simulation device, makeup simulation method and makeup simulation program
US9449412B1 (en) * 2012-05-22 2016-09-20 Image Metrics Limited Adaptive, calibrated simulation of cosmetic products on consumer devices
CN107273837A (en) * 2017-06-07 2017-10-20 广州视源电子科技股份有限公司 The method and system virtually made up
CN109191569A (en) * 2018-09-29 2019-01-11 深圳阜时科技有限公司 A kind of simulation cosmetic device, simulation cosmetic method and equipment

Also Published As

Publication number Publication date
CN111507907A (en) 2020-08-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant