WO2021169307A1 - Makeup try-on processing method and apparatus for face image, computer device and storage medium - Google Patents

Makeup try-on processing method and apparatus for face image, computer device and storage medium Download PDF

Info

Publication number
WO2021169307A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
makeup
tried
pixel
pixel value
Prior art date
Application number
PCT/CN2020/119543
Other languages
English (en)
French (fr)
Inventor
孙宇超
王东媛
姚聪
Original Assignee
北京旷视科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京旷视科技有限公司 filed Critical 北京旷视科技有限公司
Publication of WO2021169307A1 publication Critical patent/WO2021169307A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text

Definitions

  • This application relates to the field of image processing technology, and in particular to a method, device, computer equipment, and storage medium for trial makeup processing of face images.
  • Existing electronic applications for displaying makeup try-on effects usually use a virtual try-on algorithm to show the face image after makeup has been applied.
  • Such virtual try-on algorithms usually take a texture-mapping approach: they first detect the location where makeup is needed and then directly cover that location with a preset try-on texture map to present the try-on effect.
  • a makeup processing method for a face image comprising:
  • the mask image and the image of the part to be tried on are merged to obtain the try on image.
  • the fusion of the mask image and the image of the part to be tried on to obtain the try on image includes:
  • the makeup image is determined.
  • determining the makeup image according to the pixel value of the first pixel, the pixel value of the second pixel, and the mask image includes:
  • the pixel value of the target first pixel point is obtained through the first fusion method
  • the pixel value of the target second pixel point is obtained through the second fusion method
  • the makeup image is determined.
  • the first fusion method is a color filter method
  • the second fusion method is a multiply blend
  • the method before fusing the mask image and the image of the makeup site to be tried, the method further includes:
  • the mask image and the processed image of the part to be tried on are merged.
  • the method before fusing the mask image and the image of the makeup site to be tried, the method further includes:
  • the adjustment parameters include saturation and/or brightness
  • the mask image after adjusting the parameters and the image of the makeup part to be tried are merged.
  • acquiring the image of the makeup site to be tried on the face image includes:
  • the dense key points that represent the boundary of the makeup part to be tested are determined
  • the method further includes:
  • the trial makeup image is merged with the face image to obtain a complete trial makeup image.
  • obtaining the mask image of the image of the makeup site to be tried includes:
  • makeup parameters include color parameters, gloss parameters, saturation parameters, brightness parameters, sequin parameters, and/or highlight parameters;
  • a makeup processing device for a face image includes:
  • the first obtaining module is used to obtain the image of the makeup part to be tried on the face image
  • the second obtaining module is used to obtain a mask image of the image of the makeup part to be tried; the mask image is used to render the image of the makeup part to be tried;
  • the fusion module is used to fuse the mask image and the image of the makeup part to be tried to obtain the makeup image.
  • a computer device in a third aspect, includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the method described in the first aspect when the computer program is executed.
  • a computer-readable storage medium has a computer program stored thereon, and the computer program implements the steps of the method described in the first aspect when the computer program is executed by a processor.
  • the above-mentioned method, apparatus, computer device, and storage medium for makeup try-on processing of a face image obtain the image of the makeup site to be tried on the face image, obtain a mask image of that image, and fuse the mask image with the image of the makeup site to be tried to obtain a try-on image. Because the try-on image obtained in this way fuses the original skin color of the image of the makeup site to be tried with the color of the makeup product carried by the mask image, it produces a realistic try-on effect.
  • Compared with the traditional approach of displaying the try-on effect by directly overlaying a preset patch, this makeup try-on processing method incorporates the skin-color information of the site to be made up during the try-on process, achieving a "different color for every face" result that better matches the real process of applying makeup and greatly improves the realism of the final displayed try-on effect.
  • Figure 1 is a diagram of the internal structure of a computer device in an embodiment
  • FIG. 2 is a schematic flow chart of a makeup processing method for a face image in an embodiment
  • FIG. 3 is a schematic flowchart of step S103 in an embodiment
  • FIG. 4 is a schematic flowchart of step S203 in an embodiment
  • FIG. 5 is a schematic flow chart of a makeup processing method for a face image in an embodiment
  • Fig. 6 is a schematic flowchart of step S101 in an embodiment
  • FIG. 7 is a schematic flowchart of step S502 in an embodiment
  • FIG. 8 is a schematic flowchart of step S102 in an embodiment
  • FIG. 9 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • FIG. 10 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • FIG. 11 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • Fig. 12 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • Fig. 13 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • FIG. 14 is a structural block diagram of a makeup processing device for a face image in an embodiment
  • Fig. 15 is a structural block diagram of an apparatus for trial makeup processing of a face image in an embodiment.
  • a computer device is provided.
  • the computer device may be a terminal or a server, and its internal structure diagram may be as shown in FIG. 1.
  • the computer equipment includes a processor, a memory, a communication interface, a display screen and an input device connected through a system bus.
  • the processor of the computer device is used to provide calculation and control capabilities.
  • the memory of the computer device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and a computer program.
  • the internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium.
  • the communication interface of the computer device is used to communicate with an external terminal in a wired or wireless manner, and the wireless manner can be implemented through WIFI, an operator's network, NFC (near field communication) or other technologies.
  • the computer program is executed by the processor to realize a makeup processing method of a face image.
  • the display screen of the computer device can be a liquid crystal display or an electronic ink display, and the input device of the computer device can be a touch layer covering the display screen, a button, trackball, or touchpad on the housing of the computer device, or an external keyboard, touchpad, or mouse.
  • FIG. 1 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied.
  • the specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • a method for processing face images is provided.
  • the method is applied to the computer device in FIG. 1 as an example.
  • this embodiment concerns the specific way the computer device processes the image of the makeup site to be tried; the method includes the following steps:
  • S101 Acquire an image of a part to be tried on makeup on a face image.
  • the image of the makeup site to be tried is a photo or video image obtained by photographing a face with a device that has a photo or video capture function, such as a mobile phone, camera, or video camera
  • the site to be made up can be any part of the face, for example the eyes, lips, or cheeks
  • the computer device can capture a face image through a connected or built-in camera and obtain the image of the makeup site to be tried from that face image; optionally, the computer device can also directly photograph the facial site to be made up with the camera to obtain the image; optionally, the computer device can also import a face image captured by the user's mobile phone or another recording device and obtain the image of the makeup site to be tried from that face image.
  • the computer device may also obtain the image of the makeup site to be tried in other ways, for example by downloading it directly from the Internet; this embodiment is not limited in this respect.
  • S102 Obtain a mask image of the image of the makeup part to be tried, and the mask image is used to render the image of the makeup part to be tried.
  • the mask image can also be called a cover image, which is usually used when rendering an image.
  • the computer device when it obtains the image of the part to be tried on makeup based on the above steps, it can generate a mask image of the image of the part to be tried on makeup by setting the parameters for applying makeup.
  • the makeup parameters can be the color, gloss, brightness, sequins, highlights, etc. of various types of beauty products.
  • the makeup parameters can be the parameters selected by the user according to actual application needs and preferences.
  • for example, when the user needs to apply makeup to the lips, the user can enter a favorite brand of beauty product, or a favorite makeup color, on the interface of the computer device; the computer device then generates a mask image of the lip image according to the attributes of the selected beauty product or color, so that the computer device can later use this mask image to present the makeup effect on the lips.
  • the mask image of the image of the part to be tried on makeup and the image of the part to be tried on makeup can be further fused to render the part of the face to be tried on makeup.
  • the above-mentioned fusion processing process realizes the fusion of the attributes such as the color of the trial makeup product in the mask image on the basis of the original skin color in the image of the makeup site to be tried, so that the final makeup image obtained is the attribute of the makeup trial product and the skin color of the makeup site to be tried. The result of fusion.
  • the above-mentioned makeup try-on processing method for a face image includes obtaining the image of the makeup site to be tried on the face image, obtaining a mask image of that image, and then fusing the mask image with the image of the makeup site to be tried to obtain a try-on image. Because the try-on image obtained in this way fuses the original skin color of the image of the makeup site to be tried with the color of the makeup product carried by the mask image, it produces a realistic try-on effect.
  • Compared with the traditional approach of displaying the try-on effect by directly overlaying a preset patch, this method incorporates the skin-color information of the site to be made up during the try-on process, achieving a "different color for every face" result that better matches the real process of applying makeup and greatly improves the realism of the final displayed try-on effect.
  • in one embodiment, the above step S103, "fuse the mask image and the image of the makeup site to be tried to obtain the try-on image", includes:
  • S201 Determine the pixel value of the first pixel on the image of the makeup site to be tried which is greater than or equal to a preset threshold.
  • the preset threshold may be any value from 0 to 255, and the specific value may be determined according to actual application requirements.
  • the preset threshold in this embodiment may be 128.
  • the first pixel represents the pixel in the high-brightness area on the image of the makeup site to be tried.
  • when the computer device obtains the image of the makeup site to be tried, it can extract the first pixels whose values are greater than or equal to the preset threshold, that is, the pixels in the high-brightness area, so that these first pixels can later be fused with the corresponding pixels of the mask image.
  • S202 Determine a pixel value of a second pixel point that is smaller than a preset threshold on the image of the makeup site to be tried.
  • the above-mentioned second pixels represent the pixels in the low-brightness area of the image of the makeup site to be tried.
  • when the computer device obtains the image of the makeup site to be tried, it can extract the second pixels whose values are smaller than the preset threshold, that is, the pixels in the low-brightness area, so that these second pixels can later be fused with the corresponding pixels of the mask image.
  • S203 Determine a makeup trial image according to the pixel value of the first pixel, the pixel value of the second pixel, and the mask image.
  • when the computer device has obtained the first and second pixels, it can fuse the pixel value of each first pixel with the pixel value of the corresponding pixel of the mask image, and the pixel value of each second pixel with the pixel value of the corresponding pixel of the mask image, to obtain the try-on image.
  • the above step S203 "determine a makeup image based on the pixel value of the first pixel, the pixel value of the second pixel, and the mask image" includes:
  • the computer device can split the color of the image of the makeup site to be tried into the three channels R, G, and B and adjust each channel separately. Specifically, the areas of the image whose color values are below a preset threshold color can be adjusted with the first fusion method; the preset threshold color may, for example, be 128.
  • the above-mentioned first fusion method may be a screen (color filter) blend, which can be expressed by the following relation (1), i.e. the standard screen blend for 8-bit channels, giving the pixel value of the target first pixel:
  • C = 255 − (255 − A) × (255 − B) / 255    (1)
  • where C is the pixel value of the target first pixel, i.e. its R, G, B channel values; A is the pixel value of the first pixel of the image of the makeup site to be tried, i.e. its R, G, B channel values; and B in relation (1) is the pixel value of the mask image corresponding to the first pixel, i.e. the R, G, B channel values of that mask pixel.
  • when the computer device has determined the pixel value of the first pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image, it substitutes them into variables A and B of relation (1) to obtain the pixel value C of the target first pixel.
  • the areas of the image of the makeup site to be tried whose color values are above the preset threshold color can be adjusted with the second fusion method.
  • the above-mentioned second fusion method may be a multiply blend, which can be expressed by the following relation (2), i.e. the standard multiply blend for 8-bit channels, giving the pixel value of the target second pixel:
  • C = A × B / 255    (2)
  • where C is the pixel value of the target second pixel, i.e. its R, G, B channel values; A is the pixel value of the second pixel of the image of the makeup site to be tried, i.e. its R, G, B channel values; and B in relation (2) is the pixel value of the mask image corresponding to the second pixel, i.e. the R, G, B channel values of that mask pixel.
  • when the computer device has determined the pixel value of the second pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image, it substitutes them into variables A and B of relation (2) to obtain the pixel value C of the target second pixel.
  • S303 Determine the makeup trial image according to the pixel value of the target first pixel and the pixel value of the target second pixel.
  • once the computer device has obtained the pixel values of the target first pixels and the target second pixels through S301 and S302, it can determine the pixel values of all pixels of the try-on image from them, that is, obtain the try-on image.
  • optionally, the computer device can also perform the above fusion pixel by pixel: for each pixel of the image to be made up it checks the magnitude of the pixel value, selects relation (1) or (2) accordingly, and substitutes that pixel value together with the corresponding mask pixel value into the selected relation; computing this for every pixel yields the pixel values of the try-on image, which is equivalent to obtaining the try-on image.
  • a GPU can be used for acceleration to increase the rate at which the try-on image is obtained and achieve real-time processing.
  • as a result, the makeup effect on the try-on image is more realistic and closer to the effect of actually applied makeup.
  • S401: Blur the image of the makeup site to be tried to obtain an intermediate image of the makeup site to be tried.
  • when the computer device has obtained the image of the makeup site to be tried, it can further blur it, for example with Gaussian blur, to smooth the pixel values across the image.
  • after the blur processing, the processed image, that is, the intermediate image of the makeup site to be tried, is obtained.
  • S402: Obtain the pixel difference between the pixel value of each pixel of the image of the makeup site to be tried and the pixel value of the corresponding pixel of the intermediate image.
  • when the computer device has obtained the intermediate image through step S401, it can compute the difference between the pixel values of the image of the makeup site to be tried and those of the intermediate image, obtaining, for each pixel, the pixel difference between the two images.
  • in practice, pixels of the image of the makeup site to be tried whose values exceed those of the intermediate image may correspond to highlight positions; once such positions are identified, the computer device can attenuate the highlight at those pixels, thereby removing the highlights from the image of the makeup site to be tried.
  • when the computer device has obtained the pixel differences through step S402, it compares each difference with the preset pixel threshold and applies brightness attenuation to the pixels whose difference exceeds the threshold, that is, it lowers the pixel values of those pixels and thus weakens their brightness.
  • the image processed in this way is the image of the makeup site to be tried with its highlight brightness attenuated.
  • S103 "fuse the mask image and the image of the makeup site to be tried” includes: fuse the mask image with the processed image of the makeup site to be tried.
  • when the computer device has obtained the processed image of the makeup site to be tried, it can fuse it with the mask image according to the method described in the preceding embodiments to obtain the try-on image. Because the processed image has been blurred, that is, its highlights have been attenuated, the try-on image can present a matte makeup effect.
  • in another scenario, if the color of the selected beauty product differs greatly from the color of the site to be made up, the mask image needs some processing before fusion so that the makeup effect on the fused image better matches a real makeup effect.
  • this processing step adjusts the corresponding parameters of the mask image according to preset adjustment parameter values to obtain a parameter-adjusted mask image; the adjustment parameters include saturation and/or brightness.
  • when the computer device acquires the mask image, it can adjust the saturation, brightness, and similar parameters of the mask image according to the preset adjustment parameter values, so that the difference between these attributes of the parameter-adjusted mask image and the attributes of the image of the makeup site to be tried is reduced; when the adjusted mask image is later used for fusion, the makeup effect on the fused image is more realistic.
  • S103 "fuse the mask image and the image of the makeup site to be tried” includes: fusing the mask image after adjusting the parameters with the image of the makeup site to be tried.
  • the parameter-adjusted mask image and the image of the makeup site to be tried can be fused according to the method described in the preceding embodiments to obtain the try-on image. Because the mask image has been adjusted according to actual needs, the try-on image avoids the unrealistic makeup effect that a large difference between the mask image and the image of the makeup site to be tried would otherwise cause.
  • the specific implementation manner of the above step S101 "obtain the image of the makeup site to be tried on the face image" includes:
  • S501 Obtain a face image through an image acquisition device.
  • the image acquisition device may be a video camera, a still camera, a mobile phone, or another device with a photo or video capture function.
  • the image acquisition device can acquire a face image, and can also acquire a video containing face images.
  • the computer device can obtain a face image or a video through the image acquisition device; when a video is obtained, it can be split into frames containing faces according to the frame rate. Note that when the computer device receives a video stream, it can apply the method described in the preceding embodiments to every frame in real time to obtain a video with the makeup applied.
  • S502 Determine an image of a part to be tried on makeup according to the face image.
  • when the computer device obtains the face image, it can further determine the image of the makeup site to be tried on it. Specifically, it can detect the site to be made up on the face image with a corresponding detection method and obtain the image of that site. Optionally, it can also determine the site to be made up according to an instruction entered by the user, and thereby obtain the image of the makeup site to be tried.
  • the specific implementation of the above step S502 "determine the image of the makeup site to be tried on the face image” or the above step S101 "obtain the image of the makeup site to be tried on the face image” includes :
  • S601 Perform key point detection on the face image to obtain key points on the face image.
  • the key points can represent the key points of the facial features on the face image, and can also represent the key points of other parts on the face image.
  • the computer device may use a deep learning algorithm or other detection methods to detect the key points on the face image to obtain the key points on the face image.
  • S602 Determine, according to the key points, the dense key points representing the boundary of the makeup site to be tried.
  • when the computer equipment has obtained the key points of the face image, it can further detect them with a dense key-point detection method or another detection method to obtain dense key points, through which the true boundary or position of the site to be made up can be obtained accurately.
  • S603: Determine the image of the makeup site to be tried according to the dense key points.
  • because the dense key points give the true boundary or position of the site to be made up, the image of the makeup site to be tried can be determined from them. For example, if the actual boundary of the eye region is obtained accurately through the dense key points, the image can be determined from the face image by the position of that boundary.
  • the makeup try-on processing method for a face image further includes the step of fusing the try-on image with the face image to obtain a complete try-on image.
  • when the computer device has applied makeup to the face image according to the method described in the preceding embodiments and obtained the try-on image, it can also fuse the try-on image, which shows the try-on effect, with the original complete face image to obtain a complete try-on image, so that the user can experience the try-on effect more comprehensively from it.
  • the specific implementation manner of the above step S102 "obtain the mask image of the image of the makeup site to be tried" includes:
  • S701 Acquire preset makeup parameters; makeup parameters include color parameters, gloss parameters, saturation parameters, brightness parameters, sequin parameters, and/or highlight parameters.
  • the makeup parameters can include parameters that reflect the attributes of various beauty products. Specifically, they can include color parameters, gloss parameters, saturation parameters, brightness parameters, sequin parameters, and/or highlight parameters of various beauty products.
  • the makeup parameters of the eyes can specifically include the color of the eye shadow, the sequins of the eye shadow, the gradation effect of the eye shadow, etc.; if you apply makeup to the lips, the makeup parameters of the lips can be Specifically, it includes the color of the lip glaze, the highlight of the lip glaze, and the saturation of the color of the lip glaze.
  • the makeup parameters can be set according to user needs.
  • the user can input makeup parameters by selecting items or inputting commands on the display interface of the computer device, and the computer device can obtain the makeup parameters input or selected by the user.
  • the input may be given by keyboard or by voice, which this embodiment does not limit. For example, if the computer device needs to apply makeup to the eyes of a face image and the user chooses a red makeup color, the user selects the makeup parameters of that color on the computer device, and the computer device then obtains the red makeup parameters.
  • the computer device can generate a mask image containing the attributes of the makeup parameters according to the makeup parameters. For example, as in the above example, after the computer device obtains the red makeup parameters, it can correspondingly generate a red mask image for eye makeup according to the red makeup parameters.
  • the effect of a variety of beauty products can be produced by modifying the makeup parameters, thereby solving the problem of a large number of beauty products.
  • the face image makeup processing method proposed in this application greatly saves the cost of try-on makeup, and also reduces the risk of cross-infection of offline makeup try-on, and greatly improves The safety of try-on makeup.
  • a makeup processing device for a face image which includes: a first acquisition module 11, a second acquisition module 12, and a fusion module 13, wherein:
  • the first obtaining module 11 is configured to obtain an image of a part to be tried on the face image
  • the second acquiring module 12 is used to acquire a mask image of the image of the makeup part to be tried, and the mask image is used to render the image of the makeup part to be tried;
  • the fusion module 13 is used for fusing the mask image and the image of the makeup site to be tried to obtain the makeup image.
  • the aforementioned fusion module 13 includes:
  • the first determining unit 131 is configured to determine the pixel value of the first pixel on the image of the part to be tried makeup that is greater than or equal to a preset threshold;
  • the second determining unit 132 is configured to determine the pixel value of the second pixel on the image of the part to be tried on makeup that is smaller than the preset threshold;
  • the third determining unit 133 is configured to determine the makeup trial image according to the pixel value of the first pixel, the pixel value of the second pixel, and the mask image.
  • the above-mentioned third determining unit 133 is specifically configured to obtain the pixel value of a target first pixel through the first fusion method according to the pixel value of the first pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image;
  • to obtain the pixel value of a target second pixel through the second fusion method according to the pixel value of the second pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image; and to determine the try-on image according to the pixel values of the target first pixel and the target second pixel.
  • the makeup processing device for the face image further includes:
  • the first processing module 14 is configured to perform blur processing on the image of the part to be tried on makeup to obtain an image of the part to be tried on in the middle;
  • the third acquiring module 15 is configured to acquire the pixel difference between the pixel value of the pixel point on the image of the makeup part to be tried and the pixel value of the corresponding pixel point on the image of the middle part of the makeup part to be tried;
  • the second processing module 16 is configured to perform brightness reduction processing on the pixel points on the image of the part to be tried on makeup corresponding to the pixel difference when the pixel difference is greater than the preset pixel threshold to obtain a processed image of the part to be tried on makeup;
  • the aforementioned fusion module 13 is specifically used for fusing the mask image and the processed image of the makeup site to be tried.
  • the makeup processing device for the face image further includes:
  • the adjustment module 17 is configured to adjust the corresponding parameters of the mask image according to the values of the preset adjustment parameters to obtain the mask image after the adjustment parameters; the adjustment parameters include saturation and/or brightness.
  • the aforementioned fusion module 13 is specifically used for fusing the mask image after adjusting the parameters and the image of the makeup site to be tried.
  • the above-mentioned first obtaining module 11 includes:
  • the first detection unit 111 is configured to perform key point detection on the face image to obtain key points on the face image;
  • the second detection unit 112 is configured to determine, according to the key points, the dense key points representing the boundary of the makeup site to be tested;
  • the fourth determining unit 113 is configured to determine the image of the makeup site to be tried based on the dense key points.
  • the makeup processing device for the face image further includes:
  • the fusion complete image module 18 is used for fusing the makeup trial image with the face image to obtain a complete makeup trial image.
  • the above-mentioned second acquisition module 12 includes:
  • the acquiring unit 121 is configured to acquire preset makeup parameters; the makeup parameters include color parameters, gloss parameters, saturation parameters, brightness parameters, sequin parameters, and/or highlight parameters;
  • the fifth determining unit 122 is configured to determine the mask image according to the makeup parameters.
  • for the specific limitations of the makeup try-on processing device for a face image, reference may be made to the above limitations of the makeup try-on processing method for a face image, which are not repeated here.
  • the various modules in the above-mentioned face image makeup trial processing device can be implemented in whole or in part by software, hardware, and a combination thereof.
  • the above-mentioned modules may be embedded in the form of hardware or independent of the processor in the computer equipment, or may be stored in the memory of the computer equipment in the form of software, so that the processor can call and execute the operations corresponding to the above-mentioned modules.
  • a computer device including a memory and a processor, a computer program is stored in the memory, and the processor implements the following steps when the processor executes the computer program:
  • the mask image and the image of the part to be tried on are merged to obtain the try on image.
  • a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the following steps are implemented:
  • the mask image and the image of the part to be tried on are merged to obtain the try on image.
  • a computer program including computer-readable code, which when the computer-readable code runs on a computer device, causes the computer device to perform the following steps:
  • the mask image and the image of the part to be tried on are merged to obtain the try on image.
  • Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory, or optical storage.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM can be in various forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), etc.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

A makeup try-on processing method and apparatus for a face image, a computer device, and a storage medium. The method includes: obtaining an image of the makeup site to be tried on a face image (S101), then obtaining a mask image of the image of the makeup site to be tried (S102), and then fusing the mask image with the image of the makeup site to be tried to obtain a try-on image (S103). Because the try-on image obtained in this way fuses the original skin color of the image of the makeup site to be tried with attributes such as the color of the try-on product carried by the mask image, it produces a realistic try-on effect. Compared with the traditional approach of displaying the try-on effect by directly overlaying a patch, this makeup try-on processing method incorporates the skin-color information of the site to be made up during the try-on process, achieving a "different color for every face" result that better matches the real process of applying makeup and greatly improves the realism of the final displayed try-on effect.

Description

Makeup try-on processing method and apparatus for face image, computer device, and storage medium
This application claims priority to Chinese patent application No. 202010129222.9, filed with the Chinese Patent Office on February 28, 2020 and entitled "Makeup try-on processing method and apparatus for face image, computer device, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of image processing technology, and in particular to a makeup try-on processing method and apparatus for face images, a computer device, and a storage medium.
Background
With the development of diverse technologies, makeup has become a basic need for users. To meet different levels of makeup demand, beauty brands keep developing beauty products with different effects, and many electronic applications that demonstrate makeup try-on effects have appeared accordingly.
Existing electronic applications for demonstrating try-on effects usually use a virtual try-on algorithm to display the face image after makeup has been applied. Such virtual try-on algorithms usually take a texture-mapping approach: they first detect the location where makeup is needed and then directly cover that location with a preset try-on texture map to present the try-on effect.
However, current methods for demonstrating try-on effects suffer from poor display quality and low realism.
Summary
In view of this, it is necessary to provide, in response to the above technical problem, a makeup try-on processing method and apparatus for face images, a computer device, and a storage medium that can effectively improve the display quality and the realism of the applied makeup.
In a first aspect, a makeup try-on processing method for a face image includes:
obtaining an image of the makeup site to be tried on a face image;
obtaining a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
fusing the mask image with the image of the makeup site to be tried to obtain a try-on image.
In one embodiment, fusing the mask image with the image of the makeup site to be tried to obtain the try-on image includes:
determining pixel values of first pixels of the image of the makeup site to be tried that are greater than or equal to a preset threshold;
determining pixel values of second pixels of the image of the makeup site to be tried that are smaller than the preset threshold; and
determining the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image.
In one embodiment, determining the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image includes:
obtaining pixel values of target first pixels through a first fusion method according to the pixel values of the first pixels of the image of the makeup site to be tried and the corresponding pixel values of the mask image;
obtaining pixel values of target second pixels through a second fusion method according to the pixel values of the second pixels of the image of the makeup site to be tried and the corresponding pixel values of the mask image; and
determining the try-on image according to the pixel values of the target first pixels and the pixel values of the target second pixels.
In one embodiment, the first fusion method is a screen blend and the second fusion method is a multiply blend.
In one embodiment, before fusing the mask image with the image of the makeup site to be tried, the method further includes:
blurring the image of the makeup site to be tried to obtain an intermediate image of the makeup site to be tried;
obtaining pixel differences between the pixel values of pixels of the image of the makeup site to be tried and the pixel values of the corresponding pixels of the intermediate image; and
if a pixel difference is greater than a preset pixel threshold, attenuating the brightness of the corresponding pixel of the image of the makeup site to be tried to obtain a processed image of the makeup site to be tried;
and fusing the mask image with the image of the makeup site to be tried includes:
fusing the mask image with the processed image of the makeup site to be tried.
In one embodiment, before fusing the mask image with the image of the makeup site to be tried, the method further includes:
adjusting the corresponding parameters of the mask image according to preset adjustment parameter values to obtain a parameter-adjusted mask image, the adjustment parameters including saturation and/or brightness;
and fusing the mask image with the image of the makeup site to be tried includes:
fusing the parameter-adjusted mask image with the image of the makeup site to be tried.
In one embodiment, obtaining the image of the makeup site to be tried on the face image includes:
performing key-point detection on the face image to obtain key points of the face image;
determining, according to the key points, dense key points that represent the boundary of the makeup site to be tried; and
determining the image of the makeup site to be tried according to the dense key points.
In one embodiment, after fusing the mask image with the image of the makeup site to be tried to obtain the try-on image, the method further includes:
fusing the try-on image with the face image to obtain a complete try-on image.
In one embodiment, obtaining the mask image of the image of the makeup site to be tried includes:
obtaining preset makeup parameters, the makeup parameters including a color parameter, a gloss parameter, a saturation parameter, a brightness parameter, a sequin parameter, and/or a highlight parameter; and
determining the mask image according to the makeup parameters.
In a second aspect, a makeup try-on processing apparatus for a face image includes:
a first obtaining module configured to obtain an image of the makeup site to be tried on a face image;
a second obtaining module configured to obtain a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
a fusion module configured to fuse the mask image with the image of the makeup site to be tried to obtain a try-on image.
In a third aspect, a computer device includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the method of the first aspect when executing the computer program.
In a fourth aspect, a computer-readable storage medium has a computer program stored thereon, and the computer program implements the steps of the method of the first aspect when executed by a processor.
The above makeup try-on processing method, apparatus, computer device, and storage medium obtain an image of the makeup site to be tried on a face image, obtain a mask image of that image, and then fuse the mask image with the image of the makeup site to be tried to obtain a try-on image. Because the try-on image obtained in this way fuses the original skin color of the image of the makeup site to be tried with attributes such as the color of the try-on product carried by the mask image, it produces a realistic try-on effect. Compared with the traditional approach of displaying the try-on effect by directly overlaying a patch, the method provided by this application incorporates the skin-color information of the site to be made up during the try-on process, achieving a "different color for every face" result that better matches the real process of applying makeup and greatly improves the realism of the final displayed try-on effect.
The above is only an overview of the technical solution of the present invention. In order to understand the technical means of the present invention more clearly so that it can be implemented according to the content of the specification, and to make the above and other objects, features, and advantages of the present invention more apparent, specific embodiments of the present invention are set forth below.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is an internal structure diagram of a computer device in an embodiment;
Fig. 2 is a schematic flowchart of a makeup try-on processing method for a face image in an embodiment;
Fig. 3 is a schematic flowchart of step S103 in an embodiment;
Fig. 4 is a schematic flowchart of step S203 in an embodiment;
Fig. 5 is a schematic flowchart of a makeup try-on processing method for a face image in an embodiment;
Fig. 6 is a schematic flowchart of step S101 in an embodiment;
Fig. 7 is a schematic flowchart of step S502 in an embodiment;
Fig. 8 is a schematic flowchart of step S102 in an embodiment;
Fig. 9 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 10 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 11 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 12 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 13 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 14 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment;
Fig. 15 is a structural block diagram of a makeup try-on processing apparatus for a face image in an embodiment.
Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
In one embodiment, a computer device is provided. The computer device may be a terminal or a server, and its internal structure may be as shown in Fig. 1. The computer device includes a processor, a memory, a communication interface, a display screen, and an input apparatus connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with external terminals; the wireless communication may be implemented through WIFI, a carrier network, NFC (near-field communication), or other technologies. When executed by the processor, the computer program implements a makeup try-on processing method for a face image. The display screen of the computer device may be a liquid-crystal display or an electronic-ink display, and the input apparatus may be a touch layer covering the display screen, a button, trackball, or touchpad on the housing of the computer device, or an external keyboard, touchpad, or mouse.
A person skilled in the art will understand that the structure shown in Fig. 1 is only a block diagram of part of the structure related to the solution of this application and does not limit the computer device to which the solution of this application is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
In one embodiment, as shown in Fig. 2, a makeup try-on processing method for a face image is provided. The method is described by taking its application to the computer device of Fig. 1 as an example; this embodiment concerns the specific way in which the computer device processes the image of the makeup site to be tried. The method includes the following steps:
S101: obtain an image of the makeup site to be tried on a face image.
The image of the makeup site to be tried is a photo or video image obtained by photographing a face with a device that has a photo or video capture function, such as a mobile phone, camera, or video camera. The site to be made up may be any part of the face, for example the eyes, lips, or cheeks.
Specifically, when the computer device needs to demonstrate a makeup effect, it may capture a face image through a connected or built-in camera and obtain the image of the makeup site to be tried from that face image. Optionally, the computer device may directly photograph the facial site to be made up with the camera to obtain the image of that site; it may also import a face image captured by the user's mobile phone or another recording device and obtain the image of the makeup site to be tried from it. Optionally, the computer device may also obtain the image of the makeup site to be tried in other ways, for example by downloading it directly from the Internet. This embodiment is not limited in this respect.
S102: obtain a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried.
A mask image, which may also be called an overlay image, is typically used when rendering an image. In this embodiment, once the computer device has obtained the image of the makeup site to be tried through the above step, it can generate the corresponding mask image by setting the makeup parameters. The makeup parameters may be the color, gloss, brightness, sequins, highlights, and so on of various types of beauty products, and may be chosen by the user according to actual needs and preferences. For example, when the user wants to apply makeup to the lips, the user may enter a favorite brand of beauty product, or a favorite makeup color, on the interface of the computer device; the computer device then generates the mask image of the lip image according to the attributes of the selected product or color, so that it can later use the mask image to present the makeup effect on the lips.
S103: fuse the mask image with the image of the makeup site to be tried to obtain a try-on image.
When the computer device has obtained the mask image of the image of the makeup site to be tried, it can further fuse the mask image with the image of the makeup site to be tried so as to render the site of the face image to be made up and obtain a try-on image that shows the makeup effect. This fusion combines attributes such as the color of the try-on product in the mask image with the original skin color of the image of the makeup site to be tried, so that the resulting try-on image is the fusion of the product's attributes and the skin color of the site being made up.
The above makeup try-on processing method for a face image includes obtaining the image of the makeup site to be tried on the face image, obtaining the mask image of that image, and then fusing the mask image with the image of the makeup site to be tried to obtain a try-on image. Because the try-on image obtained in this way fuses the original skin color of the image of the makeup site to be tried with attributes such as the color of the try-on product carried by the mask image, it produces a realistic try-on effect. Compared with the traditional approach of displaying the try-on effect by directly overlaying a patch, the method provided by this application incorporates the skin-color information of the site being made up during the try-on process, achieving a "different color for every face" result that better matches the real process of applying makeup and greatly improves the realism of the final displayed try-on effect.
In one embodiment, as shown in Fig. 3, the above step S103, "fuse the mask image with the image of the makeup site to be tried to obtain the try-on image", includes:
S201: determine the pixel values of first pixels of the image of the makeup site to be tried that are greater than or equal to a preset threshold.
The preset threshold may take any value from 0 to 255, and the specific value may be determined according to actual application requirements; for example, the preset threshold in this embodiment may be 128. The first pixels represent the pixels in the high-brightness area of the image of the makeup site to be tried. In this embodiment, when the computer device obtains the image of the makeup site to be tried, it can extract the first pixels whose values are greater than or equal to the preset threshold, i.e. the pixels in the high-brightness area, so that these first pixels can later be fused with the corresponding pixels of the mask image.
S202: determine the pixel values of second pixels of the image of the makeup site to be tried that are smaller than the preset threshold.
The second pixels represent the pixels in the low-brightness area of the image of the makeup site to be tried. In this embodiment, when the computer device obtains the image of the makeup site to be tried, it can extract the second pixels whose values are smaller than the preset threshold, i.e. the pixels in the low-brightness area, so that these second pixels can later be fused with the corresponding pixels of the mask image.
S203: determine the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image.
When the computer device has obtained the first and second pixels, it can fuse the pixel value of each first pixel with the pixel value of the corresponding pixel of the mask image, and the pixel value of each second pixel with the pixel value of the corresponding pixel of the mask image, to obtain the try-on image.
In one embodiment, as shown in Fig. 4, the above step S203, "determine the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image", includes:
S301: obtain the pixel value of the target first pixel through the first fusion method according to the pixel value of the first pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image.
Specifically, the computer device can split the color of the image of the makeup site to be tried into the three channels R, G, and B and change each channel separately. Specifically, the areas of the image whose color is below a preset threshold color can be adjusted with the first fusion method; the value of the preset threshold color may, for example, be 128.
For example, the first fusion method may be a screen (color filter) blend, which can be expressed by the following relation (1), i.e. the standard screen blend for 8-bit channels, giving the pixel value of the target first pixel:
C = 255 − (255 − A) × (255 − B) / 255    (1)
where C is the pixel value of the target first pixel and also represents its three channel values R, G, B; A is the pixel value of the first pixel of the image of the makeup site to be tried and also represents its three channel values R, G, B; and B in relation (1) is the pixel value of the mask image corresponding to the first pixel and also represents the three channel values R, G, B of that mask pixel.
In this embodiment, when the computer device has determined the pixel value of the first pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image, it can substitute them into variables A and B of relation (1) to obtain the pixel value C of the target first pixel.
S302: obtain the pixel value of the target second pixel through the second fusion method according to the pixel value of the second pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image.
Specifically, the areas of the image of the makeup site to be tried whose color is above the preset threshold color can be adjusted with the second fusion method.
Optionally, the second fusion method may be a multiply blend, which can be expressed by the following relation (2), i.e. the standard multiply blend for 8-bit channels, giving the pixel value of the target second pixel:
C = A × B / 255    (2)
where C is the pixel value of the target second pixel and also represents its three channel values R, G, B; A is the pixel value of the second pixel of the image of the makeup site to be tried and also represents its three channel values R, G, B; and B in relation (2) is the pixel value of the mask image corresponding to the second pixel and also represents the three channel values R, G, B of that mask pixel.
In this embodiment, when the computer device has determined the pixel value of the second pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image, it can substitute them into variables A and B of relation (2) to obtain the pixel value C of the target second pixel.
S303: determine the try-on image according to the pixel value of the target first pixel and the pixel value of the target second pixel.
Once the computer device has obtained the pixel values of the target first pixels and the target second pixels through S301 and S302, it can determine the pixel values of all pixels of the try-on image from them, i.e. obtain the try-on image.
Optionally, the computer device can also perform the above fusion pixel by pixel: it checks the magnitude of each pixel value of the image to be made up, selects relation (1) or (2) accordingly, and substitutes that pixel value together with the corresponding mask pixel value into the selected relation; computing this for every pixel yields the pixel values of the try-on image, which is equivalent to obtaining the try-on image. When implementing this process, a GPU can be used for acceleration to increase the rate at which the try-on image is obtained and achieve real-time processing.
It should be noted that, compared with the original image of the makeup site to be tried, the color change in the made-up area of the try-on image obtained by the method of the Fig. 4 embodiment occurs mainly in the high-brightness areas, while the color of the dark areas remains essentially unchanged. The shadow and highlight characteristics of the site being made up are therefore preserved, which enhances the three-dimensionality and saturation of the site and makes the makeup effect on the try-on image more realistic and closer to an actual makeup effect.
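As a concrete illustration of the per-pixel fusion described above, the following NumPy sketch applies the standard screen and multiply blends under the threshold split of steps S201-S203; it is only an assumption-level sketch (the function and variable names are illustrative, not from the patent), and the threshold of 128 is the example value given in the text.

```python
import numpy as np

def screen_blend(a, b):
    """Relation (1): standard screen blend on 8-bit channels."""
    a = a.astype(np.float32)
    b = b.astype(np.float32)
    return 255.0 - (255.0 - a) * (255.0 - b) / 255.0

def multiply_blend(a, b):
    """Relation (2): standard multiply blend on 8-bit channels."""
    return a.astype(np.float32) * b.astype(np.float32) / 255.0

def fuse(region_bgr, mask_bgr, threshold=128):
    """Fuse the mask image with the region image pixel by pixel.

    Following steps S201-S203, channel values of the region image that are
    >= threshold (the "first pixels") go through the first fusion method and
    the remaining ("second") values through the second one.  The paragraph
    around relations (1)/(2) pairs the blends with the regions the other way
    round, so the comparison below is easy to flip if that reading is preferred.
    """
    first = region_bgr >= threshold                    # high-brightness values
    out = np.where(first,
                   screen_blend(region_bgr, mask_bgr),
                   multiply_blend(region_bgr, mask_bgr))
    return np.clip(out, 0, 255).astype(np.uint8)
```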
In practical applications there is also a scenario in which the user selects a matte beauty product, for example a matte lipstick; the applied makeup should then show no highlight. Therefore, before the mask image and the image of the makeup site to be tried are fused, the image of the makeup site to be tried also needs highlight-attenuation processing. This process, as shown in Fig. 5, includes:
S401: blur the image of the makeup site to be tried to obtain an intermediate image of the makeup site to be tried.
Specifically, when the computer device has obtained the image of the makeup site to be tried, it can further blur it, for example with Gaussian blur, to smooth the pixel values of the image, thereby removing pixels with high values and pixels with low values so that no highlight pixels remain on the image, which is equivalent to a de-highlighting effect. In this embodiment, after the computer device blurs the image of the makeup site to be tried, the processed image, i.e. the intermediate image of the makeup site to be tried, is obtained.
S402: obtain the pixel difference between the pixel value of each pixel of the image of the makeup site to be tried and the pixel value of the corresponding pixel of the intermediate image.
When the computer device has obtained the intermediate image through step S401, it can compute the difference between the pixel values of the image of the makeup site to be tried and those of the intermediate image, obtaining the pixel difference between each pixel of the image and the corresponding pixel of the intermediate image. In practice, among the pixels of the image of the makeup site to be tried whose values exceed those of the intermediate image there may be pixels at highlight positions; once the highlight positions have been identified among them, the computer device can attenuate the highlight at those pixels and thus remove the highlights from the image of the makeup site to be tried.
S403: if the pixel difference is greater than a preset pixel threshold, attenuate the brightness of the corresponding pixel of the image of the makeup site to be tried to obtain a processed image of the makeup site to be tried.
When the computer device has obtained the pixel differences through step S402, it can compare each difference with the preset pixel threshold and apply brightness attenuation to the pixels whose difference exceeds the threshold, thereby lowering the values of those pixels, i.e. weakening their brightness. The image processed in this way is the image of the makeup site to be tried with its highlight brightness attenuated.
Correspondingly, the specific implementation of the above S103, "fuse the mask image with the image of the makeup site to be tried", includes: fusing the mask image with the processed image of the makeup site to be tried.
When the computer device has obtained the processed image of the makeup site to be tried, it can fuse it with the mask image according to the method described in the preceding embodiments to obtain the try-on image. Because the processed image has been blurred, i.e. its highlights have been attenuated, the try-on image can achieve a matte makeup effect.
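A minimal sketch of the highlight-attenuation pre-processing of steps S401-S403 is shown below, assuming Gaussian blur as the blur operation (the example given in the text) and a simple scaling factor as the "brightness attenuation"; the kernel size, pixel threshold, and attenuation factor are illustrative choices, not values taken from the patent.

```python
import cv2
import numpy as np

def attenuate_highlights(region_bgr, pixel_threshold=20, factor=0.7, ksize=15):
    """S401-S403: blur, compare with the original, and dim the highlight pixels."""
    blurred = cv2.GaussianBlur(region_bgr, (ksize, ksize), 0)       # S401: intermediate image
    diff = region_bgr.astype(np.int16) - blurred.astype(np.int16)   # S402: per-pixel difference
    highlight = (diff > pixel_threshold).any(axis=2)                # noticeably brighter than surroundings
    out = region_bgr.astype(np.float32)
    out[highlight] *= factor                                        # S403: brightness attenuation
    return np.clip(out, 0, 255).astype(np.uint8)
```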
In another application scenario, if the color of the beauty product selected by the user differs greatly from the color of the site to be made up (for example, the product color is green and the site color is black), the displayed effect will certainly differ from the real applied effect. Therefore, before the mask image and the image of the makeup site to be tried are fused, the mask image also needs some processing so that, after fusion with the image of the makeup site to be tried, the try-on effect on the fused image better matches the real makeup effect. This processing step is: adjust the corresponding parameters of the mask image according to preset adjustment parameter values to obtain a parameter-adjusted mask image; the adjustment parameters include saturation and/or brightness.
In this embodiment, when the computer device obtains the mask image, it can further adjust the saturation, brightness, and similar parameters of the mask image according to the preset adjustment parameter values, so that the difference between these attributes of the parameter-adjusted mask image and the attributes of the image of the makeup site to be tried is reduced; when the adjusted mask image is later used for fusion, the makeup effect on the fused image is more realistic.
Correspondingly, the specific implementation of the above S103, "fuse the mask image with the image of the makeup site to be tried", includes: fusing the parameter-adjusted mask image with the image of the makeup site to be tried.
When the computer device has obtained the parameter-adjusted mask image, it can fuse it with the image of the makeup site to be tried according to the method described in the preceding embodiments to obtain the try-on image. Because the mask image has been adjusted according to actual needs, the try-on image avoids the problem of an unrealistic makeup effect caused by a large difference between the mask image and the image of the makeup site to be tried.
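One common way to implement such a saturation/brightness adjustment is an HSV conversion; the sketch below is only an assumption about how the preset adjustment parameters might be applied (here as multiplicative gains), not the patent's prescribed implementation.

```python
import cv2
import numpy as np

def adjust_mask(mask_bgr, saturation_gain=0.8, brightness_gain=0.9):
    """Scale the S and V channels of the mask image by preset adjustment parameters."""
    hsv = cv2.cvtColor(mask_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * saturation_gain, 0, 255)   # saturation
    hsv[..., 2] = np.clip(hsv[..., 2] * brightness_gain, 0, 255)   # brightness
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```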
In one embodiment, as shown in Fig. 6, the specific implementation of the above step S101, "obtain the image of the makeup site to be tried on the face image", includes:
S501: obtain a face image through an image acquisition device.
The image acquisition device may be a video camera, a still camera, a mobile phone, or another device with a photo or video capture function, and may acquire a face image or a video containing face images. In this embodiment, the computer device can obtain either a face image or a video through the image acquisition device; when a video is obtained, it can be split into frames containing faces according to the frame rate. It should be noted that when the computer device receives a video stream, it can apply the method described in the preceding embodiments to every frame in real time and obtain a video with the makeup applied.
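Splitting an incoming video into frames and applying the method frame by frame can look like the following OpenCV loop; `process_frame` stands in for the whole S101-S103 pipeline and is a placeholder name, not something defined by the patent.

```python
import cv2

def beautify_video(path, process_frame):
    """Read a video, run the try-on pipeline on every frame, and collect the results."""
    cap = cv2.VideoCapture(path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break                               # end of stream
        frames.append(process_frame(frame))     # per-frame try-on processing
    cap.release()
    return frames
```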
S502: determine the image of the makeup site to be tried from the face image.
When the computer device has obtained the face image, it can further determine the image of the makeup site to be tried on it. Specifically, when the computer device obtains the face image, it can detect the site to be made up on the face image with a corresponding detection method and obtain the image of that site. Optionally, when the computer device obtains the face image, it can also determine the site to be made up according to an instruction entered by the user and thereby obtain the image of the makeup site to be tried.
In one embodiment, as shown in Fig. 7, the specific implementation of the above step S502, "determine the image of the makeup site to be tried from the face image", or of the above step S101, "obtain the image of the makeup site to be tried on the face image", includes:
S601: perform key-point detection on the face image to obtain key points of the face image.
The key points may represent the key points of the facial features on the face image, or the key points of other parts of the face. In this embodiment, the computer device may use a deep-learning algorithm or another detection method to detect the key points of the face image and obtain them.
S602: determine, according to the key points, dense key points that represent the boundary of the makeup site to be tried.
When the computer device has obtained the key points of the face image, it can further detect them with a dense key-point detection method or another detection method to obtain dense key points, through which the true boundary or position of the site to be made up can be obtained accurately.
S603: determine the image of the makeup site to be tried according to the dense key points.
Because the dense key points give the true boundary or position of the site to be made up, the image of the makeup site to be tried can be determined from them. For example, if the actual boundary of the eye region is obtained accurately through the dense key points, the image of the makeup site to be tried can be determined from the face image by the position of that boundary.
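Given dense key points tracing the boundary of the site to be made up (for example the lip contour), the site image can be cut out with a polygon mask and a bounding box, as in the sketch below. The landmark detector itself is outside the scope of this sketch, so the boundary points are assumed to be supplied by some face-landmark model.

```python
import cv2
import numpy as np

def crop_region(face_bgr, boundary_points):
    """Cut out the region enclosed by the dense boundary key points (S601-S603).

    boundary_points: (N, 2) array of x, y coordinates along the region boundary.
    Returns the cropped region image, its binary mask, and the bounding box.
    """
    pts = np.asarray(boundary_points, dtype=np.int32)
    mask = np.zeros(face_bgr.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [pts], 255)              # region interior = 255
    x, y, w, h = cv2.boundingRect(pts)          # tight box around the boundary
    region = face_bgr[y:y + h, x:x + w].copy()
    return region, mask[y:y + h, x:x + w], (x, y, w, h)
```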
In practical applications, after the above step S103, "fuse the mask image with the image of the makeup site to be tried to obtain the try-on image", the makeup try-on processing method further includes the step of fusing the try-on image with the face image to obtain a complete try-on image.
When the computer device has applied makeup to the face image according to the method described in the preceding embodiments and obtained the try-on image, it can also fuse the try-on image, which shows the try-on effect, with the original complete face image to obtain a complete try-on image, so that the user can experience the try-on effect more comprehensively from it.
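Fusing the try-on image back into the original face image then amounts to writing the processed pixels back inside the region mask. The sketch below continues the illustrative names from the previous sketch and is an assumption about one simple way to do this composition.

```python
import numpy as np

def compose_full_image(face_bgr, try_on_region, region_mask, bbox):
    """Paste the try-on result back into the face image to get the complete try-on image."""
    x, y, w, h = bbox
    out = face_bgr.copy()
    patch = out[y:y + h, x:x + w]
    inside = region_mask > 0                    # only overwrite pixels inside the makeup region
    patch[inside] = try_on_region[inside]
    return out
```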
In one embodiment, as shown in Fig. 8, the specific implementation of the above step S102, "obtain the mask image of the image of the makeup site to be tried", includes:
S701: obtain preset makeup parameters; the makeup parameters include a color parameter, a gloss parameter, a saturation parameter, a brightness parameter, a sequin parameter, and/or a highlight parameter.
The makeup parameters may include parameters reflecting the attributes of various beauty products, specifically the color, gloss, saturation, brightness, sequin, and/or highlight parameters of the products. For example, when applying makeup to the eyes, the eye makeup parameters may include the color of the eye shadow, its sequins, its gradient effect, and so on; when applying makeup to the lips, the lip makeup parameters may include the color of the lip glaze, its highlight, and the saturation of its color. The makeup parameters can be set according to user needs. In this embodiment, the user can enter the makeup parameters on the display interface of the computer device by selecting items or entering commands, and the computer device then obtains the parameters the user entered or selected. The input may be given by keyboard or by voice, which this embodiment does not limit. For example, if the computer device needs to apply makeup to the eyes of a face image and the user chooses a red makeup color, the user selects the makeup parameters of that color on the computer device, and the computer device then obtains the red makeup parameters.
S702: determine the mask image according to the makeup parameters.
After the above step is completed, the computer device can generate a mask image containing the attributes of the makeup parameters. For example, continuing the example above, after the computer device has obtained the red makeup parameters it can generate a red mask image for eye makeup accordingly.
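How the mask image is built from the makeup parameters is not spelled out in detail, so the sketch below simply fills the region with the selected product color and optionally sprinkles brighter "sequin" pixels; the parameter names and the sequin treatment are illustrative assumptions rather than the patent's prescribed implementation.

```python
import numpy as np

def build_mask(region_shape, color_bgr, sequin_density=0.0, sequin_boost=60, seed=0):
    """Generate a mask image for the region from preset makeup parameters (S701-S702)."""
    h, w = region_shape[:2]
    mask = np.full((h, w, 3), color_bgr, dtype=np.uint8)       # base color of the beauty product
    if sequin_density > 0:
        rng = np.random.default_rng(seed)
        sparkle = rng.random((h, w)) < sequin_density           # random sparkle positions
        mask[sparkle] = np.clip(mask[sparkle].astype(np.int16) + sequin_boost, 0, 255)
    return mask
```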
With the method provided by the above embodiments, the effects of many different beauty products can be produced simply by modifying the makeup parameters, which addresses the problem posed by the sheer number of beauty products. Compared with traditional offline try-on, the makeup try-on processing method for face images proposed in this application greatly reduces the cost of trying on makeup, reduces risks such as the cross-infection associated with offline try-on, and greatly improves the safety of trying on makeup.
It should be understood that although the steps in the flowcharts of Figs. 2-8 are shown in the order indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in Figs. 2-8 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in Fig. 9, a makeup try-on processing apparatus for a face image is provided, including a first obtaining module 11, a second obtaining module 12, and a fusion module 13, where:
the first obtaining module 11 is configured to obtain an image of the makeup site to be tried on a face image;
the second obtaining module 12 is configured to obtain a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
the fusion module 13 is configured to fuse the mask image with the image of the makeup site to be tried to obtain a try-on image.
In one embodiment, as shown in Fig. 10, the fusion module 13 includes:
a first determining unit 131 configured to determine the pixel values of first pixels of the image of the makeup site to be tried that are greater than or equal to a preset threshold;
a second determining unit 132 configured to determine the pixel values of second pixels of the image of the makeup site to be tried that are smaller than the preset threshold; and
a third determining unit 133 configured to determine the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image.
In one embodiment, the third determining unit 133 is specifically configured to obtain the pixel value of a target first pixel through the first fusion method according to the pixel value of the first pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image; to obtain the pixel value of a target second pixel through the second fusion method according to the pixel value of the second pixel of the image of the makeup site to be tried and the corresponding pixel value of the mask image; and to determine the try-on image according to the pixel values of the target first pixel and the target second pixel.
In one embodiment, as shown in Fig. 11, the makeup try-on processing apparatus further includes, ahead of the fusion module 13:
a first processing module 14 configured to blur the image of the makeup site to be tried to obtain an intermediate image of the makeup site to be tried;
a third obtaining module 15 configured to obtain the pixel differences between the pixel values of pixels of the image of the makeup site to be tried and the pixel values of the corresponding pixels of the intermediate image; and
a second processing module 16 configured to, when a pixel difference is greater than a preset pixel threshold, attenuate the brightness of the corresponding pixel of the image of the makeup site to be tried to obtain a processed image of the makeup site to be tried.
Correspondingly, the fusion module 13 is specifically configured to fuse the mask image with the processed image of the makeup site to be tried.
In one embodiment, as shown in Fig. 12, the makeup try-on processing apparatus further includes, ahead of the fusion module 13:
an adjustment module 17 configured to adjust the corresponding parameters of the mask image according to preset adjustment parameter values to obtain a parameter-adjusted mask image, the adjustment parameters including saturation and/or brightness.
Correspondingly, the fusion module 13 is specifically configured to fuse the parameter-adjusted mask image with the image of the makeup site to be tried.
In one embodiment, as shown in Fig. 13, the first obtaining module 11 includes:
a first detection unit 111 configured to perform key-point detection on the face image to obtain key points of the face image;
a second detection unit 112 configured to determine, according to the key points, dense key points representing the boundary of the makeup site to be tried; and
a fourth determining unit 113 configured to determine the image of the makeup site to be tried according to the dense key points.
In one embodiment, as shown in Fig. 14, the makeup try-on processing apparatus further includes, after the fusion module 13:
a complete-image fusion module 18 configured to fuse the try-on image with the face image to obtain a complete try-on image.
In one embodiment, as shown in Fig. 15, the second obtaining module 12 includes:
an obtaining unit 121 configured to obtain preset makeup parameters, the makeup parameters including a color parameter, a gloss parameter, a saturation parameter, a brightness parameter, a sequin parameter, and/or a highlight parameter; and
a fifth determining unit 122 configured to determine the mask image according to the makeup parameters.
For the specific limitations of the makeup try-on processing apparatus for a face image, reference may be made to the above limitations of the makeup try-on processing method for a face image, which are not repeated here. Each module of the above apparatus may be implemented wholly or partly by software, hardware, or a combination of the two. The modules may be embedded in, or independent of, the processor of the computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can invoke them and perform the operations corresponding to each module.
In one embodiment, a computer device is provided, including a memory and a processor. The memory stores a computer program, and the processor implements the following steps when executing the computer program:
obtaining an image of the makeup site to be tried on a face image;
obtaining a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
fusing the mask image with the image of the makeup site to be tried to obtain a try-on image.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when the computer program is executed by a processor, the following steps are implemented:
obtaining an image of the makeup site to be tried on a face image;
obtaining a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
fusing the mask image with the image of the makeup site to be tried to obtain a try-on image.
In one embodiment, a computer program is provided, including computer-readable code which, when run on a computer device, causes the computer device to perform the following steps:
obtaining an image of the makeup site to be tried on a face image;
obtaining a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
fusing the mask image with the image of the makeup site to be tried to obtain a try-on image.
A person of ordinary skill in the art will understand that all or part of the procedures of the methods of the above embodiments can be completed by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium, and when executed it may include the procedures of the embodiments of the above methods. Any reference to memory, storage, a database, or another medium used in the embodiments provided by this application may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, or optical storage. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM may take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments are described; however, as long as the combinations of these technical features are not contradictory, they should be considered to fall within the scope of this specification.
The above embodiments only express several implementations of this application and are described in relative detail, but they are not to be construed as limiting the scope of the invention patent. It should be noted that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of this application, and these all fall within the protection scope of this application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (13)

  1. A makeup try-on processing method for a face image, the method comprising:
    obtaining an image of a makeup site to be tried on a face image;
    obtaining a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
    fusing the mask image with the image of the makeup site to be tried to obtain a try-on image.
  2. The method according to claim 1, wherein fusing the mask image with the image of the makeup site to be tried to obtain the try-on image comprises:
    determining pixel values of first pixels of the image of the makeup site to be tried that are greater than or equal to a preset threshold;
    determining pixel values of second pixels of the image of the makeup site to be tried that are smaller than the preset threshold; and
    determining the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image.
  3. The method according to claim 2, wherein determining the try-on image according to the pixel values of the first pixels, the pixel values of the second pixels, and the mask image comprises:
    obtaining pixel values of target first pixels through a first fusion method according to the pixel values of the first pixels of the image of the makeup site to be tried and the corresponding pixel values of the mask image;
    obtaining pixel values of target second pixels through a second fusion method according to the pixel values of the second pixels of the image of the makeup site to be tried and the corresponding pixel values of the mask image; and
    determining the try-on image according to the pixel values of the target first pixels and the pixel values of the target second pixels.
  4. The method according to claim 3, wherein the first fusion method is a screen blend and the second fusion method is a multiply blend.
  5. The method according to any one of claims 1-4, wherein before fusing the mask image with the image of the makeup site to be tried, the method further comprises:
    blurring the image of the makeup site to be tried to obtain an intermediate image of the makeup site to be tried;
    obtaining pixel differences between the pixel values of pixels of the image of the makeup site to be tried and the pixel values of the corresponding pixels of the intermediate image; and
    if a pixel difference is greater than a preset pixel threshold, attenuating the brightness of the corresponding pixel of the image of the makeup site to be tried to obtain a processed image of the makeup site to be tried;
    and wherein fusing the mask image with the image of the makeup site to be tried comprises:
    fusing the mask image with the processed image of the makeup site to be tried.
  6. The method according to any one of claims 1-4, wherein before fusing the mask image with the image of the makeup site to be tried, the method further comprises:
    adjusting the corresponding parameters of the mask image according to preset adjustment parameter values to obtain a parameter-adjusted mask image, the adjustment parameters including saturation and/or brightness;
    and wherein fusing the mask image with the image of the makeup site to be tried comprises:
    fusing the parameter-adjusted mask image with the image of the makeup site to be tried.
  7. The method according to claim 6, wherein obtaining the image of the makeup site to be tried on the face image comprises:
    performing key-point detection on the face image to obtain key points of the face image;
    determining, according to the key points, dense key points that represent a boundary of the makeup site to be tried; and
    determining the image of the makeup site to be tried according to the dense key points.
  8. The method according to claim 1, wherein after fusing the mask image with the image of the makeup site to be tried to obtain the try-on image, the method further comprises:
    fusing the try-on image with the face image to obtain a complete try-on image.
  9. The method according to claim 1, wherein obtaining the mask image of the image of the makeup site to be tried comprises:
    obtaining preset makeup parameters, the makeup parameters including a color parameter, a gloss parameter, a saturation parameter, a brightness parameter, a sequin parameter, and/or a highlight parameter; and
    determining the mask image according to the makeup parameters.
  10. A makeup try-on processing apparatus for a face image, the apparatus comprising:
    a first obtaining module configured to obtain an image of a makeup site to be tried on a face image;
    a second obtaining module configured to obtain a mask image of the image of the makeup site to be tried, the mask image being used to render the image of the makeup site to be tried; and
    a fusion module configured to fuse the mask image with the image of the makeup site to be tried to obtain a try-on image.
  11. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method according to any one of claims 1 to 9 when executing the computer program.
  12. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 9.
  13. A computer program comprising computer-readable code which, when run on a computing and processing device, causes the computing and processing device to perform the steps of the method according to any one of claims 1 to 9.
PCT/CN2020/119543 2020-02-28 2020-09-30 人脸图像的试妆处理方法、装置、计算机设备和存储介质 WO2021169307A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010129222.9A CN111369644A (zh) 2020-02-28 2020-02-28 人脸图像的试妆处理方法、装置、计算机设备和存储介质
CN202010129222.9 2020-02-28

Publications (1)

Publication Number Publication Date
WO2021169307A1 true WO2021169307A1 (zh) 2021-09-02

Family

ID=71208234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/119543 WO2021169307A1 (zh) 2020-02-28 2020-09-30 人脸图像的试妆处理方法、装置、计算机设备和存储介质

Country Status (2)

Country Link
CN (1) CN111369644A (zh)
WO (1) WO2021169307A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113689363A (zh) * 2021-09-07 2021-11-23 北京顺势兄弟科技有限公司 一种人像图处理方法、装置、电子设备、存储介质
CN113781309A (zh) * 2021-09-17 2021-12-10 北京金山云网络技术有限公司 图像处理方法、装置及电子设备
CN116433827A (zh) * 2023-04-07 2023-07-14 广州趣研网络科技有限公司 虚拟人脸形象的生成方法、展示方法、生成及展示装置

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369644A (zh) * 2020-02-28 2020-07-03 北京旷视科技有限公司 人脸图像的试妆处理方法、装置、计算机设备和存储介质
CN113761994B (zh) * 2020-08-07 2024-05-21 北京沃东天骏信息技术有限公司 处理图像的方法、装置、设备和计算机可读介质
CN112381709B (zh) * 2020-11-13 2022-06-21 北京字节跳动网络技术有限公司 图像处理方法、模型训练方法、装置、设备和介质
CN112712479A (zh) * 2020-12-24 2021-04-27 厦门美图之家科技有限公司 妆容处理方法、系统、移动终端及存储介质
CN112686820A (zh) * 2020-12-29 2021-04-20 北京旷视科技有限公司 虚拟美妆方法、装置和电子设备
CN112767285B (zh) * 2021-02-23 2023-03-10 北京市商汤科技开发有限公司 图像处理方法及装置、电子设备和存储介质
CN112766234B (zh) * 2021-02-23 2023-05-12 北京市商汤科技开发有限公司 图像处理方法及装置、电子设备和存储介质
CN113344836B (zh) * 2021-06-28 2023-04-14 展讯通信(上海)有限公司 人脸图像处理方法及装置、计算机可读存储介质、终端
CN113344837B (zh) * 2021-06-28 2023-04-18 展讯通信(上海)有限公司 人脸图像处理方法及装置、计算机可读存储介质、终端
CN113837020B (zh) * 2021-08-31 2024-02-02 北京新氧科技有限公司 一种化妆进度检测方法、装置、设备及存储介质
CN113762212B (zh) * 2021-09-27 2024-06-11 北京市商汤科技开发有限公司 图像处理方法及装置、电子设备和存储介质
CN114119154A (zh) * 2021-11-25 2022-03-01 北京百度网讯科技有限公司 虚拟化妆的方法及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408974A (zh) * 2008-11-18 2009-04-15 深圳市迅雷网络技术有限公司 一种图像处理方法和装置
CN108596828A (zh) * 2018-04-18 2018-09-28 网易(杭州)网络有限公司 图像泛光处理方法与装置、电子设备、存储介质
WO2019062608A1 (zh) * 2017-09-30 2019-04-04 深圳市商汤科技有限公司 图像处理方法和装置、电子设备、计算机存储介质
CN109712090A (zh) * 2018-12-18 2019-05-03 维沃移动通信有限公司 一种图像处理方法、装置和移动终端
CN110390632A (zh) * 2019-07-22 2019-10-29 北京七鑫易维信息技术有限公司 基于妆容模板的图像处理方法、装置、存储介质及终端
CN111369644A (zh) * 2020-02-28 2020-07-03 北京旷视科技有限公司 人脸图像的试妆处理方法、装置、计算机设备和存储介质

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105023252A (zh) * 2015-07-14 2015-11-04 厦门美图网科技有限公司 一种美容图像的增强处理方法、系统及拍摄终端
CN108564526A (zh) * 2018-03-30 2018-09-21 北京金山安全软件有限公司 一种图像处理方法、装置、电子设备及介质
CN108898546B (zh) * 2018-06-15 2022-08-16 北京小米移动软件有限公司 人脸图像处理方法、装置及设备、可读存储介质
CN109063560B (zh) * 2018-06-28 2022-04-05 北京微播视界科技有限公司 图像处理方法、装置、计算机可读存储介质和终端
CN110728618B (zh) * 2018-07-17 2023-06-27 淘宝(中国)软件有限公司 虚拟试妆的方法、装置、设备及图像处理方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408974A (zh) * 2008-11-18 2009-04-15 深圳市迅雷网络技术有限公司 一种图像处理方法和装置
WO2019062608A1 (zh) * 2017-09-30 2019-04-04 深圳市商汤科技有限公司 图像处理方法和装置、电子设备、计算机存储介质
CN108596828A (zh) * 2018-04-18 2018-09-28 网易(杭州)网络有限公司 图像泛光处理方法与装置、电子设备、存储介质
CN109712090A (zh) * 2018-12-18 2019-05-03 维沃移动通信有限公司 一种图像处理方法、装置和移动终端
CN110390632A (zh) * 2019-07-22 2019-10-29 北京七鑫易维信息技术有限公司 基于妆容模板的图像处理方法、装置、存储介质及终端
CN111369644A (zh) * 2020-02-28 2020-07-03 北京旷视科技有限公司 人脸图像的试妆处理方法、装置、计算机设备和存储介质

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113689363A (zh) * 2021-09-07 2021-11-23 北京顺势兄弟科技有限公司 一种人像图处理方法、装置、电子设备、存储介质
CN113689363B (zh) * 2021-09-07 2024-03-29 北京顺势兄弟科技有限公司 一种人像图处理方法、装置、电子设备、存储介质
CN113781309A (zh) * 2021-09-17 2021-12-10 北京金山云网络技术有限公司 图像处理方法、装置及电子设备
CN116433827A (zh) * 2023-04-07 2023-07-14 广州趣研网络科技有限公司 虚拟人脸形象的生成方法、展示方法、生成及展示装置
CN116433827B (zh) * 2023-04-07 2024-06-07 广州趣研网络科技有限公司 虚拟人脸形象的生成方法、展示方法、生成及展示装置

Also Published As

Publication number Publication date
CN111369644A (zh) 2020-07-03

Similar Documents

Publication Publication Date Title
WO2021169307A1 (zh) 人脸图像的试妆处理方法、装置、计算机设备和存储介质
CN109829930B (zh) 人脸图像处理方法、装置、计算机设备及可读存储介质
US20230401682A1 (en) Styled image generation method, model training method, apparatus, device, and medium
US8620038B2 (en) Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
US8265351B2 (en) Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
US10403036B2 (en) Rendering glasses shadows
US8660319B2 (en) Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
WO2022179025A1 (zh) 图像处理方法及装置、电子设备和存储介质
WO2017016171A1 (zh) 用于终端设备的窗口显示处理方法、装置、设备及存储介质
CN112241933A (zh) 人脸图像处理方法、装置、存储介质及电子设备
CN109672830B (zh) 图像处理方法、装置、电子设备及存储介质
CN112308944A (zh) 仿真唇妆的扩增实境显示方法
WO2023093291A1 (zh) 图像处理方法、装置、计算机设备和计算机程序产品
WO2022132032A1 (zh) 人像图像处理方法及装置
JP2021144582A (ja) メイクアップシミュレーション装置、メイクアップシミュレーション方法及びプログラム
WO2021128593A1 (zh) 人脸图像处理的方法、装置及系统
CN110689546A (zh) 个性化头像的生成方法、装置、设备及存储介质
CN112967193A (zh) 图像校准方法及装置、计算机可读介质和电子设备
WO2024125328A1 (zh) 直播图像帧处理方法、装置、设备、可读存储介质及产品
JP4372494B2 (ja) 画像処理装置、画像処理方法、プログラム、記録媒体
WO2023151214A1 (zh) 图像生成方法、系统、电子设备、存储介质和产品
WO2021035979A1 (zh) 边缘学习的图像填充方法、装置、终端及可读存储介质
WO2023273111A1 (zh) 一种图像处理方法、装置、计算机设备和存储介质
CN115953597B (zh) 图像处理方法、装置、设备及介质
CN113223128B (zh) 用于生成图像的方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20921373

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.02.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20921373

Country of ref document: EP

Kind code of ref document: A1