WO2024041623A1 - Method, apparatus, device and storage medium for generating special-effect images - Google Patents

Method, apparatus, device and storage medium for generating special-effect images

Info

Publication number
WO2024041623A1
WO2024041623A1 (PCT/CN2023/114831)
Authority
WO
WIPO (PCT)
Prior art keywords
map
illumination
target object
light source
target
Prior art date
Application number
PCT/CN2023/114831
Other languages
English (en)
French (fr)
Inventor
袁琦
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司 filed Critical 北京字跳网络技术有限公司
Publication of WO2024041623A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 — Image enhancement or restoration
    • G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00 — Image analysis
    • G06T7/70 — Determining position or orientation of objects or cameras
    • G06T7/90 — Determination of colour characteristics
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10004 — Still image; Photographic image
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20212 — Image combination
    • G06T2207/20221 — Image fusion; Image merging

Definitions

  • Embodiments of the present disclosure relate to the field of image processing technology, for example, to a method, apparatus, device and storage medium for generating special-effect images.
  • Embodiments of the present disclosure provide a method, apparatus, device and storage medium for generating special-effect images, which can generate special-effect images with lighting effects and enrich the display content of images.
  • embodiments of the present disclosure provide a method for generating special-effect images, including: acquiring current light source information, a normal map of an original image and a target object mask map, wherein the light source information includes a light source color, a light source position and an illumination intensity; generating a target illumination map according to the normal map, the target object mask map and the light source information; and fusing the target illumination map with the original image to obtain a target illumination special-effect map.
  • embodiments of the present disclosure also provide a device for generating special effects graphics, including:
  • the acquisition module is configured to acquire the current light source information, the normal map of the original image and the target object mask map; wherein the light source information includes light source color, light source position and illumination intensity;
  • the target illumination map generation module is configured to generate a target illumination map according to the normal map, the target object mask map and the light source information;
  • the target lighting special effects map acquisition module is configured to fuse the target lighting map and the original image to obtain the target lighting special effects map.
  • embodiments of the present disclosure also provide an electronic device, including:
  • at least one processor;
  • a storage device configured to store at least one program which, when executed by the at least one processor, causes the at least one processor to implement the method for generating a special-effect map as described in any embodiment of the present disclosure.
  • embodiments of the disclosure further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the method for generating a special-effect map as described in any embodiment of the disclosure.
  • Figure 1 is a schematic flowchart of a method for generating special-effect images provided by an embodiment of the present disclosure
  • Figure 2a is an example diagram of a target object mask map provided by an embodiment of the present disclosure
  • Figure 2b is an example diagram of a second illumination intensity map provided by an embodiment of the present disclosure
  • Figure 2c is an example diagram of an illumination intensity mask map provided by an embodiment of the present disclosure
  • Figure 3a is an example diagram of a reverse target object mask map provided by an embodiment of the present disclosure
  • Figure 3b is an example diagram of a first backlight intensity map provided by an embodiment of the present disclosure
  • Figure 3c is an example diagram of a first blur mask map provided by an embodiment of the present disclosure
  • Figure 3d is an example diagram of a second blur mask map provided by an embodiment of the present disclosure
  • Figure 3e is an example diagram of a fusion mask map provided by an embodiment of the present disclosure
  • Figure 3f is an example diagram of a second backlight intensity map provided by an embodiment of the present disclosure
  • Figure 3g is an example diagram of a target backlight intensity map provided by an embodiment of the present disclosure
  • Figure 4a is an example diagram of a grayscale image provided by an embodiment of the present disclosure
  • Figure 4b is an example diagram of a local object mask map provided by an embodiment of the present disclosure
  • Figure 4c is an example diagram of a smoothed grayscale image provided by an embodiment of the present disclosure
  • Figure 4d is an example diagram of a smoothed local object map provided by an embodiment of the present disclosure
  • Figure 5 is a schematic structural diagram of an apparatus for generating special-effect images provided by an embodiment of the present disclosure
  • Figure 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure
  • the term “include” and its variations are open-ended, ie, “including but not limited to.”
  • the term “based on” means “based at least in part on.”
  • the term “one embodiment” means “at least one embodiment”; the term “another embodiment” means “at least one additional embodiment”; and the term “some embodiments” means “at least some embodiments”. Relevant definitions of other terms will be given in the description below.
  • for example, in response to receiving an active request from a user, a prompt message is sent to the user to clearly remind the user that the requested operation will require the acquisition and use of the user's personal information. Users can therefore autonomously choose, based on the prompt information, whether to provide personal information to software or hardware such as electronic devices, applications, servers or storage media that perform the operations of the technical solution of the present disclosure.
  • the method of sending prompt information to the user may be, for example, a pop-up window, and the prompt information may be presented in the form of text in the pop-up window.
  • the pop-up window can also contain a selection control for the user to choose "agree” or "disagree” to provide personal information to the electronic device.
  • Figure 1 is a schematic flowchart of a method for generating a special effects map provided by an embodiment of the present disclosure.
  • the embodiment of the present disclosure is applicable to the situation of generating a lighting special effects map.
  • the method can be executed by a device for generating special-effect maps, and the device can be implemented in the form of at least one of software and hardware, optionally through an electronic device.
  • the electronic device can be a mobile terminal, a personal computer (Personal Computer, PC) or a server.
  • the method includes:
  • S110: Acquire current light source information, the normal map of the original image and the target object mask map; the light source information includes light source color, light source position and illumination intensity.
  • the light source can be a virtual light source, and the light source position can change based on the user's trigger operation.
  • the virtual light source can be generated as follows: generate a transformable (rotation, translation and scaling) virtual object and place it at the world-coordinate origin; generate a transformable light source object and place it offset at a distance d from the world-coordinate origin; make the light source object a child object of the virtual object, capture touch-screen drag operations, and map the coordinate transformation of each drag to a coordinate transformation of the virtual object, so that dragging on the touch screen moves the light source object over a sphere.
  • d serves as the radius of the sphere, and the light source object is the virtual light source.
  • the light source color and light intensity can be preset, that is, they can be set by the user.
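  • For illustration, the following Python sketch shows one way the sphere-constrained virtual light source described above could be driven by touch drags. It is a minimal sketch, not the patent's implementation; the drag-to-rotation mapping and all names are assumptions.

```python
import numpy as np

class VirtualLightSource:
    """Light source object parented to a virtual object at the world origin.

    A touch-screen drag is mapped to a rotation of the parent object, so the
    child light source moves over a sphere of radius d around the origin.
    """

    def __init__(self, d=1.0):
        self.d = d          # offset from the world origin, used as sphere radius
        self.yaw = 0.0      # accumulated rotation about the y axis
        self.pitch = 0.0    # accumulated rotation about the x axis

    def on_drag(self, dx, dy, sensitivity=0.01):
        # Map the 2D drag delta to a coordinate transformation of the parent.
        self.yaw += dx * sensitivity
        self.pitch = np.clip(self.pitch + dy * sensitivity, -1.5, 1.5)

    def world_position(self):
        # Child offset (0, 0, d) rotated about x by pitch, then about y by yaw.
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cy, sy = np.cos(self.yaw), np.sin(self.yaw)
        return np.array([self.d * cp * sy, -self.d * sp, self.d * cp * cy])
```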
  • the normal map may be an image composed of normal information of multiple pixels in the original image.
  • the normal information is represented by a three-dimensional vector, that is, the normal vector.
  • the three components of the normal information are mapped to three color channel values respectively, thereby obtaining the normal map.
  • any normal-estimation algorithm can be used to determine the normal information of the pixels in the original image, and the normal map is generated from that normal information.
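  • The channel mapping is conventionally the affine [-1, 1] → [0, 1] remap per component; the patent does not fix the exact encoding, so the sketch below is an assumption:

```python
import numpy as np

def encode_normal_map(normals):
    """Map unit normals (H, W, 3) with components in [-1, 1] to RGB in [0, 1]."""
    return normals * 0.5 + 0.5

def decode_normal_map(normal_map):
    """Recover (re-normalized) unit normals from an RGB normal map."""
    n = normal_map * 2.0 - 1.0
    return n / np.linalg.norm(n, axis=-1, keepdims=True)
```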
  • the target object can be any object such as a person, animal or plant, and is not limited here.
  • the pixel value of a pixel in the target object mask represents the confidence that the pixel belongs to the target object.
  • the target object mask map of the original image may be obtained by identifying the target object in the original image, obtaining the confidence that each pixel belongs to the target object, and generating the target object mask map from the confidences of all pixels.
  • the pixel values of the target object mask map lie between 0 and 1: "0" means the pixel does not belong to the target object and appears black in the mask, and "1" means the pixel belongs to the target object and appears white in the mask.
  • Figure 2a is an example of a target object mask in this embodiment. As shown in Figure 2a, the target object is a portrait, the white area is the portrait, and the black area is the background area.
  • S120: Generate a target illumination map based on the normal map, the target object mask map and the light source information.
  • the color value of a pixel in the target lighting map represents the lighting color of the pixel.
  • the target illumination map consists of three parts: the target object illumination map, the background area illumination map, and the back illumination map.
  • the target illumination map may be generated based on the normal map, the target object mask map and the light source information as follows: generate a target object illumination map based on the normal map, the light source information and the target object mask map; generate a background area illumination map based on the light source information and the target object mask map; generate a back illumination map based on the normal map and the target object mask map; and fuse the target object illumination map, the background area illumination map and the back illumination map to obtain the target illumination map.
  • the target object lighting map is generated based on the lighting colors of multiple pixels of the target object.
  • the background area lighting map is generated based on the lighting colors of multiple pixels in the background area.
  • the backlight map is generated based on the backlight colors of multiple pixels in the original image.
  • the process of generating a target object illumination map based on the normal map, light source information, and target object mask map may be: determining the illumination intensity of the pixel points of the target object based on the normal map, light source information, and target object mask map, Fusion of illumination intensity and light source color is used to obtain the illumination color of the target object's pixels, and a target object illumination map is generated based on the illumination color of the target object's pixels.
  • the background area illumination map may be generated based on the light source information and the target object mask map as follows: determine the illumination intensities of the pixels of the background area based on the light source information and the target object mask map, fuse the illumination intensities with the light source color to obtain the lighting colors of the pixels of the background area, and generate the background area illumination map from those lighting colors.
  • the back illumination map may be generated based on the normal map and the target object mask map as follows: determine the backlight intensities of the pixels of the original image based on the normal map and the target object mask map, fuse the backlight intensities with the light source color to obtain the backlight color of each pixel, and generate the back illumination map from the backlight colors.
  • the target illumination map is thus generated from the target object illumination map, the background area illumination map and the back illumination map; different illumination maps are generated according to image region and illumination direction, so the final result presents an effect of alternating light and dark.
  • the target object illumination map may be generated based on the normal map, the light source information and the target object mask map as follows: determine a first illumination intensity map based on the normal map and the light source information; fuse the first illumination intensity map with the light source color to obtain an initial illumination map; and fuse the initial illumination map with the target object mask map to obtain the target object illumination map.
  • the pixel value of a pixel in the first light intensity map represents the light intensity of the pixel.
  • the first illumination intensity map may be determined based on the normal map and the light source information as follows: determine the intensity attenuation information of each pixel based on the normal information in the normal map and the light source information, adjust the illumination intensity based on the intensity attenuation information to obtain the illumination intensity of the pixel, and generate the first illumination intensity map from the illumination intensities of all pixels.
  • the method of fusing the first illumination intensity map and the light source color may be: multiplying the illumination intensity of the pixel in the first illumination intensity map by the light source color to obtain the illumination color of the pixel.
  • the method of fusing the initial illumination map and the target object mask map may be: multiplying the illumination color of the initial illumination map and the pixel value of the corresponding pixel in the target object mask map to obtain the target object illumination map.
  • the initial illumination map is fused with the target object mask map, so that the illumination color of the pixels of the target object can be accurately determined.
  • alternatively, the first illumination intensity map may be determined based on the normal map and the light source information as follows: determine first angle information between the incident light and the pixels of the original image based on the normal map and the light source information; determine intensity attenuation information based on the distance between the light source and the pixels of the original image; adjust the illumination intensity according to the first angle information and the attenuation information to obtain the target intensity of each pixel; and generate the first illumination intensity map from the target intensities of the pixels.
  • the first angle information between the incident light and a pixel may be determined based on the normal map and the light source information as follows: determine the illumination direction vector, extract the normal vector of the pixel from the normal map, normalize the illumination direction vector and the normal vector of each pixel, and take the dot product of the normalized illumination direction vector and normal vector to obtain the first angle information.
  • the lighting direction vector can be a vector where the light source position (represented by world coordinates) points to the center position of the target object (converting screen coordinates to world coordinates), or a vector where the light source position points to the center position of the target object's face.
  • the attenuation information may be determined based on the distance between the light source and the pixels of the original image as follows: convert the light source position from world coordinates to screen coordinates, compute the distance between the light source position in the screen coordinate system and each pixel of the original image, subtract the ratio of the distance to the halo radius from 1 to obtain an intermediate result value, and raise the intermediate result value to the power of a set attenuation value to obtain the attenuation information.
  • the light source position may be converted from world coordinates to screen coordinates by left-multiplying the light source world coordinates by the Model View Projection (MVP) transformation matrix to obtain the light source projection coordinates, and then linearly transforming the x and y components of the projection coordinates to obtain the light source screen coordinates.
  • the illumination intensity may be adjusted according to the first angle information and the intensity attenuation information by multiplying the illumination intensity by the first angle information and then by the intensity attenuation information, obtaining the target intensity of the pixel.
  • the method of generating the first illumination intensity map based on the target intensity of the pixel may be: using the target intensity as the pixel value of the pixel, thereby obtaining the first illumination intensity map.
  • the illumination intensity is adjusted according to the first angle information and the intensity attenuation information, which can improve the accuracy of determining the target intensity of the pixel point.
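  • Putting the preceding steps together, the following is a minimal NumPy sketch of the first illumination intensity map. It is illustrative only; the clamping of the dot product and all parameter names are assumptions not fixed by the text.

```python
import numpy as np

def light_screen_position(light_world, mvp, width, height):
    """Left-multiply the light's world position by the MVP matrix, then
    linearly map the x/y components of the projection to screen pixels."""
    p = mvp @ np.append(light_world, 1.0)
    ndc = p[:2] / p[3]                                   # perspective divide
    return (ndc * 0.5 + 0.5) * np.array([width, height])

def first_intensity_map(normals, light_dir, pixel_xy, light_xy,
                        intensity, halo_radius, attenuation_value):
    """normals: (H, W, 3) unit normals; light_dir: unit illumination direction;
    pixel_xy: (H, W, 2) pixel coordinates; light_xy: light screen position."""
    # First angle information: dot of normalized light direction and normal.
    angle = np.clip(normals @ light_dir, 0.0, 1.0)
    # Attenuation: (1 - distance / halo_radius) raised to the set attenuation value.
    dist = np.linalg.norm(pixel_xy - light_xy, axis=-1)
    atten = np.clip(1.0 - dist / halo_radius, 0.0, 1.0) ** attenuation_value
    # Target intensity: illumination intensity times angle times attenuation.
    return intensity * angle * atten
```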
  • the background area illumination map may be generated based on the light source information and the target object mask map as follows: obtain the distance between the light source and the pixels of the original image; generate a second illumination intensity map based on the distance and the illumination intensity; fuse the second illumination intensity map with the target object mask map to obtain an illumination intensity mask map; and fuse the set color with the light source color based on the illumination intensity mask map to obtain the background area illumination map.
  • the distance between the light source and the pixels of the original image may be obtained by converting the light source position from world coordinates to screen coordinates and computing the distance between the light source position in the screen coordinate system and each pixel of the original image.
  • the second illumination intensity map may be generated based on the distance and the illumination intensity as follows: raise the distance to a set exponent, subtract the result from 1 to obtain an intermediate result, and multiply the intermediate result by the illumination intensity of the light source to obtain the target intensity of the pixel; the second illumination intensity map is then generated from the target intensities of the pixels.
  • FIG. 2b is an example of the second illumination intensity map in this embodiment.
  • the second illumination intensity map and the target object mask map may be fused by subtracting the pixel value of the corresponding pixel in the target object mask map from the pixel value in the second illumination intensity map and clamping the result to [0, 1], yielding the illumination intensity mask map.
  • Figure 2c is an example of the light intensity mask map in this embodiment. As shown in Figure 2c, the light intensity mask map can be understood as the light intensity map of the target object extracted from the second light intensity map.
  • the set color can be black, and the corresponding color value is (0, 0, 0).
  • the set color and the light source color are fused based on the illumination intensity mask map as follows: use the pixel value of the illumination intensity mask map as the weighting coefficient of the lighting color, and 1 minus that pixel value as the weighting coefficient of the set color; compute the weighted sum of the set color and the lighting color; and multiply the weighted sum by a set value to obtain the background-area lighting color of the pixel, thereby obtaining the background area illumination map.
  • the setting value can be set to 0.5.
  • the illumination colors of the determined background area illumination map and the target object illumination map are different, so that the image presents an alternating light and dark effect.
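  • A sketch of the background-area computation under the steps above. The distance exponent k and the normalization of dist are assumptions; the set value 0.5 and black set color follow the text.

```python
import numpy as np

def background_illumination(dist, intensity, subject_mask, light_color,
                            set_color=np.zeros(3), k=2.0, set_value=0.5):
    """dist: (H, W) distance from the light's screen position, normalized to [0, 1];
    subject_mask: (H, W) target object mask in [0, 1]; light_color: RGB in [0, 1]."""
    # Second illumination intensity map: exponentiate the distance,
    # subtract from 1, and multiply by the light intensity.
    second = (1.0 - dist ** k) * intensity
    # Illumination intensity mask: subtract the subject mask, clamp to [0, 1].
    weight = np.clip(second - subject_mask, 0.0, 1.0)[..., None]
    # Weighted sum of light color and set color, scaled by the set value.
    return set_value * (weight * light_color + (1.0 - weight) * set_color)
```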
  • the back illumination map may be generated based on the normal map and the target object mask map as follows: determine a first backlight intensity map based on the normal map, the target object mask map and the viewing angle information; perform reverse processing on the target object mask map to obtain a reverse target object mask map; generate a second backlight intensity map based on the target object mask map and the reverse target object mask map; fuse the first backlight intensity map with the second backlight intensity map to obtain the target backlight intensity map; and fuse the target backlight intensity map with the light source color to obtain the back illumination map.
  • the viewing angle information can be the viewing angle of the virtual camera corresponding to the current image and can be represented by a viewing-angle direction vector.
  • the reverse processing method for the target object mask map may be: subtracting the pixel value of each pixel in the target object mask map from 1 to obtain the reverse target object mask map.
  • Figure 3a is an example of the reverse target object mask map in this embodiment; as shown in Figure 3a, compared with the target object mask map, the portrait area becomes black and the background area becomes white.
  • the first backlight intensity map may be determined based on the normal map, the target object mask map and the viewing angle information as follows: determine second angle information between the normal information of each pixel in the normal map and the viewing angle information; determine the initial backlight intensity of the pixel based on the second angle information; generate an initial backlight intensity map from the initial backlight intensities; and fuse the initial backlight intensity map with the target object mask map to obtain the first backlight intensity map.
  • the second angle information between the normal information of each pixel and the viewing angle information may be determined as follows: normalize the normal vector and the viewing-angle direction vector, take their dot product, and clamp the result to between 0 and 1 to obtain the second angle information.
  • the initial backlight intensity of a pixel may be determined from the second angle information by subtracting the second angle information from 1, raising the result to the power of a set control intensity, and taking the exponentiation result as the initial backlight intensity of the pixel.
  • the set control strength may be a value set by the user.
  • the method of fusing the initial backlight intensity map and the target object mask map may be: multiplying the pixel values of the pixels in the initial backlight intensity map by the pixel values of the corresponding pixels in the target object mask map.
  • FIG. 3b is an example of the first backlight intensity map in this embodiment.
  • an intensity map with a contour light effect can be generated.
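  • The rim term above reduces to a familiar Fresnel-style expression; a sketch follows (the clamp and parameter names are assumptions):

```python
import numpy as np

def first_backlight_intensity(normals, view_dir, subject_mask, control_intensity):
    """Rim-style intensity map: brightest where normals face away from the camera."""
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    v = view_dir / np.linalg.norm(view_dir)
    ndotv = np.clip(n @ v, 0.0, 1.0)          # second angle information
    rim = (1.0 - ndotv) ** control_intensity  # initial backlight intensity
    return rim * subject_mask                 # restrict to the target object
```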
  • the second backlight intensity map may be generated based on the target object mask map and the reverse target object mask map as follows: blur the target object mask map and the reverse target object mask map separately to obtain a first blur mask map and a second blur mask map; fuse the second blur mask map with the reverse target object mask map to obtain a fusion mask map; and fuse the fusion mask map with the first blur mask map to obtain the second backlight intensity map.
  • the first blur mask map may be the image obtained by blurring the target object mask map, and the second blur mask map may be the image obtained by blurring the reverse target object mask map.
  • the blur can be a Gaussian blur.
  • FIG. 3c is an example of the first blur mask image
  • FIG. 3d is an example image of the second blur mask image.
  • the second blur mask map and the reverse target object mask map may be fused by taking, per pixel, the maximum of the pixel value of the second blur mask map and the corresponding pixel value of the reverse target object mask map, and generating the fusion mask map from these maximum pixel values.
  • FIG. 3e is an example diagram of the fused mask image in this embodiment.
  • the fusion mask map and the first blur mask map may be fused by taking, per pixel, the minimum of the pixel value of the fusion mask map and the corresponding pixel value of the first blur mask map, and generating the second backlight intensity map from these minimum pixel values.
  • FIG. 3f is an example of the second backlight intensity map in this embodiment.
  • the second backlight intensity map has the effect of stroking the target object.
  • the first backlight intensity map and the second backlight intensity map may be fused by adding the pixel values of the first backlight intensity map to the pixel values of the second backlight intensity map, obtaining the target backlight intensity map.
  • FIG. 3g is an example of the target back-illumination intensity map in this embodiment.
  • the method of fusing the target backlight intensity map and the light source color can be: multiply the light intensity in the target backlight intensity map by the light source color to obtain the backlight map.
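  • The blur-based stroke and the final back illumination map can be sketched as follows (Gaussian blur per the text; the sigma value is an assumption):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def target_backlight_map(first_rim, subject_mask, light_color, sigma=5.0):
    """Combine the rim term with a blur-based stroke around the subject."""
    inv_mask = 1.0 - subject_mask                     # reverse target object mask
    blur_mask = gaussian_filter(subject_mask, sigma)  # first blur mask map
    blur_inv = gaussian_filter(inv_mask, sigma)       # second blur mask map
    fusion = np.maximum(blur_inv, inv_mask)           # per-pixel maximum
    second_rim = np.minimum(fusion, blur_mask)        # per-pixel minimum (stroke)
    target_rim = first_rim + second_rim               # target backlight intensity map
    return target_rim[..., None] * light_color        # back illumination map
```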
  • the target object illumination map, the background area illumination map and the back illumination map may be fused into the target illumination map as follows: determine the relative position between the light source and the target object based on the first angle information, and fuse the target object illumination map, the background area illumination map and the back illumination map based on that relative position to obtain the target illumination map.
  • the relative position includes the light source being located in the front direction of the target object, the light source being located in the back direction of the target object, and the light source being located in the side direction of the target object.
  • the first angle information is the dot product of the normal vector of the target object and the illumination direction vector.
  • the relative position between the light source and the target object may be determined from the first angle information as follows: if the first angle information glare lies in the range (t, 1], i.e. t < glare ≤ 1, the light source is located in front of the target object; correspondingly, glare in [-1, -t) indicates that the light source is behind the target object, and glare in [-t, t] indicates that the light source is to the side of the target object.
  • t can be a value between 0 and 1, for example 0.1 or 0.2.
  • the target object illumination map, the background area illumination map and the back illumination map are then fused as follows: in response to the light source being located in front of the target object, fuse the target object illumination map with the background area illumination map to obtain the target illumination map; in response to the light source being located behind the target object, fuse the background area illumination map with the back illumination map to obtain the target illumination map; in response to the light source being located to the side of the target object, interpolate and fuse the target object illumination map with the back illumination map to obtain an intermediate illumination map, and fuse the intermediate illumination map with the background area illumination map to obtain the target illumination map.
  • the way to fuse the target object's light map and the background area's light map may be to add the pixel values of the target object's light map and the pixel values of the background area's light map.
  • the way to fuse the background area illumination map and the back illumination map is to add the pixel values of the background area illumination map and the pixel values of the back illumination map.
  • the target object illumination map and the back illumination map may be interpolated and fused as follows: determine a mapping from [-t, t] to [0, 1] by interpolation, determine the target value corresponding to the first angle information according to this mapping, use 1 minus the target value as the weighting coefficient of the target object illumination map and the target value as the weighting coefficient of the back illumination map, and compute the weighted sum of the target object illumination map and the back illumination map to obtain the intermediate illumination map.
  • the way to fuse the intermediate lighting map and the background area lighting map may be to add the pixel values of the intermediate lighting map and the pixel values of the background area lighting map.
  • the target illumination map is determined based on the relative position between the light source and the target object, which can improve the accuracy and realism of the target illumination map.
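  • The three-way fusion can be sketched as below. The text states only that [-t, t] maps to [0, 1]; the direction of the mapping here is chosen so the blend is continuous with the front and back cases, which is an assumption.

```python
import numpy as np

def fuse_illumination(glare, t, subject_light, background_light, back_light):
    """glare: dot of the subject's normal and the illumination direction."""
    if glare > t:                    # light in front of the subject
        return subject_light + background_light
    if glare < -t:                   # light behind the subject
        return background_light + back_light
    # Light to the side: map glare from [-t, t] to a weight in [0, 1] and
    # interpolate; w = 0 at glare = t (all subject light), w = 1 at glare = -t.
    w = (t - glare) / (2.0 * t)
    intermediate = (1.0 - w) * subject_light + w * back_light
    return intermediate + background_light
```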
  • the way to fuse the target illumination map and the original image may be to add the color values of the target illumination map and the color values of the original image.
  • the method may further include the following steps: obtain a grayscale image and a local object mask map of the original image; fuse the grayscale image with the local object mask map to obtain a local object map; adjust the light source color based on the normal map, the viewing angle information and the light source position to obtain a target lighting color; fuse the set color with the target lighting color based on the local object map to obtain a local object lighting special-effect map; and fuse the local object lighting special-effect map with the target object illumination map to obtain an updated target object illumination map.
  • the local object is an object composed of a local area of the target object; for example, when the target object is a portrait, the local object can be hair.
  • the method of obtaining the grayscale image of the original image may be: performing grayscale processing on the original image to obtain the grayscale image.
  • FIG. 4a is an example of a grayscale image in this embodiment.
  • the method of obtaining the local object mask map of the original image may be: identifying the local objects in the original image and obtaining the local object mask map.
  • FIG. 4b is an example diagram of a local object mask map in this embodiment.
  • the method of fusing the grayscale image and the local object mask map to obtain the local object map may be: smoothing the grayscale image to obtain a smooth grayscale image; merging the smoothed grayscale image and the local object mask map The graphs are fused to obtain a smooth local object graph.
  • the grayscale image may be smoothed as follows: determine the first difference between the gray value and a first set value, determine the second difference between a second set value and the first set value, compute the ratio of the first difference to the second difference, clamp the ratio to between 0 and 1, apply Nth-power processing to the clamped ratio to obtain the processed gray value, and generate the smoothed grayscale image from the processed gray values.
  • the first set value may be 0, the second set value may be 0.5, and N may be 3.
  • the Nth-power processing of the ratio value can be expressed as x*x*(a-b*x), where a and b are constants and x is the clamped ratio.
  • Figure 4c is an example of a smoothed grayscale image in this embodiment.
  • the method of fusing the smooth grayscale image and the local object mask map may be: multiplying the pixel values of the smooth grayscale image and the pixel values of the local object mask map.
  • FIG. 4d is an example diagram of a smoothed local object graph in this embodiment.
  • smoothing the grayscale image can highlight more details of the local object, thereby making the local object have a highlight effect.
  • smoothing the grayscale image can obtain more detailed hair strands, thereby obtaining a more accurate lighting effect.
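  • With the stated constants (first set value 0, second set value 0.5, N = 3), the smoothing is a smoothstep-style remap; a sketch follows, assuming the conventional constants a = 3 and b = 2 in x*x*(a - b*x):

```python
import numpy as np

def smooth_grayscale(gray, first_set=0.0, second_set=0.5):
    """Smoothstep-style remap of gray values in [0, 1]."""
    x = np.clip((gray - first_set) / (second_set - first_set), 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)   # x*x*(a - b*x) with a = 3, b = 2

def smooth_local_object_map(gray, local_mask):
    """Fuse the smoothed grayscale image with the local object (e.g. hair) mask."""
    return smooth_grayscale(gray) * local_mask
```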
  • the light source color is adjusted according to the normal map, the viewing angle information and the light source position to obtain the target lighting color as follows: determine the reflected lighting direction according to the normal map and the light source position; determine third angle information between the viewing angle and the reflected lighting direction; and adjust the light source color according to the third angle information to obtain the target lighting color.
  • the reflected lighting direction can be understood as the direction of the reflected light ray.
  • the reflected lighting direction may be determined from the normal map and the light source position as follows: determine the illumination direction vector from the light source position and the center position of the target object, then linearly combine the illumination direction vector and the normal vector to obtain the reflected lighting direction vector. The linear calculation can be: compute the dot product of the illumination direction vector and the normal vector, multiply the dot product by a set value and by the normal vector, and subtract the resulting vector from the illumination direction vector; the set value can be 2.
  • the third angle information between the viewing angle and the reflected lighting direction may be determined by taking the dot product of the viewing-angle direction vector and the reflected lighting direction vector as the third angle information.
  • the light source color is adjusted according to the third angle information to obtain the target lighting color as follows: raise the third angle information to the power of a set highlight value and multiply the result by the light source color to obtain the target lighting color.
  • the set highlight value is the value set by the user.
  • the color of the light source is adjusted according to the third angle information, so that local objects can exhibit a highlight effect.
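  • These steps amount to a Phong-style specular term; a sketch follows (the clamp on the dot product is an assumption):

```python
import numpy as np

def target_lighting_color(normals, light_dir, view_dir, light_color, highlight_value):
    """Highlight color from the reflected lighting direction."""
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    ndotl = (n @ l)[..., None]
    r = l - 2.0 * ndotl * n                            # reflection, set value = 2
    third = np.clip(np.sum(v * r, axis=-1), 0.0, 1.0)  # third angle information
    return (third ** highlight_value)[..., None] * light_color
```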
  • the method of fusing the set color and the target lighting color based on the local object graph to obtain the local object lighting special effects map can be: fusing the set color and the target lighting color based on the smoothed local object graph to obtain the local object lighting special effects. picture.
  • the set color can be black, and the corresponding color value is (0,0,0).
  • the set color and the target lighting color may be fused based on the smoothed local object map as follows: use the pixel value of the smoothed local object map as the weighting coefficient of the target lighting color and 1 minus that pixel value as the weighting coefficient of the set color, then compute the weighted sum of the set color and the target lighting color based on these coefficients.
  • the way to fuse the local object lighting special effects map and the target object lighting map is to add the pixel values of the local object lighting special effects map and the corresponding pixel values in the target object lighting map.
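  • A sketch of this composition step (names assumed), blending the highlight color over the smoothed local object map and adding it to the subject's illumination map:

```python
import numpy as np

def update_subject_light(smooth_local, target_color, subject_light,
                         set_color=np.zeros(3)):
    """smooth_local: (H, W) smoothed local object map used as the blend weight;
    target_color: (H, W, 3) target lighting color from the reflection step."""
    w = smooth_local[..., None]
    local_fx = w * target_color + (1.0 - w) * set_color  # local object lighting map
    return subject_light + local_fx                      # updated subject lighting
```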
  • in this case, the target object illumination map, the background area illumination map and the back illumination map are fused as follows: fuse the updated target object illumination map, the background area illumination map and the back illumination map to obtain the target illumination map.
  • the technical solution of the embodiments of the present disclosure acquires the current light source information, the normal map of the original image and the target object mask map, where the light source information includes the light source color, the light source position and the illumination intensity; generates a target illumination map according to the normal map, the target object mask map and the light source information; and fuses the target illumination map with the original image to obtain the target illumination special-effect map.
  • the special effects map generation method provided by the embodiments of the present disclosure generates a target lighting map based on the normal map, target object mask map, and light source information, which can generate special effects maps with lighting effects and enrich the display content of the image.
  • Figure 5 is a schematic structural diagram of a device for generating special effects graphics provided by an embodiment of the present disclosure. As shown in Figure 5, the device includes:
  • the acquisition module 510 is configured to acquire the current light source information, the normal map of the original image and the target object mask map; wherein the light source information includes light source color, light source position and illumination intensity;
  • the target lighting map generation module 520 is configured to generate a target lighting map based on the normal map, the target object mask map and the light source information;
  • the target lighting special effects map acquisition module 530 is configured to fuse the target lighting map and the original image to obtain a target lighting special effects map.
  • the target light map generation module 520 is also set to:
  • the target object illumination map, the background area illumination map and the back illumination map are fused to obtain a target illumination map.
  • the target light map generation module 520 is also set to:
  • the initial illumination map is fused with the target object mask map to obtain a target object illumination map.
  • the target light map generation module 520 is also set to:
  • a first light intensity map is generated based on the target intensity of the pixel.
  • the target light map generation module 520 is also set to:
  • the set color and the light source color are fused based on the light intensity mask map to obtain a background area illumination map.
  • the target light map generation module 520 is also set to:
  • the first backlight intensity map and the second backlight intensity map are fused to obtain a target backlight intensity map;
  • the target backlight intensity map and the light source color are fused to obtain a back illumination map.
  • the target light map generation module 520 is also set to:
  • the initial backlight intensity map and the target object mask map are fused to obtain a first backlight intensity map.
  • the target light map generation module 520 is also set to:
  • the fusion mask image is fused with the first blur mask image to obtain a second backlight intensity map.
  • target object light map update module set to:
  • obtain the grayscale image and the local object mask map of the original image, wherein the local object is an object composed of a local area of the target object;
  • the local object lighting special effects map and the target object lighting map are fused to obtain an updated target object lighting map.
  • the target object light map update module is also set to:
  • the set color and the target lighting color are fused based on the smoothed local object map to obtain a local object lighting special effect map.
  • the target object light map update module is also set to:
  • the light source color is adjusted according to the third included angle information to obtain a target illumination color.
  • the target light map generation module 520 is also set to:
  • the relative position between the light source and the target object is determined according to the first angle information; wherein the relative position includes the light source being located in the forward direction of the target object, the light source being located in the back direction of the target object, and the light source being located in the lateral direction of the target object;
  • the target object illumination map, the background area illumination map and the back illumination map are fused to obtain a target illumination map.
  • the target light map generation module 520 is also set to:
  • in response to the light source being located in front of the target object, fuse the target object illumination map with the background area illumination map to obtain the target illumination map;
  • in response to the light source being located behind the target object, fuse the background area illumination map with the back illumination map to obtain the target illumination map;
  • in response to the light source being located to the side of the target object, interpolate and fuse the target object illumination map with the back illumination map to obtain an intermediate illumination map, and fuse the intermediate illumination map with the background area illumination map to obtain the target illumination map.
  • the special effects diagram generation device provided by the embodiments of the present disclosure can execute the special effects diagram generation method provided by any embodiment of the present disclosure, and has corresponding functional modules and effects for executing the method.
  • FIG. 6 is a schematic structural diagram of an electronic device 500 (such as a terminal device or a server) suitable for implementing embodiments of the present disclosure.
  • electronic devices in embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable multimedia players (PMP) and vehicle-mounted terminals (such as vehicle-mounted navigation terminals), and fixed terminals such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 6 is only an example and should not impose any limitations on the functions and scope of use of the embodiments of the present disclosure.
  • the electronic device 500 may include a processor (such as a central processing unit or a graphics processor) 501, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503.
  • the RAM 503 also stores various programs and data required for the operation of the electronic device 500.
  • the processor 501, ROM 502 and RAM 503 are connected to each other through a bus 504.
  • An input/output (I/O) interface 505 is also connected to bus 504.
  • the following may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer and gyroscope; output devices 507 including, for example, a liquid crystal display (LCD), a speaker and a vibrator; storage devices 508 including, for example, a magnetic tape and a hard disk; and a communication device 509.
  • Communication device 509 may allow electronic device 500 to communicate wirelessly or wiredly with other devices to exchange data.
  • although FIG. 6 illustrates the electronic device 500 with various means, it should be understood that implementing or providing all of the illustrated means is not required; more or fewer means may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product including a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from a network via the communication device 509, or installed from the storage device 508, or installed from the ROM 502.
  • when the computer program is executed by the processor 501, the above-described functions defined in the methods of the embodiments of the present disclosure are performed.
  • the electronic device provided by this embodiment of the present disclosure belongs to the same concept as the method for generating special-effect maps provided by the above embodiments; technical details not described in detail in this embodiment can be found in the above embodiments, and this embodiment has the same effects as the above embodiments.
  • Embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored.
  • the program is executed by a processor, the method for generating a special effect diagram provided in the above embodiments is implemented.
  • the computer-readable storage medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination thereof.
  • examples of computer-readable storage media may include, but are not limited to: an electrical connection having at least one conductor, a portable computer disk, a hard drive, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program codes contained on computer-readable storage media can be transmitted using any appropriate medium, including but not limited to: wires, optical cables, radio frequency (Radio Frequency, RF), etc., or any suitable combination of the above.
  • the client and the server can communicate using any currently known or future-developed network protocol, such as HyperText Transfer Protocol (HTTP), and can be interconnected with digital data communication in any form or medium (for example, a communications network).
  • examples of communication networks include local area networks (LAN), wide area networks (WAN), internetworks (for example, the Internet) and peer-to-peer networks (for example, ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
  • the above-mentioned computer-readable storage medium may be included in the above-mentioned electronic device; it may also exist independently without being assembled into the electronic device.
  • the above computer-readable medium carries at least one program which, when executed by the electronic device, causes the electronic device to: acquire current light source information, a normal map of an original image and a target object mask map, wherein the light source information includes light source color, light source position and illumination intensity; generate a target illumination map according to the normal map, the target object mask map and the light source information; and fuse the target illumination map with the original image to obtain a target illumination special-effect map.
  • computer program code for performing the operations of the present disclosure may be written in at least one programming language or a combination thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as "C" or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowcharts or block diagrams may represent a module, program segment, or portion of code that contains at least one executable instruction for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functionality involved.
  • the units involved in the embodiments of the present disclosure can be implemented in software or hardware.
  • the name of the unit does not constitute a limitation on the unit itself.
  • the first acquisition unit can also be described as "the unit that acquires at least two Internet Protocol addresses.”
  • exemplary types of hardware logic components include: field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), application specific standard products (ASSP), systems on chip (SOC), complex programmable logic devices (CPLD), etc.
  • a machine-readable storage medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine-readable storage medium may be a machine-readable signal medium or a machine-readable storage medium.
  • machine-readable storage media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • examples of machine-readable storage media include an electrical connection based on at least one wire, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Arrangements Of Lighting Devices For Vehicle Interiors, Mounting And Supporting Thereof, Circuits Therefore (AREA)

Abstract

Embodiments of the present disclosure provide a method, apparatus, device and storage medium for generating special-effect images. Current light source information, a normal map of an original image and a target object mask map are acquired, where the light source information includes a light source color, a light source position and an illumination intensity; a target illumination map is generated according to the normal map, the target object mask map and the light source information; and the target illumination map is fused with the original image to obtain a target illumination special-effect map.

Description

Method, apparatus, device and storage medium for generating special-effect images
This application claims priority to Chinese patent application No. 202211035725.5, filed with the Chinese Patent Office on August 26, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the field of image processing technology, for example, to a method, apparatus, device and storage medium for generating special-effect images.
Background
At present, taking photos and making videos are among the most commonly used functions on mobile terminals. When taking photos or making videos, users often find that, in natural-light scenes, the lighting effect of what they shoot or produce cannot satisfy their personalized needs.
Summary
Embodiments of the present disclosure provide a method, apparatus, device and storage medium for generating special-effect images, which can generate special-effect images with lighting effects and enrich the display content of images.
In a first aspect, embodiments of the present disclosure provide a method for generating special-effect images, including:
acquiring current light source information, a normal map of an original image and a target object mask map, where the light source information includes a light source color, a light source position and an illumination intensity;
generating a target illumination map according to the normal map, the target object mask map and the light source information;
fusing the target illumination map with the original image to obtain a target illumination special-effect map.
In a second aspect, embodiments of the present disclosure further provide an apparatus for generating special-effect images, including:
an acquisition module, configured to acquire current light source information, a normal map of an original image and a target object mask map, where the light source information includes a light source color, a light source position and an illumination intensity;
a target illumination map generation module, configured to generate a target illumination map according to the normal map, the target object mask map and the light source information;
a target illumination special-effect map acquisition module, configured to fuse the target illumination map with the original image to obtain a target illumination special-effect map.
In a third aspect, embodiments of the present disclosure further provide an electronic device, including:
at least one processor;
a storage device, configured to store at least one program,
where the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method for generating special-effect images according to any embodiment of the present disclosure.
In a fourth aspect, embodiments of the present disclosure further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the method for generating special-effect images according to any embodiment of the present disclosure.
Brief Description of the Drawings
Figure 1 is a schematic flowchart of a method for generating special-effect images provided by an embodiment of the present disclosure;
Figure 2a is an example diagram of a target object mask map provided by an embodiment of the present disclosure;
Figure 2b is an example diagram of a second illumination intensity map provided by an embodiment of the present disclosure;
Figure 2c is an example diagram of an illumination intensity mask map provided by an embodiment of the present disclosure;
Figure 3a is an example diagram of a reverse target object mask map provided by an embodiment of the present disclosure;
Figure 3b is an example diagram of a first backlight intensity map provided by an embodiment of the present disclosure;
Figure 3c is an example diagram of a first blur mask map provided by an embodiment of the present disclosure;
Figure 3d is an example diagram of a second blur mask map provided by an embodiment of the present disclosure;
Figure 3e is an example diagram of a fusion mask map provided by an embodiment of the present disclosure;
Figure 3f is an example diagram of a second backlight intensity map provided by an embodiment of the present disclosure;
Figure 3g is an example diagram of a target backlight intensity map provided by an embodiment of the present disclosure;
Figure 4a is an example diagram of a grayscale image provided by an embodiment of the present disclosure;
Figure 4b is an example diagram of a local object mask map provided by an embodiment of the present disclosure;
Figure 4c is an example diagram of a smoothed grayscale image provided by an embodiment of the present disclosure;
Figure 4d is an example diagram of a smoothed local object map provided by an embodiment of the present disclosure;
Figure 5 is a schematic structural diagram of an apparatus for generating special-effect images provided by an embodiment of the present disclosure;
Figure 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
As used herein, the term "include" and its variants are open-ended, i.e., "including but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions of other terms will be given in the description below.
It should be noted that concepts such as "first" and "second" mentioned in the present disclosure are only used to distinguish different apparatuses, modules or units, and are not used to limit the order or interdependence of the functions performed by these apparatuses, modules or units.
It should be noted that the modifiers "a/an/one" and "multiple" mentioned in the present disclosure are illustrative rather than restrictive; those skilled in the art should understand that, unless the context clearly indicates otherwise, they should be understood as "at least one".
The names of messages or information exchanged between multiple apparatuses in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
It can be understood that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the user shall be informed, in an appropriate manner and in accordance with relevant laws and regulations, of the type, scope of use and usage scenarios of the personal information involved in the present disclosure, and the user's authorization shall be obtained.
For example, in response to receiving an active request from a user, a prompt message is sent to the user to clearly remind the user that the operation the user requests to perform will require acquiring and using the user's personal information. The user can thus autonomously choose, based on the prompt information, whether to provide personal information to the software or hardware, such as an electronic device, application, server or storage medium, that performs the operations of the technical solution of the present disclosure.
As an optional but non-limiting implementation, in response to receiving an active request from the user, the prompt information may be sent to the user in the form of, for example, a pop-up window, in which the prompt information may be presented as text. In addition, the pop-up window may also carry a selection control for the user to choose "agree" or "disagree" to provide personal information to the electronic device.
It can be understood that the above process of notifying the user and obtaining the user's authorization is only illustrative and does not limit the implementations of the present disclosure; other methods that satisfy relevant laws and regulations may also be applied to the implementations of the present disclosure.
It can be understood that the data involved in this technical solution (including but not limited to the data itself and the acquisition or use of the data) shall comply with the requirements of applicable laws, regulations and relevant provisions.
Fig. 1 is a schematic flowchart of a special effects image generation method provided by an embodiment of the present disclosure. This embodiment is applicable to the case of generating a lighting special effect image. The method may be performed by a special effects image generation apparatus, which may be implemented in the form of at least one of software and hardware, optionally by an electronic device such as a mobile terminal, a personal computer (PC) or a server.

As shown in Fig. 1, the method includes the following steps.

S110: Acquire current light source information, a normal map of an original image, and a target object mask map.

The light source information includes a light source color, a light source position and an illumination intensity. The light source may be a virtual light source, and the light source position may change based on a user's trigger operation. In this embodiment, the virtual light source may be generated as follows: generate a transformable (rotatable, translatable and scalable) virtual object and place it at the world coordinate origin; generate a transformable light source object and place it at an offset of distance d from the world coordinate origin; make the light source object a child of the virtual object; and capture touch-screen drag operations and map their coordinate transformations to coordinate transformations of the virtual object, so that dragging on the touch screen moves the light source object over a sphere. Here, d serves as the sphere radius, and the light source object is the virtual light source. The light source color and the illumination intensity may be preset, i.e., set by the user.
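Purely as an illustration of the drag-to-sphere behavior just described, the sketch below maps a touch drag to a rotation of the parent virtual object, which carries the child light source around a sphere of radius d. The drag-to-angle scale `DRAG_TO_RAD`, the distance `D` and the rotation order are assumptions of this sketch, not details given in the disclosure.

```python
import numpy as np

DRAG_TO_RAD = 0.01  # assumed mapping from dragged screen pixels to radians
D = 5.0             # assumed distance d of the light source from the origin

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def light_position(drag_dx, drag_dy):
    """Rotate the parent virtual object at the world origin; the child light,
    offset by D, then moves over a sphere of radius D."""
    yaw = drag_dx * DRAG_TO_RAD    # horizontal drag spins about the y axis
    pitch = drag_dy * DRAG_TO_RAD  # vertical drag tilts about the x axis
    return rot_y(yaw) @ rot_x(pitch) @ np.array([0.0, 0.0, D])

print(light_position(120, -40))  # light world position after one drag
```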
The normal map may be an image composed of the normal information of multiple pixels in the original image. The normal information is represented by a three-dimensional vector, i.e., a normal vector, and its three components are mapped to three color channel values to obtain the normal map. In this embodiment, any normal estimation algorithm may be used to determine the normal information of multiple pixels of the original image, and the normal map is generated based on the normal information.

The target object may be any object such as a person, an animal or a plant, which is not limited here. The pixel value of a pixel in the target object mask map represents the confidence that the pixel belongs to the target object. In this embodiment, the target object mask map of the original image may be obtained as follows: recognize the target object in the original image, obtain the confidence that each pixel belongs to the target object, and generate the target object mask map based on the confidences of all pixels. The pixel values of the target object mask map are values between 0 and 1: "0" means the pixel does not belong to the target object and appears black in the mask map, and "1" means the pixel belongs to the target object and appears white in the mask map. As an example, Fig. 2a shows a target object mask map of this embodiment; as shown in Fig. 2a, the target object is a portrait, the white area is the portrait, and the black area is the background.
S120: Generate a target lighting map according to the normal map, the target object mask map and the light source information.

The color value of a pixel in the target lighting map represents the lighting color of that pixel. In this embodiment, the target lighting map consists of three parts: a target object lighting map, a background region lighting map and a back-light map.

In this embodiment, the target lighting map may be generated according to the normal map, the target object mask map and the light source information as follows: generate a target object lighting map according to the normal map, the light source information and the target object mask map; generate a background region lighting map based on the light source information and the target object mask map; generate a back-light map according to the normal map and the target object mask map; and fuse the target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map.

The target object lighting map is generated from the lighting colors of multiple pixels of the target object. The background region lighting map is generated from the lighting colors of multiple pixels of the background region. The back-light map is generated from the back-light colors of multiple pixels of the original image.

As an example, the target object lighting map may be generated according to the normal map, the light source information and the target object mask map as follows: determine the illumination intensities of the pixels of the target object according to the normal map, the light source information and the target object mask map, fuse the illumination intensities with the light source color to obtain the lighting colors of the pixels of the target object, and generate the target object lighting map based on these lighting colors. The background region lighting map may be generated based on the light source information and the target object mask map as follows: determine the illumination intensities of the pixels of the background region according to the light source information and the target object mask map, fuse the illumination intensities with the light source color to obtain the lighting colors of the pixels of the background region, and generate the background region lighting map based on these lighting colors. The back-light map may be generated according to the normal map and the target object mask map as follows: determine the back-light intensities of the pixels of the original image according to the normal map and the target object mask map, fuse the back-light intensities with the light source color to obtain the back-light colors of the pixels, and generate the back-light map based on the back-light colors. In this embodiment, the target lighting map is generated from the target object lighting map, the background region lighting map and the back-light map; different lighting maps are generated based on image regions and lighting directions, so that the final result presents an effect of interleaved light and shade.
Optionally, the target object lighting map may be generated according to the normal map, the light source information and the target object mask map as follows: determine a first illumination intensity map according to the normal map and the light source information; fuse the first illumination intensity map with the light source color to obtain an initial lighting map; and fuse the initial lighting map with the target object mask map to obtain the target object lighting map.

The pixel value of a pixel in the first illumination intensity map represents the illumination intensity of that pixel. As an example, the first illumination intensity map may be determined according to the normal map and the light source information as follows: determine the intensity attenuation information of each pixel according to the normal information in the normal map and the light source information, adjust the illumination intensity based on the intensity attenuation information to obtain the illumination intensity of the pixel, and generate the first illumination intensity map based on the illumination intensities of all pixels. Fusing the first illumination intensity map with the light source color may be done by multiplying the illumination intensity of each pixel in the first illumination intensity map by the light source color to obtain the lighting color of the pixel. Fusing the initial lighting map with the target object mask map may be done by multiplying the lighting color of the initial lighting map by the pixel value of the corresponding pixel in the target object mask map to obtain the target object lighting map. In this embodiment, fusing the initial lighting map with the target object mask map allows the lighting colors of the pixels of the target object to be determined accurately.

Optionally, the first illumination intensity map may be determined according to the normal map and the light source information as follows: determine first angle information between the incident light and each pixel in the original image according to the normal map and the light source information; determine intensity attenuation information according to the distance between the light source and the pixel in the original image; adjust the illumination intensity according to the first angle information and the intensity attenuation information to obtain the target intensity of the pixel; and generate the first illumination intensity map based on the target intensities of the pixels.

As an example, the first angle information between the incident light and a pixel may be determined according to the normal map and the light source information as follows: determine the lighting direction vector, extract the normal vector of the pixel from the normal map, normalize the lighting direction vector and the normal vector of each pixel respectively, and take the dot product of the normalized lighting direction vector and normal vector to obtain the first angle information. The lighting direction vector may be a vector pointing from the light source position (expressed in world coordinates) to the center position of the target object (with screen coordinates converted to world coordinates), or a vector pointing from the light source position to the center of the target object's face.

As an example, the attenuation information may be determined according to the distance between the light source and a pixel in the original image as follows: convert the light source position from world coordinates to screen coordinates, compute the distance between the light source position in the screen coordinate system and the pixel in the original image, subtract the ratio of the distance to the halo radius from 1 to obtain an intermediate result value, and raise the intermediate result value to a set attenuation exponent to obtain the attenuation information. Converting the light source position from world coordinates to screen coordinates may be done by left-multiplying the light source world coordinates by a Model View Projection (MVP) transformation matrix to obtain the light source projection coordinates, and then linearly transforming the x and y components of the light source projection coordinates to obtain the light source screen coordinates.
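As a minimal sketch of the world-to-screen conversion just described (viewport conventions such as the y flip are assumptions here, since the disclosure only states that the x and y components are linearly transformed):

```python
import numpy as np

def world_to_screen(p_world, mvp, width, height):
    """Left-multiply the world position by the MVP matrix, then linearly map
    the x and y components of the projected coordinates to pixels."""
    p = mvp @ np.append(p_world, 1.0)           # homogeneous projection coords
    ndc = p[:2] / p[3]                          # perspective divide -> [-1, 1]
    x = (ndc[0] * 0.5 + 0.5) * width            # linear transform of x
    y = (1.0 - (ndc[1] * 0.5 + 0.5)) * height   # linear transform of y (flipped)
    return np.array([x, y])
```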
As an example, the illumination intensity may be adjusted according to the first angle information and the intensity attenuation information by multiplying the illumination intensity by the first angle information and then by the intensity attenuation information to obtain the target intensity of the pixel. The first illumination intensity map may be generated by taking the target intensity of each pixel as its pixel value. In this embodiment, adjusting the illumination intensity according to the first angle information and the intensity attenuation information improves the accuracy of determining the target intensity of each pixel.
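Tying the preceding paragraphs together, a non-authoritative numpy sketch of the first illumination intensity map might look as follows; the normal map content, the screen-space light position, the halo radius, the attenuation exponent and the clamp on the dot product are all assumed stand-ins rather than values given by the disclosure.

```python
import numpy as np

H, W = 256, 256
rng = np.random.default_rng(0)
normals = rng.normal(size=(H, W, 3))        # stand-in for the decoded normal map
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)

light_dir = np.array([0.3, -0.5, 0.8])      # light position -> object center
light_dir /= np.linalg.norm(light_dir)

angle = normals @ light_dir                 # first angle information, per pixel

ys, xs = np.mgrid[0:H, 0:W]                 # pixel coordinates
light_xy = (96.0, 64.0)                     # assumed light position in screen space
dist = np.hypot(xs - light_xy[0], ys - light_xy[1])
HALO_RADIUS, ATTEN_EXP = 200.0, 2.0         # assumed halo radius and exponent
atten = np.clip(1.0 - dist / HALO_RADIUS, 0.0, 1.0) ** ATTEN_EXP

LIGHT_INTENSITY = 1.5                       # user-set illumination intensity
# Target intensity: intensity x first angle info x attenuation (the dot product
# is clamped here for stability; the disclosure does not state a clamp).
first_intensity_map = LIGHT_INTENSITY * np.clip(angle, 0.0, 1.0) * atten
```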
In this embodiment, the background region lighting map may be generated based on the light source information and the target object mask map as follows: obtain the distance between the light source and each pixel in the original image; generate a second illumination intensity map according to the distance and the illumination intensity; fuse the second illumination intensity map with the target object mask map to obtain an illumination intensity mask map; and fuse a set color with the light source color based on the illumination intensity mask map to obtain the background region lighting map.

As an example, the distance between the light source and a pixel in the original image may be obtained by converting the light source position from world coordinates to screen coordinates and computing the distance between the light source position in the screen coordinate system and the pixel in the original image. The second illumination intensity map may be generated according to the distance and the illumination intensity as follows: raise the distance to a set exponent, subtract the result from 1 to obtain an intermediate result, multiply the intermediate result by the illumination intensity of the light source to obtain the target intensity of the pixel, and then generate the second illumination intensity map from the target intensities of the pixels. As an example, Fig. 2b shows a second illumination intensity map of this embodiment.

As an example, the second illumination intensity map may be fused with the target object mask map as follows: subtract the pixel value of the corresponding pixel in the target object mask map from the pixel value of the pixel in the second illumination intensity map; if the resulting pixel value is less than 0, set it to 0, and if it is greater than 1, set it to 1, to obtain the illumination intensity mask map. As an example, Fig. 2c shows an illumination intensity mask map of this embodiment; as shown in Fig. 2c, the illumination intensity mask map can be understood as the illumination intensity map with the target object cut out of the second illumination intensity map.

The set color may be black, whose color value is (0, 0, 0). As an example, the set color and the lighting color may be fused based on the illumination intensity mask map as follows: take the pixel value of the illumination intensity mask map as the weighting coefficient of the lighting color, take 1 minus the pixel value of the illumination intensity mask map as the weighting coefficient of the set color, compute the weighted sum of the set color and the lighting color based on the weighting coefficients, and multiply the weighted color value by a set value to obtain the background region lighting color of the pixel, thereby obtaining the background region lighting map. The set value may be set to 0.5. In this embodiment, the determined background region lighting map differs in lighting color from the target object lighting map, giving the image an effect of alternating light and shade.
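A sketch of this background-region pipeline, assuming black as the set color and 0.5 as the set value; the exponent applied to the distance and the normalization of the distance are assumptions, since the disclosure does not fix them.

```python
import numpy as np

def background_region_lighting(dist, mask, light_color,
                               intensity=1.5, exp_k=0.5,
                               set_color=(0.0, 0.0, 0.0), set_value=0.5):
    second_intensity = (1.0 - dist ** exp_k) * intensity         # second intensity map
    intensity_mask = np.clip(second_intensity - mask, 0.0, 1.0)  # subject cut out
    w = intensity_mask[..., None]                                # per-pixel blend weight
    blended = w * np.asarray(light_color) + (1.0 - w) * np.asarray(set_color)
    return blended * set_value                                   # scaled by the set value

H, W = 256, 256
ys, xs = np.mgrid[0:H, 0:W]
dist = np.hypot(xs - 128, ys - 128) / np.hypot(128, 128)  # normalized distance (assumed)
mask = np.zeros((H, W)); mask[64:192, 96:160] = 1.0       # toy target object mask
bg_map = background_region_lighting(dist, mask, (1.0, 0.9, 0.7))
```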
Optionally, the back-light map may be generated according to the normal map and the target object mask map as follows: determine a first back-light intensity map according to the normal map, the target object mask map and viewing angle information; invert the target object mask map to obtain an inverted target object mask map; generate a second back-light intensity map according to the target object mask map and the inverted target object mask map; fuse the first back-light intensity map with the second back-light intensity map to obtain a target back-light intensity map; and fuse the target back-light intensity map with the light source color to obtain the back-light map.

The viewing angle information may be the viewing angle of the virtual camera corresponding to the current image, and may be represented by a vector in the viewing direction. The target object mask map may be inverted by subtracting the pixel value of each pixel in the target object mask map from 1 to obtain the inverted target object mask map. As an example, Fig. 3a shows an inverted target object mask map of this embodiment; as shown in Fig. 3a, compared with the target object mask map, the portrait area becomes black and the background area becomes white.

As an example, the first back-light intensity map may be determined according to the normal map, the target object mask map and the viewing angle information as follows: determine second angle information between the normal information of each pixel in the normal map and the viewing angle information; determine the initial back-light intensity of the pixel according to the second angle information; generate an initial back-light intensity map based on the initial back-light intensities; and fuse the initial back-light intensity map with the target object mask map to obtain the first back-light intensity map.

The second angle information between the normal information of each pixel in the normal map and the viewing angle information may be determined as follows: normalize the normal vector and the viewing direction vector respectively, take the dot product of the normalized normal vector and viewing direction vector, and clamp the dot product result to between 0 and 1 to obtain the second angle information. The initial back-light intensity of the pixel may be determined according to the second angle information as follows: subtract the second angle information from 1, raise the subtraction result to a set control strength, and take the exponentiation result as the initial back-light intensity of the pixel. The set control strength may be a value set by the user. The initial back-light intensity map may be fused with the target object mask map by multiplying the pixel value of each pixel in the initial back-light intensity map by the pixel value of the corresponding pixel in the target object mask map. As an example, Fig. 3b shows a first back-light intensity map of this embodiment. In this embodiment, an intensity map with a rim-light effect can be generated.
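The first back-light intensity map described above amounts to a rim-light term; a minimal sketch follows, with the control strength value assumed rather than taken from the disclosure.

```python
import numpy as np

def first_backlight_intensity(normals, view_dir, mask, strength=3.0):
    """Rim-style term from the second angle information; `strength` is the
    user-set control strength (assumed value here)."""
    v = view_dir / np.linalg.norm(view_dir)
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    angle2 = np.clip(n @ v, 0.0, 1.0)   # second angle info, clamped to [0, 1]
    rim = (1.0 - angle2) ** strength    # brightest where the surface turns away
    return rim * mask                   # keep only target-object pixels
```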
Optionally, the second back-light intensity map may be generated according to the target object mask map and the inverted target object mask map as follows: blur the target object mask map and the inverted target object mask map respectively to obtain a first blurred mask map and a second blurred mask map; fuse the second blurred mask map with the inverted target object mask map to obtain a fused mask map; and fuse the fused mask map with the first blurred mask map to obtain the second back-light intensity map.

The first blurred mask map may be the blurred target object mask map, and the second blurred mask map may be the blurred inverted target object mask map. The blur may be a Gaussian blur. As an example, Fig. 3c shows a first blurred mask map and Fig. 3d shows a second blurred mask map. As an example, the second blurred mask map may be fused with the inverted target object mask map by taking the maximum of the pixel value of the second blurred mask map and the corresponding pixel value of the inverted target object mask map, and generating the fused mask map based on the maximum pixel values. As an example, Fig. 3e shows a fused mask map of this embodiment. As an example, the fused mask map may be fused with the first blurred mask map by taking the minimum of the pixel value of the fused mask map and the corresponding pixel value of the first blurred mask map, and generating the second back-light intensity map based on the minimum pixel values. As an example, Fig. 3f shows a second back-light intensity map of this embodiment. In this embodiment, the second back-light intensity map has the effect of outlining the target object.
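A sketch of these blur-and-fuse steps, using scipy's Gaussian filter as one possible blur and an assumed sigma:

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # one possible Gaussian blur

def second_backlight_intensity(mask, sigma=8.0):
    inv_mask = 1.0 - mask                         # inverted target object mask
    blur_mask = gaussian_filter(mask, sigma)      # first blurred mask map
    blur_inv = gaussian_filter(inv_mask, sigma)   # second blurred mask map
    fused = np.maximum(blur_inv, inv_mask)        # per-pixel maximum
    return np.minimum(fused, blur_mask)           # per-pixel minimum: an outline
```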
As an example, the first back-light intensity map may be fused with the second back-light intensity map by adding the pixel values of the first back-light intensity map and the second back-light intensity map to obtain the target back-light intensity map. As an example, Fig. 3g shows a target back-light intensity map of this embodiment. The target back-light intensity map may be fused with the light source color by multiplying the illumination intensity in the target back-light intensity map by the light source color to obtain the back-light map.

Optionally, the target object lighting map, the background region lighting map and the back-light map may be fused to obtain the target lighting map as follows: determine the relative position between the light source and the target object according to the first angle information; and fuse the target object lighting map, the background region lighting map and the back-light map based on the relative position to obtain the target lighting map.

The relative position includes the light source being in front of the target object, the light source being behind the target object, and the light source being to the side of the target object. The first angle information is the dot product of the normal vector of the target object and the lighting direction vector. The relative position between the light source and the target object may be determined according to the first angle information as follows: if the first angle information glare is within the range (t, 1], i.e., t < glare ≤ 1, the light source is in front of the target object; if the first angle information is within the range [-1, -t), i.e., -1 ≤ glare < -t, the light source is behind the target object; and if the first angle information is within the range [-t, t], i.e., -t ≤ glare ≤ t, the light source is to the side of the target object. Here, t may be a value between 0 and 1, for example 0.1 or 0.2.
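The threshold test reads directly as a small classifier (t = 0.1, one of the example values):

```python
def relative_position(glare, t=0.1):
    """glare: first angle information (dot of normal and lighting direction)."""
    if glare > t:
        return "front"  # t < glare <= 1: light in front of the target object
    if glare < -t:
        return "back"   # -1 <= glare < -t: light behind the target object
    return "side"       # -t <= glare <= t: light to the side of the target object
```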
As an example, the target object lighting map, the background region lighting map and the back-light map may be fused based on the relative position to obtain the target lighting map as follows: in response to the light source being in front of the target object, fuse the target object lighting map with the background region lighting map to obtain the target lighting map; in response to the light source being behind the target object, fuse the background region lighting map with the back-light map to obtain the target lighting map; and in response to the light source being to the side of the target object, interpolate between the target object lighting map and the back-light map to obtain an intermediate lighting map, and fuse the intermediate lighting map with the background region lighting map to obtain the target lighting map.

As an example, the target object lighting map may be fused with the background region lighting map by adding their pixel values, and the background region lighting map may be fused with the back-light map by adding their pixel values.

As an example, the target object lighting map and the back-light map may be fused by interpolation as follows: determine a mapping from [-t, t] to [0, 1] by interpolation, determine the target value corresponding to the first angle information according to this mapping, take 1 minus the target value as the weighting coefficient of the target object lighting map and the target value as the weighting coefficient of the back-light map, and compute the weighted combination of the target object lighting map and the back-light map according to the weighting coefficients to obtain the intermediate lighting map. The intermediate lighting map may be fused with the background region lighting map by adding their pixel values. In this embodiment, determining the target lighting map based on the relative position between the light source and the target object improves the accuracy and realism of the target lighting map.
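Putting the three cases together, a sketch of the position-dependent fusion. The direction of the [-t, t] to [0, 1] mapping is not fixed by the text, so it is chosen here so that the side-lit result matches the neighboring branches at glare = ±t.

```python
import numpy as np

def fuse_lighting(obj_map, bg_map, back_map, glare, t=0.1):
    if glare > t:                    # light in front: object + background
        return obj_map + bg_map
    if glare < -t:                   # light behind: background + back-light
        return bg_map + back_map
    # Side-lit: interpolate the object and back-light maps. This mapping sends
    # glare = t -> 0 and glare = -t -> 1 so the result is continuous with the
    # front- and back-lit branches (an assumption of this sketch).
    w = (t - glare) / (2.0 * t)
    mid = (1.0 - w) * obj_map + w * back_map   # intermediate lighting map
    return mid + bg_map
```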
S130: Fuse the target lighting map with the original image to obtain the target lighting special effect map.

As an example, the target lighting map may be fused with the original image by adding the color values of the target lighting map and the original image.

Optionally, the method further includes the following steps: acquire a grayscale image and a local object mask map of the original image; fuse the grayscale image with the local object mask map to obtain a local object map; adjust the light source color according to the normal map, the viewing angle information and the light source position to obtain a target lighting color; fuse a set color with the target lighting color based on the local object map to obtain a local object lighting special effect map; and fuse the local object lighting special effect map with the target object lighting map to obtain an updated target object lighting map.

The local object is an object formed by a local region of the target object. In this embodiment, assuming that the target object is a portrait, the local object is the hair.

The grayscale image of the original image may be obtained by performing grayscale processing on the original image. As an example, Fig. 4a shows a grayscale image of this embodiment.

The local object mask map of the original image may be obtained by recognizing the local object in the original image. As an example, Fig. 4b shows a local object mask map of this embodiment.

As an example, the grayscale image may be fused with the local object mask map to obtain the local object map as follows: smooth the grayscale image to obtain a smoothed grayscale image; and fuse the smoothed grayscale image with the local object mask map to obtain a smoothed local object map.
The grayscale image may be smoothed as follows: determine a first difference between the gray value and a first set value, determine a second difference between a second set value and the first set value, compute the ratio of the first difference to the second difference, clamp the ratio to between 0 and 1, and raise the ratio to the Nth power to obtain the processed gray value; the smoothed grayscale image is generated based on the processed gray values. The first set value may be 0, the second set value may be 0.5, and N may be 3. The formula for raising the ratio to the Nth power may be expressed as x*x*(a - b*x), where a and b are constants and x is the ratio value. As an example, Fig. 4c shows a smoothed grayscale image of this embodiment.
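As a sketch of this smoothing step (a = 3 and b = 2, the classic smoothstep coefficients, are assumed here; the disclosure only says a and b are constants):

```python
import numpy as np

def smooth_gray(gray, first=0.0, second=0.5, a=3.0, b=2.0):
    """Ratio of differences, clamped to [0, 1], then x*x*(a - b*x)."""
    x = np.clip((gray - first) / (second - first), 0.0, 1.0)
    return x * x * (a - b * x)   # cubic (N = 3) ease that brings out detail
```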
As an example, the smoothed grayscale image may be fused with the local object mask map by multiplying the pixel values of the smoothed grayscale image by the pixel values of the local object mask map. As an example, Fig. 4d shows a smoothed local object map of this embodiment. In this embodiment, smoothing the grayscale image brings out more detail of the local object, giving the local object a highlight effect. For the hair in this embodiment, smoothing the grayscale image yields finer hair strands and thus a more accurate lighting effect.

As an example, the light source color may be adjusted according to the normal map, the viewing angle information and the light source position to obtain the target lighting color as follows: determine the reflected lighting direction according to the normal map and the light source position; determine third angle information between the viewing angle and the reflected lighting direction; and adjust the light source color according to the third angle information to obtain the target lighting color.

The reflected lighting direction can be understood as the direction of the reflected light. The reflected lighting direction may be determined according to the normal map and the light source position as follows: determine the lighting direction vector from the light source position and the center position of the target object, and compute a linear combination of the lighting direction vector and the normal vector to obtain the reflected lighting direction vector. The linear computation may be: compute the dot product of the lighting direction vector and the normal vector, multiply the dot product result, a set value and the normal vector together, and subtract the resulting vector from the lighting direction vector. The set value may be 2.

The third angle information between the viewing angle and the reflected lighting direction may be determined by taking the dot product of the viewing direction vector and the reflected lighting direction vector as the third angle information.

The light source color may be adjusted according to the third angle information to obtain the target lighting color as follows: raise the third angle information to a set highlight value and multiply the exponentiation result by the light source color to obtain the target lighting color. The set highlight value is a value set by the user. In this embodiment, adjusting the light source color according to the third angle information gives the local object a highlight effect.
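The reflection and highlight steps of the last three paragraphs correspond to a Phong-style specular term; a minimal sketch follows, with the highlight exponent assumed and the clamp on the dot product added as a safeguard not stated in the text.

```python
import numpy as np

def specular_color(normals, light_dir, view_dir, light_color, highlight=32.0):
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    # Reflected lighting direction: L - 2 * dot(L, N) * N (set value 2).
    refl = l - 2.0 * (n @ l)[..., None] * n
    angle3 = np.clip(np.sum(v * refl, axis=-1), 0.0, 1.0)  # third angle info
    return angle3[..., None] ** highlight * light_color    # target lighting color
```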
Accordingly, the set color and the target lighting color may be fused based on the local object map to obtain the local object lighting special effect map by fusing the set color and the target lighting color based on the smoothed local object map.

The set color may be black, whose color value is (0, 0, 0). The set color and the target lighting color may be fused based on the smoothed local object map as follows: take the pixel value of the smoothed local object map as the weighting coefficient of the target lighting color, take 1 minus the pixel value of the smoothed local object map as the weighting coefficient of the set color, and compute the weighted sum of the set color and the target lighting color based on the weighting coefficients.

As an example, the local object lighting special effect map may be fused with the target object lighting map by adding the pixel values of the local object lighting special effect map to the corresponding pixel values of the target object lighting map.

Accordingly, fusing the target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map may be done by fusing the updated target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map.

In the technical solution of the embodiments of the present disclosure, current light source information, a normal map of an original image and a target object mask map are acquired, wherein the light source information includes a light source color, a light source position and an illumination intensity; a target lighting map is generated according to the normal map, the target object mask map and the light source information; and the target lighting map is fused with the original image to obtain a target lighting special effect map. The special effects image generation method provided by the embodiments of the present disclosure generates the target lighting map according to the normal map, the target object mask map and the light source information, and can thus generate special effects images with relighting effects and enrich the display content of images.
Fig. 5 is a schematic structural diagram of a special effects image generation apparatus provided by an embodiment of the present disclosure. As shown in Fig. 5, the apparatus includes:

an acquisition module 510 configured to acquire current light source information, a normal map of an original image, and a target object mask map, wherein the light source information includes a light source color, a light source position and an illumination intensity;

a target lighting map generation module 520 configured to generate a target lighting map according to the normal map, the target object mask map and the light source information; and

a target lighting special effect map acquisition module 530 configured to fuse the target lighting map with the original image to obtain a target lighting special effect map.

Optionally, the target lighting map generation module 520 is further configured to:

generate a target object lighting map according to the normal map, the light source information and the target object mask map;

generate a background region lighting map based on the light source information and the target object mask map;

generate a back-light map according to the normal map and the target object mask map; and

fuse the target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map.

Optionally, the target lighting map generation module 520 is further configured to:

determine a first illumination intensity map according to the normal map and the light source information;

fuse the first illumination intensity map with the light source color to obtain an initial lighting map; and

fuse the initial lighting map with the target object mask map to obtain the target object lighting map.

Optionally, the target lighting map generation module 520 is further configured to:

determine first angle information between incident light and pixels in the original image according to the normal map and the light source information;

determine attenuation information according to distances between the light source and the pixels in the original image;

adjust the illumination intensity according to the first angle information and the attenuation information to obtain target intensities of the pixels; and

generate the first illumination intensity map based on the target intensities of the pixels.

Optionally, the target lighting map generation module 520 is further configured to:

obtain distances between the light source and pixels in the original image;

generate a second illumination intensity map according to the distances and the illumination intensity;

fuse the second illumination intensity map with the target object mask map to obtain an illumination intensity mask map; and

fuse a set color with the light source color based on the illumination intensity mask map to obtain the background region lighting map.

Optionally, the target lighting map generation module 520 is further configured to:

determine a first back-light intensity map according to the normal map, the target object mask map and viewing angle information;

invert the target object mask map to obtain an inverted target object mask map;

generate a second back-light intensity map according to the target object mask map and the inverted target object mask map;

fuse the first back-light intensity map with the second back-light intensity map to obtain a target back-light intensity map; and

fuse the target back-light intensity map with the light source color to obtain the back-light map.

Optionally, the target lighting map generation module 520 is further configured to:

determine second angle information between the normal information of multiple pixels in the normal map and the viewing angle information;

determine initial back-light intensities of the multiple pixels according to the second angle information;

generate an initial back-light intensity map based on the initial back-light intensities; and

fuse the initial back-light intensity map with the target object mask map to obtain the first back-light intensity map.

Optionally, the target lighting map generation module 520 is further configured to:

blur the target object mask map and the inverted target object mask map respectively to obtain a first blurred mask map and a second blurred mask map;

fuse the second blurred mask map with the inverted target object mask map to obtain a fused mask map; and

fuse the fused mask map with the first blurred mask map to obtain the second back-light intensity map.

Optionally, the apparatus further includes a target object lighting map update module configured to:

acquire a grayscale image and a local object mask map of the original image, wherein the local object is an object formed by a local region of the target object;

fuse the grayscale image with the local object mask map to obtain a local object map;

adjust the light source color according to the normal map, viewing angle information and the light source position to obtain a target lighting color;

fuse a set color with the target lighting color based on the local object map to obtain a local object lighting special effect map; and

fuse the local object lighting special effect map with the target object lighting map to obtain an updated target object lighting map.

Optionally, the target object lighting map update module is further configured to:

smooth the grayscale image to obtain a smoothed grayscale image;

fuse the smoothed grayscale image with the local object mask map to obtain a smoothed local object map; and

fuse the set color with the target lighting color based on the smoothed local object map to obtain the local object lighting special effect map.

Optionally, the target object lighting map update module is further configured to:

determine a reflected lighting direction according to the normal map and the light source position;

determine third angle information between the viewing angle information and the reflected lighting direction; and

adjust the light source color according to the third angle information to obtain the target lighting color.

Optionally, the target lighting map generation module 520 is further configured to:

determine a relative position between the light source and the target object according to the first angle information, wherein the relative position includes the light source being in front of the target object, the light source being behind the target object, and the light source being to the side of the target object; and

fuse the target object lighting map, the background region lighting map and the back-light map based on the relative position to obtain the target lighting map.

Optionally, the target lighting map generation module 520 is further configured to:

in response to the light source being in front of the target object, fuse the target object lighting map with the background region lighting map to obtain the target lighting map;

in response to the light source being behind the target object, fuse the background region lighting map with the back-light map to obtain the target lighting map; and

in response to the light source being to the side of the target object, interpolate between the target object lighting map and the back-light map to obtain an intermediate lighting map, and fuse the intermediate lighting map with the background region lighting map to obtain the target lighting map.

The special effects image generation apparatus provided by the embodiments of the present disclosure can perform the special effects image generation method provided by any embodiment of the present disclosure, and has the functional modules and effects corresponding to the method.

It is worth noting that the units and modules included in the above apparatus are divided only according to functional logic; the division is not limited thereto as long as the corresponding functions can be implemented. In addition, the names of the functional units are only for ease of mutual distinction and are not intended to limit the protection scope of the embodiments of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure. Referring to Fig. 6, it shows a schematic structural diagram of an electronic device (e.g., the terminal device or server in Fig. 6) 500 suitable for implementing embodiments of the present disclosure. The electronic device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable multimedia players (PMP) and vehicle-mounted terminals (e.g., vehicle-mounted navigation terminals), and fixed terminals such as digital TVs and desktop computers. The electronic device shown in Fig. 6 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.

As shown in Fig. 6, the electronic device 500 may include a processor (e.g., a central processing unit, a graphics processing unit, etc.) 501, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage apparatus 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processor 501, the ROM 502 and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.

Generally, the following apparatuses may be connected to the I/O interface 505: an input apparatus 506 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer and a gyroscope; an output apparatus 507 including, for example, a liquid crystal display (LCD), a speaker and a vibrator; a storage apparatus 508 including, for example, a magnetic tape and a hard disk; and a communication apparatus 509. The communication apparatus 509 may allow the electronic device 500 to communicate wirelessly or by wire with other devices to exchange data. Although Fig. 6 shows the electronic device 500 with various apparatuses, it should be understood that it is not required to implement or have all of the apparatuses shown; more or fewer apparatuses may alternatively be implemented or provided.

In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product that includes a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such embodiments, the computer program may be downloaded and installed from a network via the communication apparatus 509, or installed from the storage apparatus 508, or installed from the ROM 502. When the computer program is executed by the processor 501, the above functions defined in the method of the embodiments of the present disclosure are performed.

The electronic device provided by the embodiments of the present disclosure belongs to the same concept as the special effects image generation method provided by the above embodiments; technical details not described in detail in this embodiment can be found in the above embodiments, and this embodiment has the same effects as the above embodiments.

Embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the special effects image generation method provided by the above embodiments is implemented.

It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. Examples of computer-readable storage media may include, but are not limited to: an electrical connection having at least one wire, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus or device. In the present disclosure, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium can send, propagate or transmit a program for use by or in combination with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wire, optical cable, radio frequency (RF), etc., or any suitable combination of the above.

In some implementations, the client and the server may communicate using any currently known or future-developed network protocol, such as the HyperText Transfer Protocol (HTTP), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), an internetwork (e.g., the Internet) and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.

The computer-readable medium may be included in the above electronic device, or it may exist alone without being assembled into the electronic device.

The computer-readable medium carries at least one program which, when executed by the electronic device, causes the electronic device to: acquire current light source information, a normal map of an original image, and a target object mask map, wherein the light source information includes a light source color, a light source position and an illumination intensity; generate a target lighting map according to the normal map, the target object mask map and the light source information; and fuse the target lighting map with the original image to obtain a target lighting special effect map.

Computer program code for performing the operations of the present disclosure may be written in at least one programming language or a combination thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In cases involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it may be connected to an external computer (e.g., through the Internet using an Internet service provider).

The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functions and operations of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagram may represent a module, program segment or portion of code, which contains at least one executable instruction for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may actually be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and flowcharts, and combinations of blocks in the block diagrams and flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.

The units described in the embodiments of the present disclosure may be implemented in software or hardware. The name of a unit does not constitute a limitation on the unit itself; for example, a first acquisition unit may also be described as "a unit for acquiring at least two Internet Protocol addresses".

The functions described herein above may be performed at least in part by at least one hardware logic component. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Parts (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.

In the context of the present disclosure, a machine-readable storage medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable storage medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared or semiconductor systems, apparatuses or devices, or any suitable combination of the above. Examples of machine-readable storage media include an electrical connection based on at least one wire, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.

Claims (16)

  1. A special effects image generation method, comprising:
    acquiring current light source information, a normal map of an original image, and a target object mask map, wherein the light source information comprises a light source color, a light source position and an illumination intensity;
    generating a target lighting map according to the normal map, the target object mask map and the light source information; and
    fusing the target lighting map with the original image to obtain a target lighting special effect map.
  2. The method according to claim 1, wherein generating the target lighting map according to the normal map, the target object mask map and the light source information comprises:
    generating a target object lighting map according to the normal map, the light source information and the target object mask map;
    generating a background region lighting map based on the light source information and the target object mask map;
    generating a back-light map according to the normal map and the target object mask map; and
    fusing the target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map.
  3. The method according to claim 2, wherein generating the target object lighting map according to the normal map, the light source information and the target object mask map comprises:
    determining a first illumination intensity map according to the normal map and the light source information;
    fusing the first illumination intensity map with the light source color to obtain an initial lighting map; and
    fusing the initial lighting map with the target object mask map to obtain the target object lighting map.
  4. The method according to claim 3, wherein determining the first illumination intensity map according to the normal map and the light source information comprises:
    determining first angle information between incident light and pixels in the original image according to the normal map and the light source information;
    determining intensity attenuation information according to distances between the light source and the pixels in the original image;
    adjusting the illumination intensity according to the first angle information and the intensity attenuation information to obtain target intensities of the pixels; and
    generating the first illumination intensity map based on the target intensities of the pixels.
  5. The method according to claim 2, wherein generating the background region lighting map based on the light source information and the target object mask map comprises:
    obtaining distances between the light source and pixels in the original image;
    generating a second illumination intensity map according to the distances and the illumination intensity;
    fusing the second illumination intensity map with the target object mask map to obtain an illumination intensity mask map; and
    fusing a set color with the light source color based on the illumination intensity mask map to obtain the background region lighting map.
  6. The method according to claim 2, wherein generating the back-light map according to the normal map and the target object mask map comprises:
    determining a first back-light intensity map according to the normal map, the target object mask map and viewing angle information;
    inverting the target object mask map to obtain an inverted target object mask map;
    generating a second back-light intensity map according to the target object mask map and the inverted target object mask map;
    fusing the first back-light intensity map with the second back-light intensity map to obtain a target back-light intensity map; and
    fusing the target back-light intensity map with the light source color to obtain the back-light map.
  7. The method according to claim 6, wherein determining the first back-light intensity map according to the normal map, the target object mask map and the viewing angle information comprises:
    determining second angle information between normal information of each pixel in the normal map and the viewing angle information;
    determining an initial back-light intensity of each pixel according to the second angle information;
    generating an initial back-light intensity map based on the initial back-light intensity of each pixel; and
    fusing the initial back-light intensity map with the target object mask map to obtain the first back-light intensity map.
  8. The method according to claim 6, wherein generating the second back-light intensity map according to the target object mask map and the inverted target object mask map comprises:
    blurring the target object mask map and the inverted target object mask map respectively to obtain a first blurred mask map and a second blurred mask map;
    fusing the second blurred mask map with the inverted target object mask map to obtain a fused mask map; and
    fusing the fused mask map with the first blurred mask map to obtain the second back-light intensity map.
  9. The method according to claim 2, further comprising:
    acquiring a grayscale image and a local object mask map of the original image, wherein a local object is an object formed by a local region of the target object;
    fusing the grayscale image with the local object mask map to obtain a local object map;
    adjusting the light source color according to the normal map, viewing angle information and the light source position to obtain a target lighting color;
    fusing a set color with the target lighting color based on the local object map to obtain a local object lighting special effect map; and
    fusing the local object lighting special effect map with the target object lighting map to obtain an updated target object lighting map.
  10. The method according to claim 9, wherein fusing the grayscale image with the local object mask map to obtain the local object map comprises:
    smoothing the grayscale image to obtain a smoothed grayscale image; and
    fusing the smoothed grayscale image with the local object mask map to obtain a smoothed local object map;
    and wherein fusing the set color with the target lighting color based on the local object map to obtain the local object lighting special effect map comprises:
    fusing the set color with the target lighting color based on the smoothed local object map to obtain the local object lighting special effect map.
  11. The method according to claim 9, wherein adjusting the light source color according to the normal map, the viewing angle information and the light source position to obtain the target lighting color comprises:
    determining a reflected lighting direction according to the normal map and the light source position;
    determining third angle information between the viewing angle information and the reflected lighting direction; and
    adjusting the light source color according to the third angle information to obtain the target lighting color.
  12. The method according to claim 4, wherein fusing the target object lighting map, the background region lighting map and the back-light map to obtain the target lighting map comprises:
    determining a relative position between the light source and the target object according to the first angle information, wherein the relative position comprises the light source being in front of the target object, the light source being behind the target object, and the light source being to the side of the target object; and
    fusing the target object lighting map, the background region lighting map and the back-light map based on the relative position to obtain the target lighting map.
  13. The method according to claim 12, wherein fusing the target object lighting map, the background region lighting map and the back-light map based on the relative position to obtain the target lighting map comprises:
    in response to the light source being in front of the target object, fusing the target object lighting map with the background region lighting map to obtain the target lighting map;
    in response to the light source being behind the target object, fusing the background region lighting map with the back-light map to obtain the target lighting map; and
    in response to the light source being to the side of the target object, interpolating between the target object lighting map and the back-light map to obtain an intermediate lighting map, and fusing the intermediate lighting map with the background region lighting map to obtain the target lighting map.
  14. A special effects image generation apparatus, comprising:
    an acquisition module (510) configured to acquire current light source information, a normal map of an original image, and a target object mask map, wherein the light source information comprises a light source color, a light source position and an illumination intensity;
    a target lighting map generation module (520) configured to generate a target lighting map according to the normal map, the target object mask map and the light source information; and
    a target lighting special effect map acquisition module (530) configured to fuse the target lighting map with the original image to obtain a target lighting special effect map.
  15. An electronic device, comprising:
    at least one processor; and
    a storage apparatus configured to store at least one program,
    wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the special effects image generation method according to any one of claims 1-13.
  16. A storage medium containing computer-executable instructions which, when executed by a computer processor, perform the special effects image generation method according to any one of claims 1-13.
PCT/CN2023/114831 2022-08-26 2023-08-25 Special effects image generation method, apparatus, device and storage medium WO2024041623A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211035725.5 2022-08-26
CN202211035725.5A CN115358959A (zh) 2022-08-26 2022-11-18 Special effects image generation method, apparatus, device and storage medium

Publications (1)

Publication Number Publication Date
WO2024041623A1 true WO2024041623A1 (zh) 2024-02-29

Family

ID=84004509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/114831 WO2024041623A1 (zh) 2022-08-26 2023-08-25 特效图的生成方法、装置、设备及存储介质

Country Status (2)

Country Link
CN (1) CN115358959A (zh)
WO (1) WO2024041623A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115358959A (zh) * 2022-08-26 2022-11-18 北京字跳网络技术有限公司 Special effects image generation method, apparatus, device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112884637A (zh) * 2021-01-29 2021-06-01 北京市商汤科技开发有限公司 Special effect generation method, apparatus, device and storage medium
CN113096231A (zh) * 2021-03-18 2021-07-09 北京达佳互联信息技术有限公司 Image processing method and apparatus, electronic device and storage medium
CN114331823A (zh) * 2021-12-29 2022-04-12 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device and storage medium
CN115358959A (zh) * 2022-08-26 2022-11-18 北京字跳网络技术有限公司 Special effects image generation method, apparatus, device and storage medium


Also Published As

Publication number Publication date
CN115358959A (zh) 2022-11-18

Similar Documents

Publication Publication Date Title
CN111242881A (zh) Method and apparatus for displaying special effects, storage medium and electronic device
WO2024041623A1 (zh) Special effects image generation method, apparatus, device and storage medium
WO2024104248A1 (zh) Virtual panorama rendering method, apparatus, device and storage medium
WO2023193642A1 (zh) Video processing method, apparatus, device and medium
CN113724391A (zh) Three-dimensional model construction method and apparatus, electronic device and computer-readable medium
CN116310036A (zh) Scene rendering method, apparatus, device, computer-readable storage medium and product
CN114842120A (zh) Image rendering processing method, apparatus, device and medium
WO2024016923A1 (zh) Special effects image generation method, apparatus, device and storage medium
WO2024067320A1 (zh) Virtual object rendering method, apparatus, device and storage medium
WO2024032752A1 (zh) Transition special effects image generation method, apparatus, device and storage medium
WO2024037556A1 (zh) Image processing method, apparatus, device and storage medium
CN111583102B (zh) Face image processing method and apparatus, electronic device and computer storage medium
WO2023231926A1 (zh) Image processing method, apparatus, device and storage medium
US20230360286A1 Image processing method and apparatus, electronic device and storage medium
WO2023193613A1 (zh) Highlight rendering method, apparatus, medium and electronic device
WO2023193639A1 (zh) Image rendering method and apparatus, readable medium and electronic device
WO2023169287A1 (zh) Beauty makeup special effect generation method, apparatus, device, storage medium and program product
US20230090457A1 Face image displaying method and apparatus, electronic device, and storage medium
CN111292406A (zh) Model rendering method and apparatus, electronic device and medium
CN115330925A (zh) Image rendering method and apparatus, electronic device and storage medium
CN114419298A (zh) Virtual object generation method, apparatus, device and storage medium
CN112233207A (zh) Image processing method, apparatus, device and computer-readable medium
CN112991147B (zh) Image processing method and apparatus, electronic device and computer-readable storage medium
CN112395826B (zh) Text special effect processing method and apparatus
WO2023025181A1 (zh) Image recognition method and apparatus, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856710

Country of ref document: EP

Kind code of ref document: A1