CN113935892A - Oil painting style scene rendering method, device, and storage medium

Oil painting style scene rendering method, device, and storage medium

Info

Publication number: CN113935892A
Application number: CN202111056761.5A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 国家玮
Applicant and current assignee: Perfect World Beijing Software Technology Development Co Ltd

Classifications

    • G06T 3/04 Geometric image transformations in the plane of the image; context-preserving transformations, e.g. by using an importance map
    • A63F 13/60 Video games; generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/13 Image analysis; segmentation; edge detection
    • G06T 7/40 Image analysis; analysis of texture


Abstract

An embodiment of the present application provides an oil painting style scene rendering method, device, and storage medium. In the method, for a three-dimensional scene to be processed, the positions of the stroke lines in each scene element model are determined; each scene element model is stroke-colored according to the positions of the stroke lines to obtain a three-dimensional scene containing the stroke lines; and in the three-dimensional scene containing the stroke lines, the scene color of the three-dimensional scene is blurred according to the texture coordinates of each pixel in the three-dimensional scene to obtain an oil-painting-stylized scene. The method traces the outline of each scene element model through edge detection and stroke coloring and produces scene colors with an oil painting texture through blurring, so the conversion to the oil painting style is completed without redesigning or remodeling, which greatly improves scene rendering efficiency and scene extensibility.

Description

Oil painting style scene rendering method, device, and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an oil painting style scene rendering method, device, and storage medium.
Background
Oil painting is a common form of painting. In oil painting works, the details of the painted subject are generally expressed through the texture, hiding power, and transparency of the various pigments, giving oil paintings rich colors and a strong three-dimensional texture.
Taking game development as an example, a game usually contains multiple scenes. In the related art, changing a game scene from a realistic style to an oil painting style requires redrawing and remodeling every scene in the game, which greatly reduces development efficiency and leaves existing game scenes poorly extensible. A new solution is therefore desired to overcome these technical problems.
Disclosure of Invention
Various aspects of the present application provide an oil painting style scene rendering method, device, and storage medium, which are used to realize an oil-painting-stylized scene, improve scene rendering efficiency, and enhance scene extensibility.
An embodiment of the present application provides an oil painting style scene rendering method, the method comprising:
for a three-dimensional scene to be processed, determining the positions of the stroke lines in each scene element model;
stroke-coloring each scene element model according to the positions of the stroke lines to obtain a three-dimensional scene containing the stroke lines;
and in the three-dimensional scene containing the stroke lines, blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene to obtain an oil-painting-stylized scene.
Further optionally, blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene comprises:
according to the texture coordinates of each pixel in the three-dimensional scene, dividing, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system;
and blurring the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil-painting-stylized scene.
Further optionally, dividing, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system according to the texture coordinates of each pixel comprises:
for each pixel in the three-dimensional scene, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the coordinate range corresponding to the current pixel.
Further optionally, the current pixel corresponds to a first coordinate range;
blurring the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil-painting-stylized scene comprises:
acquiring the scene colors of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates in the first coordinate range;
and computing, according to preset weight parameters, a weighted average of the scene colors of the current pixel and of the surrounding pixels in the first coordinate range, the weighted average being taken as the target scene color of the current pixel.
Further optionally, the current pixel corresponds to a first coordinate range;
blurring the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil-painting-stylized scene comprises:
acquiring the scene colors of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates in the first coordinate range;
acquiring the similarity between the scene color of the current pixel and the scene colors of the surrounding pixels;
and fusing the scene colors of the surrounding pixels whose similarity falls within a preset rank with the scene color of the current pixel, the fused scene color being taken as the target scene color of the current pixel.
Further optionally, determining, based on the coordinate range corresponding to each pixel, the color block in which each pixel is located from the scene map;
determining the arrangement order of the color blocks in the scene map;
and superposing the blurred color blocks according to the arrangement order to obtain the oil-painting-stylized scene.
Further optionally, determining the arrangement order of the color blocks in the scene map comprises:
acquiring the visual features of each color block from the scene map according to the pixel texture coordinates of each color block; the visual features comprise background texture features and/or illumination features, where the background texture features represent the relative positional relations of the color blocks in the scene map and the illumination features represent their light-dark relations;
and determining, based on the visual features, the color block range and the arrangement order corresponding to each color block.
Optionally, superposing the blurred color blocks according to the arrangement order to obtain the oil-painting-stylized scene comprises:
for each blurred color block, superposing the color blocks in the three-dimensional scene containing the stroke lines based on their color block ranges and arrangement order to obtain the oil-painting-stylized scene.
Further optionally, the stroke lines comprise contour lines and/or polylines, a contour line being a stroke line representing the outer contour of a model and a polyline being a stroke line representing the inner contour of a model;
determining the positions of the stroke lines in each scene element model comprises:
acquiring the edge pixels on the contour lines and/or polylines of each scene element model in the screen space coordinate system.
Further optionally, if the stroke lines include contour lines, acquiring the edge pixels on the contour lines and/or polylines of each scene element model in the screen space coordinate system comprises:
for a pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient;
acquiring the scene depths of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates;
and if the scene depth difference between the current pixel and a surrounding pixel is greater than a set depth difference threshold, taking the current pixel as an edge pixel on a contour line of the scene element model.
Further optionally, if the stroke lines include polylines, acquiring the edge pixels on the contour lines and/or polylines of each scene element model in the screen space coordinate system comprises:
for a pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient;
acquiring the normal values of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates;
and if the normal difference between the current pixel and a surrounding pixel is greater than a set normal difference threshold, taking the current pixel as an edge pixel on a polyline of the scene element model.
Further optionally, stroke-coloring each scene element model according to the positions of the stroke lines to obtain the three-dimensional scene containing the stroke lines comprises:
generating, according to the edge pixels on the stroke lines of each scene element model, a first mask map containing the edge pixels of each scene element model;
and performing linear interpolation on each edge pixel in the first mask map according to its scene color to obtain the three-dimensional scene containing the stroke lines.
Optionally, sampling a three-dimensional noise map according to the texture coordinates of each edge pixel to obtain the noise value corresponding to each edge pixel;
and offsetting the scene color of each edge pixel according to its corresponding noise value to obtain a three-dimensional scene containing discontinuous stroke lines.
Further optionally, filtering out target scene elements that do not require blurring by using a custom depth stencil item;
and generating, according to the filtering result, a second mask map for marking the target scene elements, the second mask map representing the positions of the target scene elements.
In this way, the step of blurring the scene colors of the three-dimensional scene is performed using the second mask map, so that the initial scene colors of the target scene elements are retained in the oil-painting-stylized scene.
Further optionally, filtering out occluded scene elements by using a custom depth stencil item;
and generating, according to the filtering result, a third mask map for marking the occluded scene elements, the third mask map representing the positions of the occluded scene elements.
In this way, the step of applying oil painting stylization to the scene colors of the three-dimensional scene is performed using the third mask map, which avoids abnormal handling of the occluded scene elements in the oil-painting-stylized scene.
An embodiment of the present application further provides an electronic device, comprising a memory and a processor; the memory is configured to store one or more computer instructions, and the processor is configured to execute the one or more computer instructions to perform the steps in the methods provided by the embodiments of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed, implements the steps in the methods provided by the embodiments of the present application.
In the technical solution provided by the embodiments of the present application, for a three-dimensional scene to be processed, the positions of the stroke lines in each scene element model are determined; each scene element model is then stroke-colored according to those positions, yielding a three-dimensional scene containing the stroke lines and thereby tracing the outline of each scene element model. Further, in the three-dimensional scene containing the stroke lines, the scene color of the three-dimensional scene is blurred according to the texture coordinates of each pixel, producing the oil-painting-stylized scene. In this way, the surface of each scene element model in the three-dimensional scene acquires an oil painting texture.
In the embodiments of the present application, the outline of each scene element model is traced through edge detection and stroke coloring, and scene colors with an oil painting texture are produced through blurring, realizing the oil-painting-stylized scene. The three-dimensional scene to be processed can thus be converted to the oil painting style without redesigning or remodeling, which greatly improves scene rendering efficiency and scene extensibility.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of a scene rendering method in an oil painting style according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an effect of a stylized scene of an oil painting according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram illustrating an effect of a stylized scene of an oil painting according to another exemplary embodiment of the present application;
FIG. 4 is a schematic diagram illustrating an effect of a stylized scene of an oil painting according to yet another exemplary embodiment of the present application;
FIG. 5 is a diagram illustrating effects of scene elements provided in an exemplary embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Oil painting is a common form of painting. In oil painting works, the details of the painted subject are generally expressed through the texture, hiding power, and transparency of the various pigments, giving oil paintings rich colors and a strong three-dimensional texture. In the present application, a three-dimensional scene expressed in the manner of an oil painting is called an oil-painting-stylized scene.
At present, three-dimensional scenes are widely applied, for example in games, movies, virtual transactions, and online travel. Taking game development as an example, a game usually contains multiple scenes. In the related art, changing a game scene from a realistic style to an oil painting style requires redrawing and remodeling every scene in the game, which greatly reduces development efficiency and leaves existing game scenes poorly extensible.
In view of the above technical problems, some embodiments of the present application provide a solution, which is described in detail below with reference to the accompanying drawings.
An embodiment of the present application provides an oil painting style scene rendering method, and fig. 1 is a schematic flowchart of the method according to an exemplary embodiment of the present application. As shown in fig. 1, the method comprises:
101. for a three-dimensional scene to be processed, determining the positions of the stroke lines in each scene element model;
102. stroke-coloring each scene element model according to the positions of the stroke lines to obtain a three-dimensional scene containing the stroke lines;
103. and in the three-dimensional scene containing the stroke lines, blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene to obtain an oil-painting-stylized scene.
The three-dimensional scene to be processed is a three-dimensional scene that requires scene rendering. For example, suppose the three-dimensional scene to be processed is a realistic game scene in some game; after post-processing, an oil-painting-style version of that game scene is obtained. In the two game scenes before and after post-processing, the scene elements, their layout, and their attributes are all consistent; only the scene style differs. Specifically, the scene style is embodied in the artistic features of the three-dimensional scene, such as hue, lines, and shading.
For a game scene, different scene styles can be understood as different artistic styles. For example, in a realistic-style game scene, the scene hue and the light-dark relations of the scene elements are close to the real world, and the scene elements are not outlined with distinct lines. In an oil-painting-style game scene, the outlines and light-dark relations of the scene elements are depicted through color blocks of oil painting material and brush strokes, embodying the detailed characteristics of the scene elements.
In the embodiments of the present application, each step of the oil painting style scene rendering method can be implemented in material code: the rendering logic written in the material code realizes the corresponding rendering principle of the oil-painting-stylized scene.
Each step in the oil painting style scene rendering method provided in fig. 1 is described below with reference to a specific embodiment.
First, at 101, for the three-dimensional scene to be processed, the positions of the stroke lines in each scene element model are determined.
The stroke lines comprise contour lines and/or polylines: a contour line is a stroke line representing the outer contour of a model, and a polyline is a stroke line representing the inner contour of a model. Optionally, the edges of each scene element model are determined by edge detection to provide a basis for the stroke coloring. In practice, the coordinates of the surrounding pixels are derived from the currently detected pixel in order to detect the pixels at the edges of the scene element model. How the contour lines and polylines are obtained is described below.
Specifically, determining the positions of the stroke lines in each scene element model at 101 may be implemented as: acquiring the edge pixels on the contour lines and/or polylines of each scene element model.
In an optional embodiment, if the stroke lines include contour lines, acquiring the edge pixels on the contour lines and/or polylines of each scene element model may be implemented as:
for a pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel to obtain the texture coordinates of the surrounding pixels; acquiring the scene depths of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates; and if the scene depth difference between the current pixel and a surrounding pixel is greater than a set depth difference threshold, taking the current pixel as an edge pixel on a contour line of the scene element model.
Here, the offset direction is the normal direction of the current pixel, and the offset distance is the product of the scene depth of the current pixel and an offset coefficient. Specifically, the normal direction of the current pixel is extracted from the scene map (Scene Texture) as the offset direction, the scene depth (Scene Depth) of the current pixel is extracted from the scene map, and the product of that depth and the offset coefficient is taken as the offset distance.
For a pixel to be detected in each scene element model, the texture coordinates of the current pixel are offset in the screen space coordinate system by the offset direction and offset distance described above to obtain the texture coordinates of the surrounding pixels. Specifically, the texture coordinates of the surrounding pixels are obtained by vector operations on the coordinates of the current pixel in the screen space coordinate system.
Then, the scene depths of the current pixel and of the surrounding pixels are acquired from the scene map according to their texture coordinates, and the scene depth differences between the current pixel and the surrounding pixels are computed. If the scene depth difference between the current pixel and any surrounding pixel is greater than the set depth difference threshold, the current pixel lies on a contour line of the scene element model; in this case, the current pixel is taken as an edge pixel on the contour line.
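For illustration only, the depth-difference test above can be sketched as follows in Python, treating the scene depth and normals as numpy arrays; the offset coefficient and depth threshold values are assumptions, not values from this disclosure, and a real implementation would run per pixel in the material rather than in Python loops.

```python
import numpy as np

def contour_edge_pixels(depth, normals, offset_coef=0.02, depth_threshold=0.1):
    """Flag edge pixels whose scene depth differs sharply from an offset neighbor.

    depth:   (h, w) scene depth per pixel.
    normals: (h, w, 2) screen-space normal direction per pixel (an assumed layout).
    """
    h, w = depth.shape
    edge = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            d = depth[y, x]
            # Offset along the pixel's normal direction; the offset distance is
            # the pixel's scene depth times the offset coefficient.
            ox = int(round(x + normals[y, x, 0] * d * offset_coef))
            oy = int(round(y + normals[y, x, 1] * d * offset_coef))
            if 0 <= ox < w and 0 <= oy < h:
                # A depth difference above the threshold marks a contour edge pixel.
                edge[y, x] = abs(depth[oy, ox] - d) > depth_threshold
    return edge
```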
In another optional embodiment, if the stroke lines include polylines, acquiring the edge pixels on the contour lines and/or polylines of each scene element model may be implemented as:
for a pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel to obtain the texture coordinates of the surrounding pixels; acquiring the normal information of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates; and if the normal difference between the current pixel and a surrounding pixel is greater than a set normal difference threshold, considering the current pixel to lie on the inner contour of the scene element model and taking it as an edge pixel on a polyline.
As before, the offset direction is the normal direction of the current pixel, and the offset distance is the product of the scene depth of the current pixel and an offset coefficient; they are obtained in the same way as described above, which is not repeated here.
For a pixel to be detected in each scene element model, the texture coordinates of the current pixel are offset in the screen space coordinate system by the offset direction and offset distance described above to obtain the texture coordinates of the surrounding pixels. The normal information of these pixels is then collected from the scene map through their texture coordinates, the normal values of the current pixel and of the surrounding pixels in the world coordinate system are determined from the collected normal information, and the normal differences between the current pixel and the surrounding pixels are computed. If the normal difference between the current pixel and any surrounding pixel is greater than the set normal difference threshold, the current pixel is considered to lie on the inner contour of the scene element model and is taken as an edge pixel on a polyline.
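The polyline test can be sketched the same way, reusing the offset scheme above but comparing normals instead of depths; measuring the normal difference as the Euclidean distance between normal vectors is an assumption made for illustration.

```python
import numpy as np

def polyline_edge_pixels(depth, normals, offset_coef=0.02, normal_threshold=0.3):
    """Flag edge pixels whose normal differs sharply from an offset neighbor.

    normals: (h, w, 3) world-space normal per pixel; using its xy components
    to drive the screen-space offset is an assumed simplification.
    """
    h, w = depth.shape
    edge = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            d = depth[y, x]
            n = normals[y, x]
            ox = int(round(x + n[0] * d * offset_coef))
            oy = int(round(y + n[1] * d * offset_coef))
            if 0 <= ox < w and 0 <= oy < h:
                # A normal difference above the threshold marks an inner-contour
                # (polyline) edge pixel.
                edge[y, x] = np.linalg.norm(normals[oy, ox] - n) > normal_threshold
    return edge
```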
In practice, the thresholds may be set and tuned for the specific scene, which is not limited in the embodiments of the present application. Besides the edge detection method described above, other edge detection methods may also be used to determine the positions of the stroke lines, which is likewise not limited by the present application.
Further, after the positions of the stroke lines are detected, at 102 each scene element model is stroke-colored according to those positions to obtain a three-dimensional scene containing the stroke lines.
Specifically, in an optional embodiment, stroke-coloring each scene element model based on the positions of the stroke lines at 102 may be implemented as:
generating, according to the edge pixels on the stroke lines of each scene element model, a first mask map containing those edge pixels; and performing linear interpolation (Linear Lerp) on the edge pixels in the first mask map according to their scene colors to obtain the three-dimensional scene containing the stroke lines.
In this step, the first mask map is generated from the edge pixels of each scene element model on the stroke lines and contains the edge pixels of each scene element model. Then, for each edge pixel in the first mask map, its scene color is linearly interpolated with a preset stroke line color to obtain the three-dimensional scene containing the stroke lines.
The stroke lines are thus rendered into the three-dimensional scene through these steps, so that oil-painting-style stroke lines are generated automatically, which helps improve three-dimensional scene rendering efficiency.
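As a minimal sketch of this coloring step, the interpolation is the usual lerp(a, b, t) = a + (b - a) * t applied per edge pixel; the stroke color and interpolation factor below are illustrative assumptions.

```python
import numpy as np

def apply_stroke_lines(scene_color, first_mask, stroke_color=(0.05, 0.04, 0.03), t=0.85):
    """Lerp the scene color of each masked edge pixel toward a preset stroke line color.

    scene_color: (h, w, 3) float RGB; first_mask: (h, w) bool map of edge pixels.
    """
    out = scene_color.copy()
    stroke = np.asarray(stroke_color)
    # Only pixels flagged in the first mask map are re-colored; the rest of the
    # scene keeps its original color.
    out[first_mask] += (stroke - out[first_mask]) * t
    return out
```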
In fact, oil painting brush strokes in the real world are not continuous, whereas the steps above yield continuous stroke lines, which makes the stroke effect look unnatural. To give the oil painting strokes a truer effect, further optionally, a three-dimensional noise map is sampled according to the texture coordinates of each edge pixel to obtain the noise value corresponding to each edge pixel; the scene color of each edge pixel is then offset according to its corresponding noise value to obtain a three-dimensional scene containing discontinuous stroke lines.
In this step, for the three-dimensional scene containing the stroke lines, an inverse operation is first performed on the scene depth (Scene Depth) of the current pixel to recover the world coordinates of the current pixel in the world coordinate system. The world coordinates of the current pixel are then remapped to obtain its texture coordinates (denoted UVW) for the three-dimensional noise map. Further, a three-dimensional Perlin noise map is sampled with these texture coordinates, yielding a continuous noise value. The continuous noise values are used to offset the stroke lines, simulating the effect of short strokes and producing a three-dimensional scene containing discontinuous stroke lines. Perlin noise is a continuous noise often used, for example, to simulate clouds.
In short, a continuous noise value, whose range in practice is usually between 0 and 1, is obtained by mapping the texture coordinates of the current pixel into the three-dimensional Perlin noise. The noise value is then multiplied by the color value of the stroke line, yielding stroke lines containing invalid values (for example, where the noise value is 0), such as the interrupted, discontinuous wall seams in the three-dimensional scene shown in fig. 2.
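A sketch of breaking the strokes with noise follows; `sample_noise3d` is a hypothetical stand-in for sampling a 3D Perlin noise texture at the UVW coordinates remapped from world position, and the scale factor is an assumption.

```python
import numpy as np

def break_stroke_lines(stroke_rgb, edge_mask, world_pos, sample_noise3d, scale=0.1):
    """Modulate stroke color by 3D noise so near-zero samples interrupt the line.

    world_pos:      (h, w, 3) world coordinates reconstructed from scene depth.
    sample_noise3d: hypothetical sampler mapping a 3D point to a value in [0, 1].
    """
    out = stroke_rgb.copy()
    ys, xs = np.nonzero(edge_mask)
    for y, x in zip(ys, xs):
        uvw = world_pos[y, x] * scale    # remap world position to noise UVW
        noise = sample_noise3d(uvw)      # continuous Perlin-style noise value
        out[y, x] = out[y, x] * noise    # a value near 0 erases this stroke pixel
    return out
```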
Furthermore, after the outlines of the scene element models have been traced by the stroke coloring described above, the three-dimensional scene needs to be blurred to give it oil-painting-style color blocks and brush strokes, simulating the texture of the paint in an oil painting work. Specifically, at 103, in the three-dimensional scene containing the stroke lines, the scene color of the three-dimensional scene is blurred according to the texture coordinates of each pixel in the three-dimensional scene, obtaining the oil-painting-stylized scene.
In an optional embodiment, blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel may be implemented as:
according to the texture coordinates of each pixel in the three-dimensional scene, dividing, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system; and blurring the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil-painting-stylized scene.
In these steps, coordinate ranges are divided with each pixel of the three-dimensional scene as a center, and the scene color is blurred based on the coordinate range corresponding to each pixel, forming a scene color block for each pixel that simulates the color blocks and brush strokes of an oil painting work. Notably, the scene color blocks corresponding to different pixels may overlap one another, which further simulates the effect of overlapping brush strokes in an oil painting.
Specifically, dividing, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system according to the texture coordinates of each pixel at 103 may be implemented as:
for each pixel in the three-dimensional scene, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the coordinate range corresponding to the current pixel.
Similarly to the above, the coordinate range corresponding to the current pixel is again obtained by offsetting. In practice, the size of the coordinate range may be preset. Optionally, in the screen space coordinate system, all coordinates within a set range around the current pixel are assigned to the coordinate range corresponding to that pixel; the set range is, for example, 9 × 9 unit pixels.
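Concretely, the coordinate range amounts to a window of screen-space coordinates around the pixel; a minimal sketch, assuming the 9 × 9 unit-pixel range mentioned above:

```python
def coordinate_range(x, y, width, height, radius=4):
    """Return the screen-space coordinates in a (2*radius+1)^2 window around (x, y),
    clamped to the screen; radius=4 gives a 9 x 9 unit-pixel range."""
    return [(cx, cy)
            for cy in range(max(0, y - radius), min(height, y + radius + 1))
            for cx in range(max(0, x - radius), min(width, x + radius + 1))]
```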
Furthermore, after the coordinate range corresponding to each pixel in the three-dimensional scene is obtained, the scene color of each pixel is blurred so as to simulate the oil-painting-style brush stroke effect.
Specific implementations of blurring the scene color of each pixel to obtain its target scene color are described below with reference to specific embodiments.
in an optional embodiment, in 103, the blurring processing is performed on the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil painting stylized scene, which may be implemented as:
acquiring scene colors of the current pixel and the surrounding pixels from the scene map according to texture coordinates of the current pixel and the surrounding pixels in the first coordinate range; and according to preset weight parameters, carrying out weighted average calculation on the scene colors of the current pixel and the surrounding pixels in the first coordinate range to obtain a weighted average value of the scene colors, and taking the weighted average value as the target scene color of the current pixel.
Specifically, assume the current pixel is pixel i and its corresponding coordinate range is the first coordinate range. Denote the surrounding pixels of pixel i as {i1, i2, …, in}, and the weight parameters corresponding to pixel i and its surrounding pixels as {xi, x1, x2, …, xn}, where n is an integer less than or equal to 9.
Based on these assumptions, the texture coordinates of pixel i and of its surrounding pixels are obtained from the first coordinate range. The scene map is then sampled with the texture coordinates of pixel i and of the surrounding pixels {i1, i2, …, in} to obtain the corresponding scene colors {yi, y1, y2, …, yn}.
Then, based on the preset weight parameters {xi, x1, x2, …, xn}, a weighted average of the scene colors {yi, y1, y2, …, yn} of pixel i and of the surrounding pixels in the first coordinate range is computed, and the weighted average Y is taken as the target scene color of pixel i.
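In the notation above, Y = (xi·yi + x1·y1 + … + xn·yn) / (xi + x1 + … + xn); a minimal sketch, where the normalization by the weight sum is an added assumption:

```python
import numpy as np

def weighted_target_color(scene_color, coords, weights):
    """Weighted average of the scene colors {yi, y1, ..., yn} sampled at the pixel
    and its surrounding coordinates, using the preset weights {xi, x1, ..., xn}."""
    colors = np.array([scene_color[cy, cx] for (cx, cy) in coords])
    wts = np.asarray(weights, dtype=float)
    return (wts[:, None] * colors).sum(axis=0) / wts.sum()
```

With uniform weights this reduces to a plain box blur over the coordinate range; non-uniform weights bias the result toward the center pixel, which is closer to a brush-stroke look.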
In practice, the blurring is optionally performed with the X Radius and Y Radius parameters in a game editing tool (e.g., UE4), where X Radius is the horizontal-axis length of the coordinate range corresponding to each pixel and Y Radius is its vertical-axis length.
Understandably, the larger these axis lengths, the more pixels the blur covers and the better the blur effect, but the higher the required computational performance. For example, fig. 3 shows a three-dimensional scene with the radii set to 10 × 5, and fig. 4 a three-dimensional scene with the radii set to 3 × 3. The overall brightness and appearance in fig. 3 are more natural, and its blur effect is better than that of fig. 4. Further optionally, the radii may also be set to 8 × 3.
In another optional embodiment, blurring the scene color of each pixel according to its corresponding coordinate range at 103 to obtain the target scene color of each pixel in the oil-painting-stylized scene may be implemented as:
acquiring the scene colors of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates in the first coordinate range; acquiring the similarity between the scene color of the current pixel and the scene colors of the surrounding pixels; and fusing the scene colors of the surrounding pixels whose similarity falls within a preset rank with the scene color of the current pixel, the fused scene color being taken as the target scene color of the current pixel.
Specifically, assume the current pixel is pixel i and its corresponding coordinate range is the first coordinate range. Denote the surrounding pixels of pixel i as {i1, i2, …, in}, where n is an integer less than or equal to 9, and assume the preset rank is the top 4 (the 1st being the surrounding pixel with the highest similarity).
Based on these assumptions, the texture coordinates of pixel i and of its surrounding pixels are obtained from the first coordinate range. The scene map is then sampled with the texture coordinates of pixel i and of the surrounding pixels {i1, i2, …, in} to obtain the corresponding scene colors {yi, y1, y2, …, yn}.
Next, the similarity of scene color between pixel i and each surrounding pixel {i1, i2, …, in} is computed, and the surrounding pixels are sorted by similarity from high to low. Based on this, the scene colors of the top-4 surrounding pixels {i1, i2, i3, i4} are averaged together with the scene color of pixel i, and the resulting average scene color value is taken as the target scene color of pixel i.
In addition, in another embodiment, the scene color of the surrounding pixel at a preset rank can be used directly as the target scene color of pixel i, thereby forming the oil-painting-style color blocks.
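A sketch of the similarity-based fusion; measuring color similarity as the (negated) Euclidean distance in RGB space is an illustrative assumption consistent with the description above.

```python
import numpy as np

def similarity_target_color(scene_color, x, y, neighbor_coords, top_k=4):
    """Fuse the current pixel's color with the scene colors of its top-k most
    color-similar surrounding pixels and return the averaged color."""
    c = scene_color[y, x]
    neigh = np.array([scene_color[cy, cx] for (cx, cy) in neighbor_coords])
    # Smaller color-space distance means higher similarity; sort ascending.
    order = np.argsort(np.linalg.norm(neigh - c, axis=1))
    top = neigh[order[:top_k]]
    # Average the current color with the top-k similar neighbor colors.
    return np.vstack([c[None, :], top]).mean(axis=0)
```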
Through the above embodiments, the scene colors of the current pixel and of its surrounding pixels are fused within the coordinate range corresponding to each pixel, so that richer, more continuous oil-painting-style colors appear in the finally displayed three-dimensional scene and oil-painting-style brush strokes are simulated in it.
Of course, the blurring of the above embodiments may be applied to all pixels of the three-dimensional scene or only to a subset of them. Optionally, the pixels to be blurred are selected from all pixels according to a preset period, reducing the overhead of the blurring and further improving scene rendering efficiency.
In this embodiment, the outline of each scene element model is traced through edge detection and stroke coloring, and the continuous colors and brush strokes of an oil painting texture are then simulated in the three-dimensional scene through blurring, so that each scene element model is filled with oil-painting-style scene colors and the three-dimensional scene is oil-painting stylized. The three-dimensional scene to be processed can thus be converted to the oil painting style without redesigning or remodeling, which greatly improves scene rendering efficiency and scene extensibility.
In the above or following embodiments, some scene elements in the three-dimensional scene should not be covered by the oil painting texture; these scene elements are therefore filtered out, and a second mask map is used to prevent them from being covered so that they retain their original scene colors.
Specifically, the depth item (Custom Depth) stored at each object vertex of a custom depth stencil (Custom Stencil) can be used to filter out target scene elements that should not use the oil painting stroke map; a second mask map for marking the target scene elements is generated from the filtering result, and the step of blurring the scene color of the three-dimensional scene is performed using the second mask map, so that the initial scene colors of the target scene elements are retained in the oil-painting-stylized scene.
Such scene elements are, for example: virtual characters, sky spheres, scene elements bearing text or special marks (such as scene icons), and other scene elements that should not be blurred. A scene icon is shown, for example, in fig. 5.
For example, according to the scene depth of a virtual character in the three-dimensional scene, a corresponding scene depth is set in the custom depth stencil item so that the virtual character is filtered out, and a corresponding second mask map is generated to preserve the character's appearance in the three-dimensional scene.
For another example, according to the marks corresponding to the various scene icons, the marks to be filtered are set in the custom depth stencil item; optionally, a scene icon is marked in the 3D UI to obtain its mark to be filtered. Specifically, during depth filtering, the scene depth (Scene Depth) of the current pixel is compared with the depth item (Custom Depth) corresponding to the current pixel. If Custom Depth is less than or equal to Scene Depth, the vertex of the marked scene icon is visible at the current view angle; in this case the initial scene color of the scene icon must be preserved, since a blurred scene icon could no longer provide accurate visual information. The various scene icons are thereby filtered out, and the corresponding second mask maps representing the positions of the target scene elements are generated, so that scene icons can be flexibly retained with their initial scene colors while rendering efficiency improves.
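A sketch of the second-mask logic: the depth comparison decides which pixels keep their initial scene color, and a composite step restores those colors after blurring. The array-based form is an assumption; in UE4 this comparison runs in the post-process material.

```python
def second_mask(scene_depth, custom_depth):
    """Mark pixels whose marked vertex is visible (Custom Depth <= Scene Depth);
    these target scene elements keep their initial scene color."""
    return custom_depth <= scene_depth

def restore_initial_colors(stylized, initial, mask):
    # Where the mask is set, the oil painting stylization is overridden by the
    # element's initial scene color (e.g., scene icons, virtual characters).
    out = stylized.copy()
    out[mask] = initial[mask]
    return out
```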
In this embodiment, through the above steps, the initial scene colors of some scene elements are retained in the oil-painting-stylized scene, and its scene rendering efficiency is improved. Meanwhile, marking the retained initial scene colors reduces post-processing overhead and shortens scene rendering time.
In practice, some scene elements in the three-dimensional scene occlude one another, and occluded scene elements behave abnormally in processing flows such as stroke coloring or blurring. For this situation, further optionally, before 101, the occluded scene elements may also be filtered out using the custom depth stencil to generate a third mask map, the third mask map indicating the positions of the occluded scene elements. The steps shown in fig. 1 can then be performed in conjunction with the third mask map.
Specifically, assume the custom depth stencil stores a depth item (Custom Depth) for each object vertex. Based on this, during depth filtering, the scene depth (Scene Depth) of the current pixel is compared with the depth item (Custom Depth) corresponding to the current pixel. If Custom Depth is greater than Scene Depth, the marked object vertex corresponding to the current pixel is invisible to the user at the current view angle; in this case, no oil painting stylization needs to be applied to the current pixel, which prevents abnormalities in processing flows such as stroke coloring or blurring. Meanwhile, marking the pixels that need no oil painting processing further reduces post-processing overhead and improves scene rendering efficiency.
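The third-mask test is the mirror image of the comparison above; a minimal sketch under the same array assumption:

```python
def third_mask(scene_depth, custom_depth):
    """Mark pixels whose marked vertex is occluded (Custom Depth > Scene Depth);
    these pixels are skipped by the stroke coloring and blurring passes."""
    return custom_depth > scene_depth
```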
It should be noted that the third mask map can be merged with the first and second mask maps into a single mask map that is fed into one post-processing node, triggering that node to complete the related post-processing flow. Of course, depending on actual application requirements, the first, second, and third mask maps may also be fed into different post-processing nodes, triggering each to carry out its part of the above flows.
In the above or following embodiments, an oil painting work layers multiple pigments to create a stereoscopic visual effect. Therefore, to increase the texture of the oil-painting-stylized scene, in this embodiment, optionally, the color block in which each pixel is located is determined from the scene map based on the coordinate range corresponding to each pixel; the arrangement order of the color blocks in the scene map is determined; and the blurred color blocks are superposed according to that order to obtain the oil-painting-stylized scene.
In this step, optionally, determining the arrangement order of the color blocks in the scene map may be implemented as: acquiring the visual features of each color block from the scene map according to the pixel texture coordinates of each color block, and determining, based on the visual features, the color block range and the arrangement order corresponding to each color block.
The visual features comprise background texture features and/or illumination features: the background texture features represent the relative positional relations of the color blocks in the scene map, and the illumination features represent their light-dark relations.
Furthermore, superposing the blurred color blocks according to the arrangement order to obtain the oil-painting-stylized scene may be implemented as: for each blurred color block, superposing the color blocks in the three-dimensional scene containing the stroke lines based on their color block ranges and arrangement order to obtain the oil-painting-stylized scene.
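A sketch of the superposition step; representing each blurred color block as an (order_key, mask, color) tuple and deriving the order key from illumination/texture features are assumptions made for illustration.

```python
def superpose_color_blocks(stroked_scene, blocks):
    """Overlay blurred color blocks onto the stroked scene in their arrangement order.

    blocks: iterable of (order_key, mask, rgb) where order_key encodes the
    arrangement order derived from the blocks' visual features (an assumption).
    """
    out = stroked_scene.copy()
    for _, mask, rgb in sorted(blocks, key=lambda b: b[0]):
        out[mask] = rgb  # later blocks overpaint earlier ones, like layered strokes
    return out
```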
The blurred color blocks are thus superposed in their arrangement order through the above embodiment to obtain the finally displayed oil-painting-stylized scene, creating a stereoscopic visual effect and greatly enhancing the texture of the oil-painting-stylized scene.
It should be noted that the steps of the methods provided in the above embodiments may all be executed by one device, or different devices may serve as the execution subjects of the methods. For example, the execution subject of steps 101 to 103 may be device A; for another example, the execution subject of steps 101 and 102 may be device A while that of step 103 is device B; and so on.
In addition, some of the flows described in the above embodiments and drawings contain operations in a specific order, but it should be clearly understood that these operations may be executed out of the presented order or in parallel. Sequence numbers such as 101 and 102 merely distinguish different operations and do not by themselves represent any execution order. The flows may also include more or fewer operations, which may be executed sequentially or in parallel. It should likewise be noted that descriptions such as "first" and "second" herein distinguish different messages, devices, modules, and the like; they represent neither a sequential order nor a limitation that "first" and "second" be of different types.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, and as shown in fig. 6, the electronic device includes: memory 601, processor 602, communication component 603, and display component 604.
The memory 601 is configured to store computer programs and may also store various other data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so on.
The memory 601 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The processor 602, coupled to the memory 601, is configured to execute the computer programs in the memory 601 to: for a three-dimensional scene to be processed, determine the positions of the stroke lines in each scene element model; stroke-color each scene element model according to the positions of the stroke lines to obtain a three-dimensional scene containing the stroke lines; and in the three-dimensional scene containing the stroke lines, blur the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene to obtain an oil-painting-stylized scene.
Further optionally, the oil-painting-stylized scene is presented through the display component 604.
Further optionally, when blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene, the processor 602 is specifically configured to:
according to the texture coordinates of each pixel in the three-dimensional scene, divide, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system;
and blur the scene color of each pixel according to the coordinate range corresponding to each pixel to obtain the target scene color of each pixel in the oil-painting-stylized scene.
Further optionally, when dividing, with each pixel as a center, a corresponding coordinate range in the screen space coordinate system according to the texture coordinates of each pixel, the processor 602 is specifically configured to:
for each pixel in the three-dimensional scene, offset the texture coordinates of the current pixel in the screen space coordinate system to obtain the coordinate range corresponding to the current pixel.
Further optionally, the current pixel corresponds to a first coordinate range. Based on this, when blurring the scene color of each pixel according to its corresponding coordinate range to obtain the target scene color of each pixel in the oil-painting-stylized scene, the processor 602 is specifically configured to:
acquire the scene colors of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates in the first coordinate range; and compute, according to preset weight parameters, a weighted average of the scene colors of the current pixel and of the surrounding pixels in the first coordinate range, the weighted average being taken as the target scene color of the current pixel.
Further optionally, the current pixel corresponds to a first coordinate range. Based on this, when blurring the scene color of each pixel according to its corresponding coordinate range to obtain the target scene color of each pixel in the oil-painting-stylized scene, the processor 602 is specifically configured to:
acquire the scene colors of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates in the first coordinate range; acquire the similarity between the scene color of the current pixel and the scene colors of the surrounding pixels; and fuse the scene colors of the surrounding pixels whose similarity falls within a preset rank with the scene color of the current pixel, the fused scene color being taken as the target scene color of the current pixel.
Further optionally, the stroke lines comprise contour lines, which represent the outer contour of a model, and/or polylines, which represent the inner contours of a model.
When determining the positions of the stroke lines in each scene element model, the processor 602 is specifically configured to: acquire, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model.
Further optionally, if the stroke lines include contour lines, the processor 602, when acquiring, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model, is specifically configured to:
for each pixel to be detected in each scene element model, offset the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient;
acquire the scene depths of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates; and, if the scene depth difference between the current pixel and the surrounding pixels is larger than a set depth difference threshold, take the current pixel as an edge pixel on a contour line of the scene element model.
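The depth-difference test can be illustrated with the following Python sketch, in which the depth and normal buffers are plain numpy arrays and a single offset sample stands in for the surrounding pixels. The screen-space projection of the normal, the offset coefficient k, and the threshold value are assumptions of the sketch:

```python
import numpy as np

def contour_edge_pixels(depth, normals, k=0.002, depth_threshold=0.05):
    """Flag pixels whose scene depth differs from a surrounding pixel,
    sampled along the pixel's normal direction at a distance proportional
    to depth * offset coefficient, by more than the set threshold."""
    h, w = depth.shape
    edges = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            # Offset direction: the x/y components of the pixel normal.
            nx, ny = normals[y, x, 0], normals[y, x, 1]
            # Offset distance: scene depth times the offset coefficient,
            # scaled to pixel units for this array-based sketch.
            dist = depth[y, x] * k * max(h, w)
            sx = int(np.clip(x + nx * dist, 0, w - 1))
            sy = int(np.clip(y + ny * dist, 0, h - 1))
            if abs(depth[y, x] - depth[sy, sx]) > depth_threshold:
                edges[y, x] = True
    return edges
```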
Further optionally, if the stroke lines include polylines, the processor 602, when acquiring, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model, is specifically configured to:
for each pixel to be detected in each scene element model, offset the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient; acquire the normal values of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates; and, if the normal difference between the current pixel and the surrounding pixels is larger than a set normal difference threshold, take the current pixel as an edge pixel on a polyline of the scene element model.
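The polyline pass follows the same sampling pattern but compares normals instead of depths; a sketch under the same assumptions as the contour sketch above:

```python
import numpy as np

def polyline_edge_pixels(depth, normals, k=0.002, normal_threshold=0.4):
    """Flag pixels whose normal differs from a surrounding pixel's normal
    by more than the set normal difference threshold (sketch)."""
    h, w = depth.shape
    edges = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            nx, ny = normals[y, x, 0], normals[y, x, 1]
            dist = depth[y, x] * k * max(h, w)
            sx = int(np.clip(x + nx * dist, 0, w - 1))
            sy = int(np.clip(y + ny * dist, 0, h - 1))
            # Normal difference measured as the Euclidean norm of the
            # difference between the two normal vectors.
            if np.linalg.norm(normals[y, x] - normals[sy, sx]) > normal_threshold:
                edges[y, x] = True
    return edges
```

Depth differences catch silhouettes against the background, while normal differences catch creases inside a silhouette where depth barely changes; running both passes yields the full set of stroke-line edge pixels.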
Further optionally, when performing stroke coloring on each scene element model according to the positions of the stroke lines to obtain the three-dimensional scene containing the stroke lines, the processor 602 is specifically configured to:
generate, from the edge pixels on the stroke lines of each scene element model, a first mask map containing those edge pixels; and linearly interpolate between the edge pixels in the first mask map according to the scene color of each edge pixel to obtain the three-dimensional scene containing the stroke lines.
Wherein, further optionally, the processor 602 is further configured to: sample a three-dimensional noise map according to the texture coordinates of each edge pixel to obtain a noise value for that edge pixel; and offset the scene color of each edge pixel according to its noise value to obtain a three-dimensional scene containing discontinuous stroke lines.
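A small Python sketch of this noise-driven break-up. A 2D noise array stands in for the three-dimensional noise map, and the cutoff below which a stroke pixel is dropped is an assumed parameter:

```python
import numpy as np

def break_up_strokes(stroke_color, edge_mask, uv, noise, cutoff=0.35):
    """Sample noise at each edge pixel's texture coordinates and offset the
    stroke color by the noise value, leaving gaps where the noise is low,
    so the outline looks hand-painted rather than continuous."""
    nh, nw = noise.shape
    out = stroke_color.astype(np.float64).copy()
    for y, x in zip(*np.nonzero(edge_mask)):
        u, v = uv[y, x]                       # texture coordinates in [0, 1)
        n = noise[int(v * nh) % nh, int(u * nw) % nw]
        if n < cutoff:
            out[y, x] = 0.0                   # break the stroke at this pixel
        else:
            out[y, x] *= n                    # offset (attenuate) the color
    return out
```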
Further optionally, the processor 602 is further configured to: filter out target scene elements that do not require blurring by means of a user-defined depth-stencil entry; and generate, according to the filtering result, a second mask map that marks the target scene elements, the second mask map representing the positions of the target scene elements.
Further optionally, the processor 602 is further configured to: filter out occluded scene elements by means of a user-defined depth-stencil entry; and generate, according to the filtering result, a third mask map that marks the occluded scene elements, the third mask map representing the positions of the occluded scene elements.
Further optionally, the processor 602 is further configured to: determine, from the scene map, the color block of each pixel based on the coordinate range corresponding to that pixel; determine the arrangement order of the color blocks in the scene map; and superimpose the blurred color blocks according to the arrangement order to obtain the oil painting stylized scene.
Wherein, further optionally, when determining the arrangement order of the color blocks in the scene map, the processor 602 is specifically configured to:
acquire the visual features of each color block from the scene map according to the texture coordinates of the pixels in the color block; the visual features comprise background texture features and/or illumination features, where the background texture features represent the relative positional relationships of the color blocks in the scene map and the illumination features represent the light-and-dark relationships of the color blocks in the scene map; and determine the color block range and arrangement order corresponding to each color block based on the visual features.
Further optionally, when superimposing the blurred color blocks according to the arrangement order to obtain the oil painting stylized scene, the processor 602 is specifically configured to: for each blurred color block, superimpose the color block onto the three-dimensional scene containing the stroke lines based on its color block range and the arrangement order, to obtain the oil painting stylized scene.
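To make the ordering step concrete, here is a hedged Python sketch in which mean luminance stands in for the illumination feature and blocks are painted dark-to-light; the (y0, y1, x0, x1, patch) block layout is invented for the sketch, not prescribed by the embodiment:

```python
import numpy as np

def composite_color_blocks(stroked_scene, blocks):
    """Superimpose blurred color blocks onto the stroked scene in an
    arrangement order derived from an illumination feature (sketch)."""
    def mean_luminance(patch):
        # Standard RGB-to-luma weights as a stand-in illumination feature.
        return float(np.mean(patch @ np.array([0.299, 0.587, 0.114])))

    out = stroked_scene.copy()
    # Darker blocks first, so lighter blocks end up painted on top,
    # mimicking how highlights are laid over base tones in oil painting.
    for y0, y1, x0, x1, patch in sorted(blocks, key=lambda b: mean_luminance(b[4])):
        out[y0:y1, x0:x1] = patch
    return out
```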
Further, as shown in fig. 6, the electronic device also includes a power component 605, an audio component 606, and the like. Fig. 6 schematically shows only some of the components; this does not imply that the electronic device includes only the components shown in fig. 6.
The communication component 603 is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display component 604 may be implemented as a display that includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). Where the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power component 605 provides power to the various components of the device in which it is located. The power component may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for that device.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program which, when executed, can implement the steps executable by the electronic device in the foregoing method embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (15)

1. A method for rendering an oil painting style scene, the method comprising:
for a three-dimensional scene to be processed, determining the positions of stroke lines in each scene element model;
performing stroke coloring on each scene element model according to the positions of the stroke lines to obtain a three-dimensional scene containing the stroke lines;
and, in the three-dimensional scene containing the stroke lines, blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene to obtain the oil painting stylized scene.
2. The method of claim 1, wherein blurring the scene color of the three-dimensional scene according to the texture coordinates of each pixel in the three-dimensional scene comprises:
delimiting, in the screen space coordinate system, a coordinate range centered on each pixel of the three-dimensional scene according to that pixel's texture coordinates;
and blurring the scene color of each pixel according to its corresponding coordinate range to obtain the target scene color of each pixel in the oil painting stylized scene.
3. The method according to claim 2, wherein delimiting, in the screen space coordinate system, the coordinate range centered on each pixel of the three-dimensional scene according to the texture coordinates of each pixel comprises:
for each pixel in the three-dimensional scene, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the coordinate range corresponding to the current pixel.
4. The method of claim 2, wherein the current pixel corresponds to a first coordinate range;
and blurring the scene color of each pixel according to its corresponding coordinate range to obtain the target scene color of each pixel in the oil painting stylized scene comprises:
acquiring the scene colors of the current pixel and of the surrounding pixels from the scene map according to the texture coordinates of the current pixel and of the surrounding pixels in the first coordinate range;
and computing, according to preset weight parameters, a weighted average of the scene colors of the current pixel and of the surrounding pixels in the first coordinate range, and taking the weighted average as the target scene color of the current pixel.
5. The method of claim 2, wherein the current pixel corresponds to a first coordinate range;
and blurring the scene color of each pixel according to its corresponding coordinate range to obtain the target scene color of each pixel in the oil painting stylized scene comprises:
acquiring the scene colors of the current pixel and of the surrounding pixels from the scene map according to the texture coordinates of the current pixel and of the surrounding pixels in the first coordinate range;
acquiring the similarity between the scene color of the current pixel and the scene colors of the surrounding pixels;
and fusing the scene colors of the surrounding pixels whose similarity ranks within a preset order with the scene color of the current pixel, and taking the fused scene color as the target scene color of the current pixel.
6. The method of claim 2, further comprising:
determining, from the scene map, the color block of each pixel based on the coordinate range corresponding to that pixel;
determining the arrangement order of the color blocks in the scene map;
and superimposing the blurred color blocks according to the arrangement order to obtain the oil painting stylized scene.
7. The method of claim 6, wherein determining the arrangement order of the color blocks in the scene map comprises:
acquiring the visual features of each color block from the scene map according to the texture coordinates of the pixels in the color block; wherein the visual features comprise background texture features and/or illumination features, the background texture features representing the relative positional relationships of the color blocks in the scene map and the illumination features representing the light-and-dark relationships of the color blocks in the scene map;
and determining the color block range and arrangement order corresponding to each color block based on the visual features;
and wherein superimposing the blurred color blocks according to the arrangement order to obtain the oil painting stylized scene comprises:
for each blurred color block, superimposing the color block onto the three-dimensional scene containing the stroke lines based on its color block range and the arrangement order, to obtain the oil painting stylized scene.
8. The method according to claim 1, wherein the stroke lines comprise contour lines and/or polylines, a contour line being a stroke line that represents the outer contour of a model and a polyline being a stroke line that represents an inner contour of a model;
and wherein determining the positions of the stroke lines in each scene element model comprises:
acquiring, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model.
9. The method of claim 8, wherein, if the stroke lines include contour lines, acquiring, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model comprises:
for each pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, wherein the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient;
acquiring the scene depths of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates;
and, if the scene depth difference between the current pixel and the surrounding pixels is larger than a set depth difference threshold, taking the current pixel as an edge pixel on a contour line of the scene element model.
10. The method of claim 8, wherein, if the stroke lines include polylines, acquiring, in the screen space coordinate system, the edge pixels on the contour lines and/or polylines of each scene element model comprises:
for each pixel to be detected in each scene element model, offsetting the texture coordinates of the current pixel in the screen space coordinate system to obtain the texture coordinates of the surrounding pixels, wherein the offset direction is the normal direction of the current pixel and the offset distance is the product of the scene depth of the current pixel and an offset coefficient;
acquiring the normal values of the current pixel and of the surrounding pixels from the scene map according to their texture coordinates;
and, if the normal difference between the current pixel and the surrounding pixels is larger than the set normal difference threshold, taking the current pixel as an edge pixel on a polyline of the scene element model.
11. The method of claim 1, wherein performing stroke coloring on each scene element model according to the positions of the stroke lines to obtain the three-dimensional scene containing the stroke lines comprises:
generating, from the edge pixels on the stroke lines of each scene element model, a first mask map containing those edge pixels;
and linearly interpolating between the edge pixels in the first mask map according to the scene color of each edge pixel to obtain the three-dimensional scene containing the stroke lines.
12. The method of claim 1, further comprising:
filtering out target scene elements that do not require blurring by means of a user-defined depth-stencil entry;
and generating, according to the filtering result, a second mask map that marks the target scene elements, the second mask map representing the positions of the target scene elements.
13. The method of claim 1, further comprising:
filtering out occluded scene elements by means of a user-defined depth-stencil entry;
and generating, according to the filtering result, a third mask map that marks the occluded scene elements, the third mask map representing the positions of the occluded scene elements.
14. An electronic device, comprising: a memory and a processor;
the memory is configured to store one or more computer instructions;
and the processor is configured to execute the one or more computer instructions to perform the steps of the method of any one of claims 1-13.
15. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium;
the computer program is arranged to carry out the steps of the method of any one of claims 1 to 13 when executed.
CN202111056761.5A 2021-09-09 2021-09-09 Oil painting style scene rendering method and equipment and storage medium Pending CN113935892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111056761.5A CN113935892A (en) 2021-09-09 2021-09-09 Oil painting style scene rendering method and equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111056761.5A CN113935892A (en) 2021-09-09 2021-09-09 Oil painting style scene rendering method and equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113935892A (en) 2022-01-14

Family

ID=79275314

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111056761.5A Pending CN113935892A (en) 2021-09-09 2021-09-09 Oil painting style scene rendering method and equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113935892A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853517A (en) * 2010-05-26 2010-10-06 西安交通大学 Real image oil painting automatic generation method based on stroke limit and texture
US20190347771A1 (en) * 2018-05-10 2019-11-14 Google Llc Generating and displaying blur in images

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
冯乐乐 (Feng Lele): "Unity Shader入门精要" [Essentials of Unity Shaders], Posts & Telecom Press, 30 June 2016, pages 7-15 *
卢少平; 张松海 (Lu Shaoping; Zhang Songhai): "基于视觉重要性的图像油画风格化绘制算法" [Image oil-painting stylization algorithm based on visual importance], Journal of Computer-Aided Design & Computer Graphics (计算机辅助设计与图形学学报), no. 07, 15 July 2010 *
蔡小波; 张学杰 (Cai Xiaobo; Zhang Xuejie): "一种油画生成方法及其在FPGA上的实现" [An oil painting generation method and its implementation on FPGA], Journal of Yunnan University (Natural Sciences Edition) (云南大学学报(自然科学版)), no. 2, 15 September 2007 *

Similar Documents

Publication Publication Date Title
CN109003325B (en) Three-dimensional reconstruction method, medium, device and computing equipment
Zollmann et al. Image-based ghostings for single layer occlusions in augmented reality
US20220189095A1 (en) Method and computer program product for producing 3 dimensional model data of a garment
WO2018040511A1 (en) Method for implementing conversion of two-dimensional image to three-dimensional scene based on ar
RU2427918C2 (en) Metaphor of 2d editing for 3d graphics
CN108876886B (en) Image processing method and device and computer equipment
CN108305312A (en) The generation method and device of 3D virtual images
CN105354876A (en) Mobile terminal based real-time 3D fitting method
EP3533218B1 (en) Simulating depth of field
CN108765520B (en) Text information rendering method and device, storage medium and electronic device
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
CN109448137A (en) Exchange method, interactive device, electronic equipment and storage medium
CN111583379A (en) Rendering method and device of virtual model, storage medium and electronic equipment
CN110033507B (en) Method, device and equipment for drawing internal trace of model map and readable storage medium
CN106204746A (en) A kind of augmented reality system realizing 3D model live paint
CN108230434B (en) Image texture processing method and device, storage medium and electronic device
CN113144613B (en) Model-based method for generating volume cloud
CN110610504A (en) Pencil drawing generation method and device based on skeleton and tone
KR20060108271A (en) Method of image-based virtual draping simulation for digital fashion design
CN112516595B (en) Magma rendering method, device, equipment and storage medium
US20180286130A1 (en) Graphical image augmentation of physical objects
CN114612641A (en) Material migration method and device and data processing method
CN113935893A (en) Sketch style scene rendering method and device and storage medium
CN113935891B (en) Pixel-style scene rendering method, device and storage medium
CN113935892A (en) Oil painting style scene rendering method and equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination