CN113935893A - Sketch style scene rendering method and device and storage medium - Google Patents

Sketch style scene rendering method and device and storage medium

Info

Publication number
CN113935893A
CN113935893A
Authority
CN
China
Prior art keywords
scene
sketch
element model
map
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111058004.1A
Other languages
Chinese (zh)
Inventor
国家玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202111058004.1A priority Critical patent/CN113935893A/en
Publication of CN113935893A publication Critical patent/CN113935893A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture


Abstract

Embodiments of the present application provide a sketch-style scene rendering method, device, and storage medium. In the method, for a three-dimensional scene to be processed, the positions of the stroked lines in each scene element model are determined; outline shading is performed on each scene element model according to the positions of the stroked lines to obtain a three-dimensional scene containing the stroked lines; a sketch stroke map is tiled into the three-dimensional scene containing the stroked lines through a material editor according to the normal direction of each scene element model; and, in the three-dimensional scene tiled with the sketch stroke map, the scene color is fused with the stroke color of the sketch stroke map to obtain a sketch-stylized scene. By performing outline shading and sketch-stroke tiling on the scene element models, the method achieves sketch stylization of the three-dimensional scene: the scene can be converted into a sketch style without redesigning or remodeling, which greatly improves scene rendering efficiency and enhances scene extensibility.

Description

Sketch style scene rendering method and device and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a sketch-style scene rendering method, device, and storage medium.
Background
Sketching is a common drawing technique. In a sketch, the relationship between an object and the light source is typically depicted with line strokes, so as to convey the object's outline and detail features.
Taking game development as an example, a game usually contains many scenes. In the related art, changing a game scene from a realistic style to a sketch style requires redrawing and remodeling every scene in the game, which greatly reduces game development efficiency and leaves existing game scenes with poor extensibility. A new solution is therefore needed to overcome these technical problems.
Disclosure of Invention
Aspects of the present application provide a sketch-style scene rendering method, device, and storage medium, so as to improve scene rendering efficiency and enhance scene extensibility.
An embodiment of the present application further provides a sketch-style scene rendering method, where the method includes:
for a three-dimensional scene to be processed, determining the positions of the stroked lines in each scene element model;
performing outline shading on each scene element model according to the positions of the stroked lines to obtain a three-dimensional scene containing the stroked lines;
tiling a sketch stroke map into the three-dimensional scene containing the stroked lines through a material editor according to the normal direction of each scene element model; and
in the three-dimensional scene tiled with the sketch stroke map, fusing the scene color with the stroke color of the sketch stroke map to obtain a sketch-stylized scene.
Further optionally, the material editor comprises a material sphere that acts in a post-processing stage.
Based on this, tiling the sketch stroke map into the three-dimensional scene containing the stroked lines through the material editor according to the normal direction of each scene element model includes:
inputting the normal direction of each scene element model, the sketch stroke map, and the tiling position and area of the sketch stroke map in the three-dimensional scene containing the stroked lines into a World Aligned Texture node added to the material sphere;
setting the sketch stroke map into each scene element model of the three-dimensional scene containing the stroked lines through the World Aligned Texture node according to the tiling position of the sketch stroke map; and
adjusting the orientation of the sketch stroke map in each scene element model through the World Aligned Texture node according to the normal direction of each scene element model and the initial normal direction of the sketch stroke map.
Further optionally, the orientation of the sketch stroke map in each scene element model includes:
the plane of the sketch stroke map being perpendicular to the normal direction of the corresponding scene element model.
Further optionally, the sketch stroke map is a seamlessly tileable (four-way continuous) map with a sketch stroke effect.
Further optionally, the stroked lines comprise contour lines and/or crease lines, where a contour line is a stroked line representing the outer contour of a model and a crease line is a stroked line representing an interior contour of the model;
determining the positions of the stroked lines in each scene element model comprises:
acquiring the edge pixels on the contour lines and/or crease lines in each scene element model.
Further optionally, if the stroked lines include contour lines, acquiring the edge pixels on the contour lines and/or crease lines in each scene element model includes:
for each pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain the texture coordinates of its surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the current screen viewport size and an offset coefficient;
acquiring the depth difference between the current pixel and the surrounding pixels; and
if the depth difference is larger than a set depth-difference threshold, taking the current pixel as an edge pixel on a contour line of the scene element model.
Further optionally, if the stroked lines include crease lines, acquiring the edge pixels on the contour lines and/or crease lines in each scene element model includes:
for each pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain the texture coordinates of its surrounding pixels, where the offset direction is the normal direction of the current pixel and the offset distance is the product of the current screen viewport size and an offset coefficient;
comparing the respective normal values of the current pixel and the surrounding pixels in the world coordinate system; and
if the normal difference between the current pixel and a surrounding pixel is larger than a set normal-difference threshold, taking the current pixel as an edge pixel on a crease line of the scene element model.
Further optionally, performing outline shading on each scene element model based on the positions of the stroked lines to obtain a three-dimensional scene containing the stroked lines includes:
generating a first mask map containing the edge pixels in each scene element model according to the edge pixels on the stroked lines in each scene element model; and
performing linear interpolation between the edge pixels in the first mask map according to the scene color of each edge pixel to obtain a three-dimensional scene containing the stroked lines.
Further optionally, the method further comprises:
sampling a three-dimensional noise map according to the texture coordinates of each edge pixel to obtain a noise value corresponding to each edge pixel; and
offsetting the scene color of each edge pixel according to its corresponding noise value to obtain a three-dimensional scene containing discontinuous stroked lines.
Further optionally, fusing the scene color with the stroke color of the sketch stroke map to obtain the sketch-stylized scene includes:
superposing the scene color of each pixel with the stroke color of the corresponding pixel in the sketch stroke map to obtain the target scene color of each pixel; and
rendering based on the target scene color of each pixel to obtain the sketch-stylized scene.
Further optionally, if different types of sketch stroke maps exist for different scene elements, the method further includes:
filtering the scene elements that use different types of sketch stroke maps by means of a Custom Stencil (custom depth stencil) entry;
generating, according to the filtering result, a second mask map matched with each type of sketch stroke map; and
performing the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the stroked lines using the different types of sketch stroke maps and their matched second mask maps.
Further optionally, the method further comprises:
filtering out target scene elements that do not need a sketch stroke map by means of the Custom Stencil entry; and
generating, according to the filtering result, a third mask map marking the target scene elements, and using the third mask map when performing the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the stroked lines, so that the target scene elements keep their initial scene color in the sketch-stylized scene.
An embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores one or more computer instructions and the processor executes the one or more computer instructions to perform the steps in the method provided by the embodiments of the present application.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed, implements the steps in the method provided by the embodiments of the present application.
In the technical solution provided by the embodiments of the present application, for a three-dimensional scene to be processed, the positions of the stroked lines in each scene element model are determined, and outline shading is performed on each scene element model according to those positions to obtain a three-dimensional scene containing the stroked lines, thereby tracing the contour of each scene element model. Further, the sketch stroke map is tiled into the three-dimensional scene containing the stroked lines through the material editor according to the normal direction of each scene element model; in the tiled scene, the scene colors are fused with the stroke colors of the sketch stroke map, so that sketch strokes appear on the surface of each scene element model and a sketch-stylized scene is obtained.
In the embodiments of the present application, the contours of the scene element models are traced by edge detection and outline shading, and the sketch stroke maps are then tiled onto the scene element models through the material editor to fill the scene colors with a sketch style, realizing sketch stylization of the three-dimensional scene. The three-dimensional scene to be processed can thus be converted into a sketch style without redesigning or remodeling, which greatly improves scene rendering efficiency and enhances scene extensibility.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart illustrating a sketch-style scene rendering method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an effect of a scene element according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a sketch pen touch map provided in an exemplary embodiment of the present application;
FIG. 4 is a diagram illustrating an effect of another scene element according to an exemplary embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Sketching is a common drawing technique. In a scene rendered in sketch style, the relationship between an object and the light source is depicted with line strokes, so as to convey object contours and detail features.
Taking game development as an example, a game usually contains many scenes. In the related art, changing a game scene from a realistic style to a sketch style requires redrawing and remodeling every scene in the game, which greatly reduces game development efficiency and leaves existing game scenes with poor extensibility.
In view of the above technical problems, some embodiments of the present application provide a solution, which is described in detail below with reference to the accompanying drawings.
An embodiment of the present application provides a sketch-style scene rendering method, and fig. 1 is a schematic flow chart of the sketch-style scene rendering method provided in an exemplary embodiment of the present application. As shown in fig. 1, the method includes:
101. For the three-dimensional scene to be processed, determine the positions of the stroked lines in each scene element model.
102. Perform outline shading on each scene element model according to the positions of the stroked lines to obtain a three-dimensional scene containing the stroked lines.
103. Tile a sketch stroke map into the three-dimensional scene containing the stroked lines through a material editor according to the normal direction of each scene element model.
104. In the three-dimensional scene tiled with the sketch stroke map, fuse the scene color with the stroke color of the sketch stroke map to obtain a sketch-stylized scene.
The three-dimensional scene to be processed is a three-dimensional scene that requires scene rendering. For example, suppose the three-dimensional scene to be processed is a realistic game scene in a certain game; after post-processing, a sketch-style version of that game scene is obtained. Between the two game scenes before and after post-processing, the scene elements, their layout, and their attributes are all consistent; only the scene style differs. Specifically, the scene style can be embodied as artistic features of the three-dimensional scene, such as hue, lines, and shading.
For a game scene, different scene styles can be understood as different artistic styles. For example, in a realistic-style game scene, the scene hue and the light-dark relations of the scene elements are close to the real world, and the scene elements are not outlined with distinct lines. In a sketch-style game scene, the contours and light-dark relations of the scene elements are drawn with line strokes, bringing out the detail features of the objects.
Each step in the sketch style scene rendering method provided in fig. 1 is described below with reference to a specific embodiment.
First, in 101, for a three-dimensional scene to be processed, the positions of the stroked lines in the respective scene element models are determined.
The stroked lines comprise contour lines and/or crease lines. A contour line is a stroked line representing the outer contour of a model, and a crease line is a stroked line representing an interior contour of the model. Optionally, the edges of each scene element model are found by edge detection, providing a basis for the outline shading. In practice, the coordinates of surrounding pixels are derived from the currently detected pixel in order to detect pixels at the edges of the scene element model. How the contour lines and crease lines are obtained is described below.
Specifically, an optional implementation of determining the positions of the stroked lines in each scene element model at 101 is: acquiring the edge pixels on the contour lines and/or crease lines in each scene element model.
In an optional embodiment, if the stroked lines include contour lines, acquiring the edge pixels on the contour lines and/or crease lines in each scene element model may be implemented as:
for each pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain the texture coordinates of its surrounding pixels; acquiring the depth difference between the current pixel and the surrounding pixels; and, if the depth difference is larger than a set depth-difference threshold, taking the current pixel as an edge pixel on a contour line of the scene element model.
Here, the offset direction is the normal direction of the current pixel, and the offset distance is the product of the current screen viewport size and an offset coefficient. Specifically, the normal direction of the current pixel is extracted from the scene map (Scene Texture) as the offset direction; the current screen viewport size is likewise extracted from the scene map, and its product with the offset coefficient is used as the offset distance.
For each pixel to be detected in each scene element model, the texture coordinate of the current pixel is offset along the above offset direction by the above offset distance to obtain the texture coordinates of the surrounding pixels; specifically, those texture coordinates are obtained by vector operations on the screen-space coordinates of the current pixel. A depth difference is then computed from the respective scene depths of the current pixel and each surrounding pixel. If the depth difference between the current pixel and any surrounding pixel is greater than the set depth-difference threshold, the current pixel is taken as an edge pixel on a contour line of the scene element model.
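The depth-difference contour test above can be sketched roughly as follows. This is an illustrative simplification, not the patent's actual shader: `depth_at` stands in for a scene-depth lookup, and the offset coefficient `k` and `depth_threshold` values are hypothetical defaults.

```python
def neighbor_uvs(uv, normal, viewport, k):
    """Offset the current texel's coordinate along its screen-space normal
    (and the opposite direction); offset distance = viewport size * k."""
    (u, v), (nx, ny) = uv, normal
    du, dv = viewport[0] * k * nx, viewport[1] * k * ny
    return [(u + du, v + dv), (u - du, v - dv)]

def is_contour_pixel(uv, normal, depth_at, viewport, k=0.001, depth_threshold=0.1):
    """A pixel is an edge pixel on a contour line if its scene depth differs
    from any offset neighbour's depth by more than the threshold."""
    d0 = depth_at(uv)
    return any(abs(depth_at(n) - d0) > depth_threshold
               for n in neighbor_uvs(uv, normal, viewport, k))
```

With a step-edge depth field (a near object on the left, far background on the right), pixels straddling the depth step are flagged as contour pixels while flat regions are not.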
In another optional embodiment, if the stroked lines include crease lines, acquiring the edge pixels on the contour lines and/or crease lines in each scene element model may be implemented as:
for each pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain the texture coordinates of its surrounding pixels; comparing the respective normal values of the current pixel and the surrounding pixels in the world coordinate system; and, if the normal difference between the current pixel and a surrounding pixel is larger than a set normal-difference threshold, deeming the current pixel to lie on an interior contour of the scene element model and taking it as an edge pixel on a crease line.
Here, the offset direction is the normal direction of the current pixel, and the offset distance is the product of the current screen viewport size and an offset coefficient; they are obtained in the same way as described above, which is not repeated here.
For each pixel to be detected in each scene element model, the texture coordinate of the current pixel is offset along the above offset direction by the above offset distance to obtain the texture coordinates of the surrounding pixels. A normal difference is then computed from the respective world-space normal values of the current pixel and each surrounding pixel. If the normal difference between the current pixel and any surrounding pixel is greater than the set normal-difference threshold, the current pixel is deemed to lie on an interior contour of the scene element model and is taken as an edge pixel on a crease line.
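The normal-difference crease test is analogous; the sketch below is again an illustrative simplification (the `normal_threshold` default is a hypothetical value, and `normal_at` stands in for a world-space normal lookup):

```python
def normal_difference(n0, n1):
    """Magnitude of the difference between two world-space normal vectors."""
    return sum((a - b) ** 2 for a, b in zip(n0, n1)) ** 0.5

def is_crease_pixel(uv, neighbor_uvs, normal_at, normal_threshold=0.5):
    """A pixel is an edge pixel on a crease line if its world-space normal
    differs from any offset neighbour's normal by more than the threshold."""
    n0 = normal_at(uv)
    return any(normal_difference(normal_at(n), n0) > normal_threshold
               for n in neighbor_uvs)
```

For two faces meeting at a right angle (left face pointing +Z, right face pointing +X), a pixel straddling the crease is flagged while pixels inside a flat face are not.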
In practice, the thresholds may be set and tuned for the specific scene; the embodiments of the present application do not limit this.
Further, after the positions of the stroked lines are detected, at 102 each scene element model is outline-shaded according to those positions to obtain a three-dimensional scene containing the stroked lines.
Specifically, in an optional embodiment, shading each scene element model at 102 based on the positions of the stroked lines to obtain a three-dimensional scene containing the stroked lines may be implemented as:
generating a first mask map containing the edge pixels of each scene element model according to the edge pixels on the stroked lines; and performing linear interpolation (Linear Lerp) between the edge pixels in the first mask map according to the scene color of each edge pixel to obtain a three-dimensional scene containing the stroked lines.
In this step, the first mask map is generated from the edge pixels of each scene element model on the stroked lines and contains those edge pixels. Linear interpolation is then performed between the scene color of each edge pixel in the first mask map and a preset stroke-line color, yielding a three-dimensional scene containing the stroked lines.
In this way, the stroked lines can be rendered directly in the three-dimensional scene, so that sketch-style outlines are generated automatically, which helps improve three-dimensional scene rendering efficiency.
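The mask-driven Linear Lerp can be illustrated with a minimal sketch; the function names and the convention that the mask is 1.0 on edge pixels are assumptions, not the patent's implementation:

```python
def lerp(a, b, t):
    """Component-wise linear interpolation between two RGB colours."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def shade_outline(scene_color, line_color, mask_value):
    """Blend a preset stroke-line colour over the scene colour using the
    first-mask value (1.0 on edge pixels, 0.0 elsewhere)."""
    return lerp(scene_color, line_color, mask_value)
```

On edge pixels the preset line colour replaces the scene colour; elsewhere the scene colour passes through unchanged, and fractional mask values give soft edges.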
In fact, since sketch strokes in the real world are not all continuous line segments, the continuous stroked lines obtained from the shading above can make the sketch effect look unnatural. To give the strokes a more realistic appearance, further optionally, a three-dimensional noise map is sampled according to the texture coordinates of each edge pixel to obtain a noise value for that pixel, and the scene color of each edge pixel is then offset according to its noise value to obtain a three-dimensional scene containing discontinuous stroked lines.
In this step, for the three-dimensional scene containing the stroked lines, an inverse operation is first performed on the scene depth (Scene Depth) of the current pixel to obtain its world coordinate in the world coordinate system. The world coordinate of the current pixel is then remapped to obtain its texture coordinate (denoted UVW) into the three-dimensional noise map. A three-dimensional Perlin noise map is sampled with this texture coordinate, yielding a continuous noise value. The continuous noise values are used to offset the stroked lines, simulating the effect of short strokes and producing a three-dimensional scene containing discontinuous stroked lines. Perlin noise is a continuous noise often used to simulate clouds.
In short, a continuous noise value, whose range is usually 0 to 1 in practice, is obtained by mapping the texture coordinate of the current pixel into the three-dimensional Perlin noise. The continuous noise value is then multiplied by the color value of the stroked line, yielding stroked lines containing invalid values (for example, where the noise value is 0), i.e. the discontinuous short strokes shown in the box of fig. 2.
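The noise modulation can be sketched as below. Note the hedge: `pseudo_noise3` is a cheap hash-based stand-in, not real Perlin noise — the patent samples an actual 3D Perlin noise map — but it suffices to show how multiplying the line colour by a [0, 1) noise value breaks the stroke:

```python
import math

def pseudo_noise3(x, y, z):
    """Hash-based stand-in for a 3D Perlin noise lookup, returning a value
    in [0, 1). A real implementation would sample a Perlin noise texture."""
    v = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return v - math.floor(v)

def broken_line_color(line_color, world_pos):
    """Modulate the stroke colour by the noise value at the pixel's world
    position; noise values near 0 blank the stroke, breaking the line
    into short discontinuous segments."""
    n = pseudo_noise3(*world_pos)
    return tuple(c * n for c in line_color)
```

Because the noise is continuous in world space, neighbouring edge pixels fade in and out together, which reads as hand-drawn short strokes rather than random speckle.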
Further, after the outline shading is completed, at 103 the sketch stroke map is tiled into the three-dimensional scene containing the stroked lines through the material editor according to the normal direction of each scene element model.
The sketch stroke map is a seamlessly tileable (four-way continuous) map with a sketch stroke effect. Four-way continuity means the texture edges of the map are continuous, so the texture remains continuous when several copies of the map are joined — such as the sketch stroke map shown in fig. 3.
Optionally, different types of sketch stroke maps can be tiled into the three-dimensional scene containing the stroked lines to suit different types of scene element models. In practice, the type of a sketch stroke map includes, but is not limited to, at least one of: map size, map texture, and line thickness.
In an optional embodiment, tiling the sketch stroke map into the three-dimensional scene containing the stroked lines through the material editor at 103 according to the normal direction of each scene element model may be implemented as:
inputting the normal direction of each scene element model, the sketch stroke map, and the tiling position and area of the sketch stroke map in the three-dimensional scene containing the stroked lines into a World Aligned Texture node added to the material sphere.
The material editor comprises a material sphere acting in the post-processing stage. Various material nodes can be invoked in the material sphere to realize corresponding functions; a material node is a visual script node that converts a blueprint state machine into a material expression. In an optional embodiment, a World Aligned Texture node is added to the material sphere. In this node, textures projected along different axes can be specified at different positions in the world coordinate system, and the texture size is kept from scaling along with the scene element model.
Further, according to the tiling position of the sketch stroke map, the sketch stroke map is set into each scene element model of the three-dimensional scene containing the stroked lines through the World Aligned Texture node, and the orientation of the sketch stroke map in each scene element model is adjusted through that node according to the normal direction of each scene element model and the initial normal direction of the sketch stroke map. The orientation of the sketch stroke map in each scene element model at least includes: the plane of the sketch stroke map being perpendicular to the normal direction of the corresponding scene element model.
For example, assume the scene element model is the box shown in fig. 4, whose six faces are oriented differently; the tiling direction of the sketch stroke map must therefore be adjusted per face orientation to obtain a sketch texture effect suited to each face of the box.
Briefly, the sketch stroke map (Texture Object), the map size (Texture Size), the position coordinate of each pixel (World Position), and the normal information (World Space Normal) are input into the World Aligned Texture node, so that the orientation of the sketch stroke map superimposed on each pixel is adjusted by the node.
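The core of a world-aligned lookup can be sketched as a dominant-axis projection. This is a minimal illustration of the idea (akin to one plane of a triplanar projection), not the actual World Aligned Texture node logic; all names are hypothetical:

```python
def dominant_axis(normal):
    """Index (0=x, 1=y, 2=z) of the world axis the surface most faces —
    the projection axis a world-aligned texture lookup would use."""
    mags = [abs(c) for c in normal]
    return mags.index(max(mags))

def world_aligned_uv(world_pos, normal, texture_size):
    """Project the world position onto the plane perpendicular to the
    dominant normal axis and divide by the map size, so the stroke map
    lies flat on each face and does not stretch when the model scales."""
    axis = dominant_axis(normal)
    return tuple(c / texture_size
                 for i, c in enumerate(world_pos) if i != axis)
```

For the box of fig. 4, a +Z-facing top face is sampled in the XY plane while a +X-facing side face is sampled in the YZ plane, so each face receives an undistorted sketch texture.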
Finally, at 104, in the three-dimensional scene tiled with the sketch stroke map, the scene color is fused with the stroke color of the sketch stroke map to obtain the sketch-stylized scene.
Specifically, in an optional embodiment, the scene color of each pixel may be superposed with the stroke color of the corresponding pixel in the sketch stroke map to obtain the target scene color of that pixel, and rendering is performed based on the target scene colors to obtain the sketch-stylized scene.
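The text does not specify the blend operator behind "superposing"; a per-channel multiply is one plausible interpretation (white stroke texels leave the scene colour unchanged, dark texels darken it toward pencil graphite), sketched here under that assumption:

```python
def fuse_colors(scene_color, stroke_color):
    """Per-channel multiply blend of scene colour and stroke colour — an
    assumed interpretation of the superposition step, not confirmed by
    the source."""
    return tuple(s * t for s, t in zip(scene_color, stroke_color))
```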
In this embodiment, the contours of the scene element models are traced by edge detection and outline shading, and the sketch stroke maps are then tiled onto the scene element models through the material editor to fill the scene colors with a sketch style, realizing sketch stylization of the three-dimensional scene. The three-dimensional scene to be processed can thus be converted into a sketch style without redesigning or remodeling, which greatly improves scene rendering efficiency and enhances scene extensibility.
In the above or following embodiments, the strokes used on the surfaces of different objects in an actual sketch differ: for example, the strokes on the surface of a large object may be more widely spaced, while the strokes on the surface of a small object may be more closely spaced. Thus, different types of sketch stroke maps need to be used to render different types of objects, where the type of a sketch stroke map is characterized by at least one of: map size, map texture, and line thickness. In an optional embodiment, the following steps are also performed:
if different types of sketch stroke maps exist for different scene elements, the scene elements that use the different types of sketch stroke maps are filtered out using a custom depth stencil item (Custom Stencil); second mask maps matched with the different types of sketch stroke maps are generated according to the filtering result; and the different types of sketch stroke maps, together with the matched second mask maps, are used to perform the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines.
It should be noted that the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines is similar to step 103 described above and is not repeated here.
Therefore, through the above steps, different types of strokes can be applied to different types of scene elements, so that the sketch stroke effect of each scene element better matches the stroke effect of an actual sketch; this improves both the stroke effect and the scene rendering efficiency of the sketch-stylized scene.
In the above or following embodiments, some scene elements in the three-dimensional scene should not be covered by the sketch texture. Such scene elements therefore need to be filtered out, and a third mask map is used to prevent them from being covered by the sketch-style texture, so that they retain their original scene colors.
Specifically, the custom depth stencil item may be used to filter out target scene elements that do not use the sketch stroke map; a third mask map marking the target scene elements is generated according to the filtering result; and the third mask map is used when performing the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines, so that the initial scene colors of the target scene elements are retained in the sketch-stylized scene.
The above-mentioned scene elements include, for example, various scene icons, virtual characters, and other scene elements that need not be overlaid with the sketch texture.
For example, according to the marks corresponding to the various scene icons, the corresponding marks to be filtered are set in the custom depth stencil item, so that the scene icons are filtered out and a corresponding third mask map is generated. The scene icons thereby flexibly retain their initial scene colors, and rendering efficiency is improved.
For example, according to the scene depth of the virtual character in the three-dimensional scene, the corresponding scene depth is set in the custom depth stencil item, so that the virtual character is filtered out and a corresponding third mask map is generated to preserve the appearance of the virtual character in the three-dimensional scene.
Therefore, through the above steps, the initial scene colors of some scene elements are retained in the sketch-stylized scene, and the scene rendering efficiency of the sketch-stylized scene is improved.
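The stencil-based filtering described in the preceding paragraphs can be sketched as follows. This is a hypothetical Python illustration: real engines read the per-pixel values from a custom depth/stencil buffer, and all names here are assumptions.

```python
def build_exclusion_mask(stencil, excluded_values):
    """Build a screen-sized mask from per-pixel custom stencil values:
    0 marks pixels of target scene elements (e.g. icons, characters) that
    keep their initial scene color; 1 marks pixels where the sketch
    stroke map may be tiled."""
    return [[0 if v in excluded_values else 1 for v in row] for row in stencil]

def apply_mask(scene, sketched, mask):
    """Choose per pixel between the original scene color and the
    sketch-stylized color according to the mask."""
    return [[sk if m else sc for sc, sk, m in zip(srow, krow, mrow)]
            for srow, krow, mrow in zip(scene, sketched, mask)]
```

The same mechanism serves both the second mask maps (per stroke-map type) and the third mask map (elements excluded from sketching entirely); only the set of excluded stencil values differs.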
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 101 to 104 may be device a; for another example, the execution subject of steps 101 and 102 may be device a, and the execution subject of step 103 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, and as shown in fig. 5, the electronic device includes: memory 501, processor 502, communication component 503, and display component 504.
The memory 501 is used for storing computer programs and may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 501 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 502 is coupled to the memory 501 and executes the computer programs in the memory 501 to: for a three-dimensional scene to be processed, determine the positions of the delineation lines in each scene element model; perform delineation coloring on each scene element model according to the positions of the delineation lines to obtain a three-dimensional scene containing the delineation lines; tile a sketch stroke map into the three-dimensional scene containing the delineation lines through a material editor according to the normal direction of each scene element model; and, in the three-dimensional scene tiled with the sketch stroke map, fuse the scene colors with the stroke colors of the sketch stroke map to obtain a sketch-stylized scene.
Further optionally, the material editor comprises a material ball that acts in the post-processing stage.
Based on this, when tiling the sketch stroke map into the three-dimensional scene containing the delineation lines through the material editor according to the normal direction of each scene element model, the processor 502 is specifically configured to:
input the normal direction of each scene element model, the sketch stroke map, and the tiling position and area of the sketch stroke map in the three-dimensional scene containing the delineation lines into a world-aligned texture node added in the material ball; set, according to the tiling position of the sketch stroke map, the sketch stroke map into each scene element model of the three-dimensional scene containing the delineation lines through the world-aligned texture node; and adjust the orientation of the sketch stroke map in each scene element model through the world-aligned texture node according to the normal direction of each scene element model and the initial normal direction of the sketch stroke map.
Optionally, the orientation of the sketch stroke map in each scene element model at least comprises: the plane of the sketch stroke map being perpendicular to the normal direction of each scene element model.
Wherein, further optionally, the sketch stroke map is a seamlessly tileable (four-way continuous) map with a sketch stroke effect.
Further optionally, the delineation lines comprise contour lines representing the outer contour of the model and/or polylines representing the inner contour of the model.
When determining the positions of the delineation lines in each scene element model, the processor 502 is specifically configured to: acquire edge pixels on contour lines and/or polylines in each scene element model.
Further optionally, if the stroked line includes a contour line, when the processor 502 obtains an edge pixel on the contour line and/or the polyline in each scene element model, it is specifically configured to:
for a pixel to be detected in each scene element model, offset the texture coordinate of the current pixel to obtain texture coordinates of surrounding pixels, wherein the offset direction is the normal direction of the current pixel and the offset distance is the product of the current screen viewport size and an offset coefficient; acquire the depth difference between the current pixel and the surrounding pixels; and if the depth difference is larger than a set depth difference threshold, take the current pixel as an edge pixel on a contour line in the scene element model.
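A minimal Python sketch of this depth-based contour test follows; the names are hypothetical, and `depth_at` stands in for sampling the scene depth buffer at a texture coordinate.

```python
def is_contour_pixel(depth_at, uv, normal2d, viewport, coeff, depth_threshold):
    """Offset the current pixel's texture coordinate along its screen-space
    normal by viewport size times the offset coefficient, then compare
    depths: a large depth difference marks an outer-contour (silhouette)
    edge, where the surface drops away to the background."""
    du = normal2d[0] * viewport[0] * coeff
    dv = normal2d[1] * viewport[1] * coeff
    neighbor_uv = (uv[0] + du, uv[1] + dv)
    return abs(depth_at(uv) - depth_at(neighbor_uv)) > depth_threshold
```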
Further optionally, if the stroking line includes a polyline, when the processor 502 obtains an edge pixel on the contour line and/or the polyline in each scene element model, it is specifically configured to:
for a pixel to be detected in each scene element model, offset the texture coordinate of the current pixel to obtain texture coordinates of surrounding pixels, wherein the offset direction is the normal direction of the current pixel and the offset distance is the product of the current screen viewport size and an offset coefficient; compare the normal values of the current pixel and the surrounding pixels in the world coordinate system; and if the normal difference between the current pixel and the surrounding pixels is larger than a set normal difference threshold, take the current pixel as an edge pixel on a polyline in the scene element model.
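The corresponding normal-based test for inner contours can be sketched the same way; this is hypothetical Python, with `normal_at` standing in for sampling the world-space normal buffer.

```python
def is_crease_pixel(normal_at, uv, offset_uv, normal_threshold):
    """Compare the world-space normals of the current pixel and an offset
    neighbour; a large difference marks an inner-contour (crease) edge,
    e.g. where two faces of a box meet at a sharp angle."""
    n0 = normal_at(uv)
    n1 = normal_at((uv[0] + offset_uv[0], uv[1] + offset_uv[1]))
    diff = sum(abs(a - b) for a, b in zip(n0, n1))  # L1 normal difference
    return diff > normal_threshold
```

Together with the depth test above, this reproduces the two edge classes the method distinguishes: silhouettes (depth discontinuities) and creases (normal discontinuities).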
Further optionally, when the processor 502 performs the stroking and coloring on each scene element model according to the position of the stroking line to obtain the three-dimensional scene including the stroking line, the processor is specifically configured to:
generating a first mask map containing edge pixels in each scene element model according to the edge pixels on the delineation line in each scene element model; and performing linear interpolation between each edge pixel in the first mask map according to the scene color of each edge pixel to obtain a three-dimensional scene containing the delineation line.
Wherein, further optionally, the processor 502 is further configured to:
sampling the three-dimensional noise image according to the texture coordinates of each edge pixel to obtain a noise value corresponding to each edge pixel; and carrying out offset processing on the scene color of each edge pixel according to the noise value corresponding to each edge pixel to obtain a three-dimensional scene containing a discontinuous delineation line.
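One way to sketch this noise-driven break-up in Python follows; the hash-style noise function merely stands in for sampling the noise image referenced above, and all names are illustrative assumptions.

```python
import math

def noise2d(u, v):
    """Cheap hash-style value noise in [0, 1), standing in for sampling a
    noise texture at an edge pixel's texture coordinate."""
    h = math.sin(u * 127.1 + v * 311.7) * 43758.5453
    return h - math.floor(h)  # fractional part

def offset_edge_color(edge_rgb, scene_rgb, n):
    """Offset the edge pixel's color toward the underlying scene color by
    the noise value n in [0, 1); high-noise pixels fade out, turning a
    solid outline into a discontinuous, hand-drawn delineation line."""
    return tuple(e + (s - e) * n for e, s in zip(edge_rgb, scene_rgb))
```

Sampling the noise per edge pixel (e.g. `offset_edge_color(edge, scene, noise2d(u, v))`) varies the fade along the line, which is what produces the broken, sketched look.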
Optionally, when fusing the scene colors with the stroke colors of the sketch stroke map to obtain the sketch-stylized scene, the processor 502 is specifically configured to:
superimpose the scene color of each pixel with the stroke color of the corresponding pixel in the sketch stroke map to obtain the target scene color of each pixel; and render based on the target scene color of each pixel to obtain the sketch-stylized scene.
Wherein, further optionally, if there are different types of sketch stroke maps for different scene elements, the processor 502 is further configured to:
filter the scene elements using the different types of sketch stroke maps by adopting a custom depth stencil item; generate second mask maps matched with the different types of sketch stroke maps according to the filtering result; and use the different types of sketch stroke maps and the matched second mask maps to perform the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines.
Wherein, further optionally, the processor 502 is further configured to:
filter out, by adopting the custom depth stencil item, target scene elements that do not need to use the sketch stroke map; and generate a third mask map marking the target scene elements according to the filtering result, and use the third mask map when performing the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines, so as to keep the initial scene colors of the target scene elements in the sketch-stylized scene.
Further, as shown in fig. 5, the electronic device further includes: power component 505, audio component 506, and the like. Only some of the components are schematically shown in fig. 5, and it is not meant that the electronic device comprises only the components shown in fig. 5.
Wherein the communication component 503 is configured to facilitate communication between the device in which the communication component is located and other devices in a wired or wireless manner. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
Among other things, the display component 504 may be implemented as a display that includes a screen, and the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply unit 505 provides power to various components of the device in which the power supply unit is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In this embodiment, the outlines of the scene element models are drawn by performing edge detection and delineation coloring on the scene element models, and sketch stroke maps are then tiled onto the scene element models through a material editor to fill the scene colors with a sketch style, so that sketch stylization of the three-dimensional scene is realized. In the embodiments of the present application, the three-dimensional scene to be processed can be converted into the sketch style without redesign or remodeling, which greatly improves scene rendering efficiency and scene extensibility.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by the electronic device in the foregoing method embodiments when executed.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (14)

1. A sketch-style scene rendering method, the method comprising:
for a three-dimensional scene to be processed, determining the positions of the stroked lines in each scene element model;
performing delineation coloring on each scene element model according to the position of the delineation line to obtain a three-dimensional scene containing the delineation line;
tiling a sketch stroke map into the three-dimensional scene containing the delineation line through a material editor according to the normal direction of each scene element model; and
in the three-dimensional scene tiled with the sketch stroke map, fusing scene colors with stroke colors of the sketch stroke map to obtain a sketch-stylized scene.
2. The method of claim 1, wherein the material editor comprises a material ball that acts on a post-processing stage;
the tiling a sketch stroke map into the three-dimensional scene containing the delineation line through a material editor according to the normal direction of each scene element model comprises:
inputting the normal direction of each scene element model, the sketch stroke map, and the tiling position and area of the sketch stroke map in the three-dimensional scene containing the delineation line into a world-aligned texture node added in the material ball;
setting, according to the tiling position of the sketch stroke map, the sketch stroke map into each scene element model of the three-dimensional scene containing the delineation line through the world-aligned texture node; and
adjusting the orientation of the sketch stroke map in each scene element model through the world-aligned texture node according to the normal direction of each scene element model and the initial normal direction of the sketch stroke map.
3. The method of claim 2, wherein the orientation of the sketch stroke map in each scene element model comprises:
the plane of the sketch stroke map being perpendicular to the normal direction of each scene element model.
4. The method according to any one of claims 1 to 3, wherein the sketch stroke map is a seamlessly tileable (four-way continuous) map with a sketch stroke effect.
5. The method according to claim 1, wherein the stroked line comprises a contour line and/or a polyline, the contour line being a stroked line representing the outer contour of the model, and the polyline being a stroked line representing the inner contour of the model;
the determining the positions of the stroked lines in the scene element models comprises:
acquiring edge pixels on contour lines and/or polylines in each scene element model.
6. The method of claim 5, wherein if the stroked lines include contour lines, the obtaining edge pixels on the contour lines and/or polylines in each scene element model comprises:
for a pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain texture coordinates of surrounding pixels, wherein the offset direction is the normal direction of the current pixel, and the offset distance is the product of the current screen viewport size and an offset coefficient;
acquiring the depth difference between a current pixel and surrounding pixels;
and if the depth difference is larger than a set depth difference threshold value, taking the current pixel as an edge pixel on a contour line in the scene element model.
7. The method of claim 5, wherein if the stroked line includes a polyline, the obtaining edge pixels on the contour line and/or the polyline in each scene element model comprises:
for a pixel to be detected in each scene element model, offsetting the texture coordinate of the current pixel to obtain texture coordinates of surrounding pixels, wherein the offset direction is the normal direction of the current pixel, and the offset distance is the product of the current screen viewport size and an offset coefficient;
comparing respective normal values of the current pixel and the surrounding pixels in a world coordinate system;
and if the normal difference between the current pixel and the surrounding pixels is larger than a set normal difference threshold, taking the current pixel as an edge pixel on a polyline in the scene element model.
8. The method of claim 1, wherein said rendering each scene element model based on the position of the stroked line to obtain a three-dimensional scene including the stroked line comprises:
generating a first mask map containing edge pixels in each scene element model according to the edge pixels on the delineation line in each scene element model;
and performing linear interpolation between each edge pixel in the first mask map according to the scene color of each edge pixel to obtain a three-dimensional scene containing the delineation line.
9. The method of claim 8, further comprising:
sampling the three-dimensional noise image according to the texture coordinates of each edge pixel to obtain a noise value corresponding to each edge pixel;
and carrying out offset processing on the scene color of each edge pixel according to the noise value corresponding to each edge pixel to obtain a three-dimensional scene containing a discontinuous delineation line.
10. The method of claim 1, wherein the fusing scene colors with stroke colors of the sketch stroke map to obtain the sketch-stylized scene comprises:
superimposing the scene color of each pixel with the stroke color of the corresponding pixel in the sketch stroke map to obtain a target scene color of each pixel; and
rendering based on the target scene color of each pixel to obtain the sketch-stylized scene.
11. The method of claim 1, wherein if there are different types of sketch stroke maps for different scene elements, the method further comprises:
filtering the scene elements using the different types of sketch stroke maps by adopting a custom depth stencil item;
generating second mask maps matched with the different types of sketch stroke maps according to the filtering result; and
using the different types of sketch stroke maps and the matched second mask maps to perform the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation line.
12. The method of claim 1, further comprising:
filtering out, by adopting a custom depth stencil item, target scene elements that do not need to use the sketch stroke map; and
generating a third mask map marking the target scene elements according to the filtering result, and using the third mask map when performing the step of tiling the different types of sketch stroke maps into the three-dimensional scene containing the delineation lines, so as to keep the initial scene colors of the target scene elements in the sketch-stylized scene.
13. An electronic device, comprising: a memory and a processor;
the memory is to store one or more computer instructions;
the processor is to execute the one or more computer instructions to: performing the steps of the method of any one of claims 1-12.
14. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium;
the computer program is arranged to carry out the steps of the method of any one of claims 1 to 12 when executed.
CN202111058004.1A 2021-09-09 2021-09-09 Sketch style scene rendering method and device and storage medium Pending CN113935893A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111058004.1A CN113935893A (en) 2021-09-09 2021-09-09 Sketch style scene rendering method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111058004.1A CN113935893A (en) 2021-09-09 2021-09-09 Sketch style scene rendering method and device and storage medium

Publications (1)

Publication Number Publication Date
CN113935893A true CN113935893A (en) 2022-01-14

Family

ID=79275205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111058004.1A Pending CN113935893A (en) 2021-09-09 2021-09-09 Sketch style scene rendering method and device and storage medium

Country Status (1)

Country Link
CN (1) CN113935893A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274432A (en) * 2023-09-20 2023-12-22 书行科技(北京)有限公司 Method, device, equipment and readable storage medium for generating image edge special effect

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7061501B1 (en) * 2000-11-07 2006-06-13 Intel Corporation Rendering a pencil-sketch image
CN104778739A (en) * 2015-03-27 2015-07-15 浙江慧谷信息技术有限公司 Computer-based real-time sketch rendering algorithm
CN106910237A (en) * 2017-02-24 2017-06-30 盐城工学院 A kind of real-time sketch rendering system of computer and its algorithm
WO2020207202A1 (en) * 2019-04-11 2020-10-15 腾讯科技(深圳)有限公司 Shadow rendering method and apparatus, computer device and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7061501B1 (en) * 2000-11-07 2006-06-13 Intel Corporation Rendering a pencil-sketch image
CN104778739A (en) * 2015-03-27 2015-07-15 浙江慧谷信息技术有限公司 Computer-based real-time sketch rendering algorithm
CN106910237A (en) * 2017-02-24 2017-06-30 盐城工学院 A kind of real-time sketch rendering system of computer and its algorithm
WO2020207202A1 (en) * 2019-04-11 2020-10-15 腾讯科技(深圳)有限公司 Shadow rendering method and apparatus, computer device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
冯乐乐: "《Unity Shader入门精要》", 30 June 2016, 人民邮电出版社, pages: 7 - 15 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117274432A (en) * 2023-09-20 2023-12-22 书行科技(北京)有限公司 Method, device, equipment and readable storage medium for generating image edge special effect
CN117274432B (en) * 2023-09-20 2024-05-14 书行科技(北京)有限公司 Method, device, equipment and readable storage medium for generating image edge special effect

Similar Documents

Publication Publication Date Title
WO2018040511A1 (en) Method for implementing conversion of two-dimensional image to three-dimensional scene based on ar
AU2017235889B2 (en) Digitizing physical sculptures with a desired control mesh in 3d
US11961200B2 (en) Method and computer program product for producing 3 dimensional model data of a garment
Zollmann et al. Image-based ghostings for single layer occlusions in augmented reality
CN106997613B (en) 3D model generation from 2D images
CN107918549B (en) Marking method and device for three-dimensional expansion drawing, computer equipment and storage medium
US20130057540A1 (en) Methods and apparatus for digital stereo drawing
EP3533218B1 (en) Simulating depth of field
CN111583379B (en) Virtual model rendering method and device, storage medium and electronic equipment
CN107578367B (en) Method and device for generating stylized image
CN110033507B (en) Method, device and equipment for drawing internal trace of model map and readable storage medium
CN106447756B (en) Method and system for generating user-customized computer-generated animations
CN112184852A (en) Auxiliary drawing method and device based on virtual imaging, storage medium and electronic device
Zhang et al. NK-CDS: A creative design system for museum art derivatives
CN113935893A (en) Sketch style scene rendering method and device and storage medium
CN111402385B (en) Model processing method and device, electronic equipment and storage medium
CN113935891B (en) Pixel-style scene rendering method, device and storage medium
CN110378948B (en) 3D model reconstruction method and device and electronic equipment
Liu et al. Fog effect for photography using stereo vision
CN113935892A (en) Oil painting style scene rendering method and equipment and storage medium
CN113935894B (en) Ink and wash style scene rendering method and equipment and storage medium
CN113763546A (en) Card preview method and device and electronic equipment
CN108171784B (en) Rendering method and terminal
JP2001266176A (en) Picture processor, picture processing method and recording medium
CN112417061B (en) Three-dimensional dynamic plotting display method based on time drive

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination