CN113935894B - Ink and wash style scene rendering method and equipment and storage medium - Google Patents


Info

Publication number
CN113935894B
CN113935894B (application CN202111058018.3A)
Authority
CN
China
Prior art keywords: scene, pixel, light source, value, ink
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111058018.3A
Other languages
Chinese (zh)
Other versions
CN113935894A (en)
Inventor
国家玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202111058018.3A
Publication of CN113935894A
Application granted
Publication of CN113935894B
Legal status: Active


Classifications

    • G06T3/04
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Abstract

An embodiment of the present application provides a method, a device, and a storage medium for rendering a scene in an ink-wash style. In the method, for a three-dimensional scene to be processed, the positional relationship between the object surface orientation of each scene element in the three-dimensional scene and the light source direction is obtained; color gradient values of the scene elements are determined based on that positional relationship; and the scene tone of the three-dimensional scene is post-processed according to the color gradient values of the scene elements to obtain an ink-wash stylized scene. By deriving a grayscale color value for each scene element from the relationship between its surface orientation and the light source direction, the method renders a three-dimensional scene in an ink-wash style with black-white-gray tones. A three-dimensional scene to be processed can thus be converted to the ink-wash style without redesign or remodeling, which greatly improves scene rendering efficiency and the extensibility of game scenes.

Description

Ink and wash style scene rendering method and equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, and a storage medium for rendering a scene in an ink-wash style.
Background
Ink-wash painting is a common painting style. Scenes rendered in ink wash typically have a single color tone, such as a black-white-gray tone.
Taking game development as an example, a game usually contains many scenes. In the related art, changing a game's scenes from a realistic style to an ink-wash style requires redrawing and remodeling every scene, which greatly reduces development efficiency and leaves existing game scenes with poor extensibility. A new solution is therefore needed to overcome these technical problems.
Disclosure of Invention
Aspects of the present application provide a method, a device, and a storage medium for rendering scenes in an ink-wash style, so as to improve scene rendering efficiency and the extensibility of game scenes.
An embodiment of the present application further provides an ink-wash style scene rendering method for a game that includes a player-controlled virtual character; the method comprises the following steps:
for a three-dimensional scene to be processed, acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction;
determining color gradient values of the scene elements in the three-dimensional scene based on the position relations;
and carrying out post-processing on the scene tone of the three-dimensional scene according to the color gradient value of each scene element in the three-dimensional scene to obtain the ink and wash stylized scene.
Wherein, further optionally, the object surface orientation of each scene element comprises normal direction information of each scene element.
The acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction includes: determining a light source direction set for a three-dimensional scene; determining the normal direction of each pixel in each scene element under a world coordinate system from the normal direction information; and performing dot multiplication on the light source direction and the normal direction of each pixel to obtain an included angle between the normal direction of each pixel and the light source direction.
Wherein, further optionally, the determining the color gradient value of each scene element in the three-dimensional scene based on the position relationship comprises:
converting an included angle between the normal direction of each pixel in each scene element and the light source direction into a brightness gradient value of each pixel; taking the brightness gradient value of each pixel as a preset axis coordinate value in the texture coordinate of each pixel to obtain the texture coordinate value of each pixel; and sampling the gray map according to the texture coordinate value of each pixel to obtain the gray color value of each pixel in the three-dimensional scene.
Optionally, after sampling the grayscale map according to the texture coordinate value of each pixel, the method further includes: and multiplying the gray color value of each pixel in the three-dimensional scene by the scene color of each pixel in the scene map to obtain the optimized gray color value of each pixel in the three-dimensional scene.
Optionally, if the preset axis coordinate value is a horizontal axis coordinate value in the texture coordinate, the black, white and gray color bands in the gray map are arranged in a horizontal direction from deep to light.
Further optionally, the method further comprises: determining whether a self-defined template item of each scene element in the scene map is a preset value; and taking the scene element with the custom template item as a preset value as a specific scene element, and setting the scene tone of the specific scene element in the ink and wash stylized scene as a preset tone so as to enable the tone of the specific scene element in the ink and wash stylized scene to be different from the tones of other scene elements.
Wherein, further optionally, the preset hue comprises: bright side color values facing the light source direction and dark side color values facing away from the light source direction.
The setting of the scene tone of the specific scene element in the ink-wash stylized scene to the preset tone includes: judging whether each pixel in the specific scene element faces the light source direction; if the current pixel faces the light source direction, setting the color value of the current pixel in the ink-wash stylized scene to the bright-side color value; or, if the current pixel faces away from the light source direction, setting the color value of the current pixel in the ink-wash stylized scene to the dark-side color value.
Further optionally, the method further comprises: and controlling the moving track of the light source through the blueprint component so that the direction of the light source is within a preset range relative to the virtual camera.
Further optionally, before determining the color gradient value of each scene element in the three-dimensional scene based on the positional relationship, the method further includes: obtaining the distance between each scene element and the virtual camera; and, if the distance of the current scene element is greater than a set distance threshold, determining that the current scene element is the sky sphere and setting the sky sphere to a preset color value.
An embodiment of the present application further provides an electronic device, including: a memory and a processor; the memory is to store one or more computer instructions; the processor is to execute the one or more computer instructions to: the steps in the method provided by the embodiments of the present application are performed.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program can implement the steps in the method provided in the embodiments of the present application when executed.
In the technical solution provided by the embodiments of the present application, for a three-dimensional scene to be processed, the positional relationship between the object surface orientation of each scene element in the three-dimensional scene and the light source direction is first obtained. Then, based on that positional relationship, the color information of the scene elements is filtered out while their light-dark levels are preserved, yielding a color gradient value for each scene element. Because these color gradient values capture the light-dark visual characteristics of the scene elements, post-processing the scene tone of the three-dimensional scene according to them produces an ink-wash stylized scene.
In the embodiments of the present application, the grayscale color values of the scene elements in the three-dimensional scene can be extracted through the positional relationship between the object surface orientation of each scene element and the light source direction, and a three-dimensional scene in an ink-wash style with black-white-gray tones can then be rendered from those grayscale color values, realizing the ink-wash stylization of the three-dimensional scene. The three-dimensional scene to be processed can be converted to the ink-wash style without redesign or remodeling, which greatly improves scene rendering efficiency and the extensibility of game scenes.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of a method for rendering a scene in an ink-wash style according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a method for rendering a scene in an ink-wash style according to an exemplary embodiment of the present application;
fig. 3 is a schematic flowchart of a method for rendering a scene in an ink and wash style according to another exemplary embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a method for rendering a scene in an ink-wash style according to an exemplary embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Ink-wash painting is a common painting style. Scenes rendered in ink wash typically have a single color tone, such as a black-white-gray tone.
Taking game development as an example, a game usually contains many scenes. In the related art, changing a game's scenes from a realistic style to an ink-wash style requires redrawing and remodeling every scene, which greatly reduces development efficiency and leaves existing game scenes with poor extensibility.
In view of the above technical problems, in some embodiments of the present application, a solution is provided, and the technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
An embodiment of the present application provides a method for rendering a scene in an ink and wash style, and fig. 1 is a schematic flowchart of the method for rendering a scene in an ink and wash style according to an exemplary embodiment of the present application. As shown in fig. 1, the method includes:
101. for a three-dimensional scene to be processed, acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction;
102. determining color gradient values of the scene elements in the three-dimensional scene based on the position relations;
103. and carrying out post-processing on the scene tone of the three-dimensional scene according to the color gradient value of each scene element in the three-dimensional scene to obtain the ink and wash stylized scene.
In the embodiment of the application, the three-dimensional scene can be a three-dimensional virtual scene in the fields of games, movies, online travel and the like. The three-dimensional scene includes one or more scene elements. Scene elements may be implemented as virtual characters, scene props, buildings, natural scenery, etc. in a game scene. In this embodiment of the present application, the scene element may be preset according to a game scenario, or may be randomly generated in the game, which is not limited in this embodiment of the present application.
The three-dimensional scene to be processed is a three-dimensional scene that needs scene rendering. For example, assuming the three-dimensional scene to be processed is a realistic-style game scene, an ink-wash style game scene is obtained after post-processing. The two game scenes before and after post-processing have the same scene elements, scene-element layout, and scene-element attributes, but different scene styles. Specifically, the scene style is embodied in artistic features of the three-dimensional scene such as hue, lines, and shading.
For a game scene, different scene styles can be understood as different artistic styles. For example, in a realistic-style game scene, the scene hue and the light-dark relationships of the scene elements are close to the real world, and the scene elements are not outlined with distinct lines. In an ink-wash stylized scene, the scene tone is a single tone, such as a black-white-gray tone or a yellow-white tone, and the light-dark relationships of the scene elements are expressed through the depth of that single color.
Each step in the method for rendering an ink-wash style scene provided in fig. 1 is described below with reference to a specific embodiment.
In 101, for a three-dimensional scene to be processed, the position relation between the object surface orientation and the light source direction of each scene element in the three-dimensional scene is obtained.
In the above step, it is assumed that the object surface orientation of each scene element includes normal direction information of each scene element. Based on this, in 101, an alternative embodiment of acquiring the position relationship between the object surface orientation and the light source direction of each scene element in the three-dimensional scene may be implemented as:
determining a light source direction set for a three-dimensional scene; determining the normal direction of each pixel in each scene element under a world coordinate system from the normal direction information; and performing dot multiplication on the light source direction and the normal direction of each pixel to obtain an included angle between the normal direction of each pixel and the light source direction.
In the above step, taking a game scene as an example, in the post-processing flow a Light Vector item preset for the three-dimensional scene is read to determine the light source direction set for the scene, that is, the Light Vector data in the post-processing flow. Further, the normal vector direction of each pixel of each scene element under the world-space coordinate system is obtained from the scene map (Scene Texture). Finally, the light source direction in the Light Vector data is dot-multiplied with the normal vector direction of each pixel to obtain the angle between the normal vector direction of each pixel and the light source direction. The angle is expressed as a cosine value ranging from -1 to 1, where -1 indicates that the angle between the normal direction and the light source direction is 180 degrees, and 1 indicates that the angle is 0 degrees.
In practical application, the angle between the normal vector direction of each pixel and the light source direction can represent the relationship between the surface orientation of the object and the light source direction.
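The dot-product test described above can be sketched in a few lines of Python. This is an illustrative sketch of the per-pixel shader arithmetic, not engine code; the function name and the example vectors are hypothetical:

```python
def light_normal_cosine(light_dir, normal):
    """Dot product of two unit vectors equals the cosine of the angle between them."""
    lx, ly, lz = light_dir
    nx, ny, nz = normal
    return lx * nx + ly * ny + lz * nz

# A pixel whose normal points straight at the light source: cos 0 degrees = 1.0
facing = light_normal_cosine((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# A pixel whose normal points directly away from it: cos 180 degrees = -1.0
away = light_normal_cosine((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
```

In the engine this computation runs per pixel on the GPU; the Python form only illustrates the arithmetic.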
In an alternative embodiment, prior to 101, the light source directions are custom set for a three dimensional scene. In practical applications, the Light source direction can be defined by setting the Light Vector item in the post-processing flow.
Specifically, the light source direction may be adjusted according to the position relationship between the viewing angle of the player and the three-dimensional scene, that is, according to the shooting angle of the virtual camera in the three-dimensional scene. For example, the light source direction may be set to be toward the player, so that the shadow orientation is on the player side. In short, the light source direction is set to point from the player to the three-dimensional scene, so that the shadow orientation of the light source is set to the side facing the player, and thus a better picture effect can be obtained.
Further optionally, the moving track of the light source is controlled by the blueprint component, so that the direction of the light source is within a preset range relative to the virtual camera. Therefore, the shadow direction can be flexibly adjusted according to different picture requirements through the self-defined light source direction, the picture contrast is strengthened, and the scene rendering effect is improved.
Further, after acquiring the positional relationship between the object surface orientation of each scene element and the light source direction, 102, the color gradient value of each scene element in the three-dimensional scene is determined based on the positional relationship.
In the above steps, an optional embodiment of determining the color gradient value of each scene element in the three-dimensional scene based on the position relationship may be implemented as follows:
converting an included angle between the normal direction of each pixel in each scene element and the light source direction into a brightness gradient value of each pixel; taking the brightness gradient value of each pixel as a preset axis coordinate value in the texture coordinate of each pixel to obtain the texture coordinate value of each pixel; and sampling the gray map according to the texture coordinate value of each pixel to obtain the gray color value of each pixel in the three-dimensional scene.
In the above steps, the scene tone is assumed to be a black-white gray tone. The gray map is assumed to be the gray map shown in fig. 2. In fig. 2, black, white and gray color bands are arranged in the horizontal direction in order from dark to light. The preset axis coordinate value is assumed to be a horizontal axis coordinate value in the texture coordinate.
Based on the above assumptions, the angle between the normal direction of each pixel in each scene element and the light source direction is converted into a brightness gradient value (Gradient) of that pixel through a remapping operation; the resulting range is, for example, 0 to 1. In practice, the remapping can be implemented as: brightness gradient value = (X + 1) × 0.5, where X is the cosine of the angle.
Further, based on the arrangement direction of the color bars shown in fig. 2, the luminance gradient value of each pixel is used as a horizontal coordinate value of the Texture coordinate (UV) of each pixel in the Texture map (Sample Texture), and the Texture coordinate value of each pixel is obtained. In this way, the texture coordinate value of each pixel is sampled in the grayscale map, so that the grayscale color value of each pixel in the three-dimensional scene is output.
In another embodiment, if the predetermined axis coordinate value is a longitudinal axis coordinate value in the texture coordinate, the black, white and gray stripes in the gray map are arranged longitudinally in the order from dark to light. The specific sampling steps are similar to the above, with the difference that: and taking the brightness gradient value of each pixel as a vertical coordinate value of a texture coordinate (UV) of each pixel in the texture map to obtain the texture coordinate value of each pixel.
In this way, the gray color values of the scene elements in the three-dimensional scene from the bright surface to the dark surface can be obtained through the steps. In practical application, the Contrast can be adjusted by using the Contrast parameter, so that the picture effect is improved.
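The remap-and-sample steps above can be sketched in Python. This is an illustrative sketch: the five-texel ramp and the function names are hypothetical stand-ins for the horizontal grayscale map of fig. 2:

```python
def cosine_to_gradient(cos_angle):
    """Remap a cosine in [-1, 1] to a brightness gradient value in [0, 1]: (X + 1) * 0.5."""
    return (cos_angle + 1.0) * 0.5

def sample_gray_ramp(gradient, ramp):
    """Use the gradient value as the horizontal texture coordinate into a dark-to-light band."""
    index = min(int(gradient * len(ramp)), len(ramp) - 1)
    return ramp[index]

ramp = [0.0, 0.25, 0.5, 0.75, 1.0]                        # hypothetical 5-texel black-to-white band
dark = sample_gray_ramp(cosine_to_gradient(-1.0), ramp)   # back-facing pixel samples black
lit = sample_gray_ramp(cosine_to_gradient(1.0), ramp)     # light-facing pixel samples white
```

A real grayscale map has many more texels, so the bright-to-dark transition is smooth rather than banded.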
Optionally, after sampling the grayscale map according to the texture coordinate value of each pixel, the grayscale color value of each pixel in the three-dimensional scene is multiplied by the scene color of that pixel in the scene map to obtain an optimized grayscale color value for each pixel. Alternatively, a Desaturation node may be used to reduce the saturation of each pixel to a preset value, so that the scene color loses its original hue and only the details and grayscale information of the scene elements are retained.
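A minimal Python sketch of the two options just described. The luminance weights below are the conventional Rec. 601-style values that desaturation nodes typically use; they are an assumption, not taken from the patent:

```python
def desaturate(rgb, amount=1.0):
    """Blend each channel toward the pixel's luminance; amount=1.0 removes all saturation."""
    r, g, b = rgb
    luma = 0.3 * r + 0.59 * g + 0.11 * b   # assumed Rec. 601-style luminance weights
    return tuple(c + (luma - c) * amount for c in rgb)

def optimized_gray(gray_value, scene_rgb):
    """Multiply the sampled grayscale value into the desaturated scene color."""
    return tuple(gray_value * c for c in desaturate(scene_rgb))
```

With full desaturation a pure red pixel collapses to its luminance, so only brightness detail survives into the ink-wash tone.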
In addition, because the sky sphere has a large depth and is far from the virtual camera, it easily turns black or white during the gradient calculation. To avoid rendering the sky sphere with an abnormal color value in the three-dimensional scene, optionally, before 102, the distance of each scene element from the virtual camera is obtained. If the distance of the current scene element is greater than a set distance threshold, the current scene element is determined to be the sky sphere and is set to a preset color value. For example, pixels belonging to the sky sphere are filtered by a depth threshold: with the threshold set to 50000, a pixel whose depth exceeds the threshold is determined to belong to the sky sphere.
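The depth-based sky filter can be sketched as follows. The threshold of 50000 comes from the text; the near-white preset value and the function name are hypothetical choices:

```python
SKY_DEPTH_THRESHOLD = 50000.0   # depth cutoff given in the text
SKY_PRESET_GRAY = 0.95          # hypothetical preset color value for the sky sphere

def resolve_pixel_gray(scene_depth, gradient_gray):
    """Pixels beyond the depth threshold belong to the sky sphere and get the preset value."""
    if scene_depth > SKY_DEPTH_THRESHOLD:
        return SKY_PRESET_GRAY
    return gradient_gray
```

This keeps the distant sky out of the bright-dark gradient computation entirely, so it can never flip to pure black or pure white.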
Finally, in 103, post-processing is performed on the scene tone of the three-dimensional scene according to the color gradient value of each scene element in the three-dimensional scene, so as to obtain the ink and wash stylized scene. Note that the ink-wash stylized scene obtained through post-processing is still a three-dimensional scene.
Specifically, in 103, the color gradient values of the scene elements in the three-dimensional scene are passed, through a scene texture map, to a post-processing (PostProcess Volume) component in the three-dimensional scene, so that a three-dimensional scene of some non-ink style (such as a realistic style) is processed by the post-processing component into an ink-wash stylized scene.
In this embodiment, the grayscale color value of each scene element in the three-dimensional scene can be extracted through the positional relationship between the object surface orientation of each scene element and the light source direction, and a three-dimensional scene in an ink-wash style with black-white-gray tones can then be rendered from those grayscale color values, realizing the ink-wash stylization of the three-dimensional scene. The three-dimensional scene to be processed can be converted to the ink-wash style without redesign or remodeling, which greatly improves scene rendering efficiency and the extensibility of game scenes.
The embodiment of the application further provides another ink and wash style scene rendering method. Fig. 3 is a flowchart illustrating a method for rendering a scene in an ink-wash style according to an exemplary embodiment of the present application. As shown in fig. 3, the method includes:
301. for a three-dimensional scene to be processed, acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction;
302. determining color gradient values of the scene elements in the three-dimensional scene based on the position relation;
303. selecting a specific scene element from the three-dimensional scene and assigning the specific scene element a preset tone;
304. and carrying out post-processing on the scene tone of the three-dimensional scene according to the color gradient value of each scene element in the three-dimensional scene and the preset tone of the specific scene element to obtain the ink and wash stylized scene.
It should be noted that steps 301, 302, and 304 in the scene rendering method shown in fig. 3 are similar to steps 101 to 103 in the scene rendering method shown in fig. 1, and are not repeated here for brevity.
To enrich the visual effect of the three-dimensional scene, certain scene elements may also be assigned distinctive tones that stand out from the ink-wash style. Specifically, in 303, a specific scene element is selected from the three-dimensional scene and assigned a preset tone.
In the foregoing or following embodiments, optionally, it is determined whether a custom template item (Custom Stencil) of each scene element in the scene map is a preset value; the scene elements whose custom template item equals the preset value are taken as specific scene elements, and the scene tone of the specific scene elements in the ink-wash stylized scene is set to a preset tone, so that the tone of the specific scene elements differs from the tones of the other scene elements in the ink-wash stylized scene.
Wherein the preset tone set for the specific scene element includes: bright surface color values facing the light source direction and dark surface color values facing away from the light source direction.
Specifically, it is judged whether the custom template item of each scene element in the scene map is a preset value, and the scene elements whose custom template item equals the preset value are selected as the specific scene elements. For example, plant scene elements such as leaves and grass in the three-dimensional scene are color-filtered through the custom template item in the scene map, so that they are treated as specific scene elements and assigned an intentionally designed tone. Specifically, assuming the plant scene elements are given a preset mark value in the custom template item, it is judged from the scene material map whether the custom template item equals the preset mark value, and the plant scene elements whose custom template item equals the preset mark value are selected as the specific scene elements. A mask map is then generated based on the selected plant scene elements.
Then, it is judged whether each pixel in the specific scene element faces the light source direction. If the current pixel faces the light source direction, it lies on a bright side, and its color value in the ink-wash stylized scene is set to the bright-side color value. If the current pixel faces away from the light source direction, it lies on a dark side, and its color value in the ink-wash stylized scene is set to the dark-side color value. For example, based on the mask map generated above, each pixel of a plant scene element is tested against the light source direction and set to a bright color value (e.g., bright yellow) or a dark color value (e.g., dark yellow) accordingly. The specific scene elements are, for example, plant scene elements such as leaves and grass in the game scene shown in fig. 4.
In practical applications, interpolation from the bright surface color value to the dark surface color value within a specific scene element can be realized through a linear interpolation node (Lerp), so that the two hues are blended within the specific scene element and its visual effect is improved.
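The bright/dark surface selection and the Lerp blend described above can be sketched as follows. The color values and the remapping of N·L from [-1, 1] to a blend factor in [0, 1] are illustrative assumptions, not the engine's actual material node graph.

```python
# Illustrative sketch: pick between bright and dark surface colors from
# N·L and blend them with a linear interpolation (Lerp).
BRIGHT = (0.95, 0.85, 0.30)  # e.g. bright yellow (assumed value)
DARK   = (0.45, 0.38, 0.10)  # e.g. dark yellow (assumed value)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lerp(a, b, t):
    """Linear interpolation between colors a and b, t in [0, 1]."""
    return tuple(x * (1.0 - t) + y * t for x, y in zip(a, b))

def shade_pixel(normal, light_dir):
    """Blend dark -> bright according to how directly the pixel's
    normal faces the light source (both assumed unit vectors)."""
    n_dot_l = dot(normal, light_dir)
    # Remap N.L from [-1, 1] to [0, 1]: facing the light -> bright,
    # facing away -> dark, with a smooth transition in between.
    t = max(0.0, min(1.0, n_dot_l * 0.5 + 0.5))
    return lerp(DARK, BRIGHT, t)

# A pixel facing the light directly receives the bright surface color:
assert shade_pixel((0, 0, 1), (0, 0, 1)) == BRIGHT
```

A pixel exactly perpendicular to the light lands halfway between the two hues, which is what produces the fused transition the paragraph describes.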
In this embodiment, through the above steps, not only can the ink and wash stylization of the three-dimensional scene be realized, but specific scene elements can also be rendered in a preset hue, so that the hue of the specific scene elements in the ink and wash stylized scene differs from that of the other scene elements, which greatly enriches the visual effect of the three-dimensional scene.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 101 to 104 may be device a; for another example, the execution subject of steps 101 and 102 may be device a, and the execution subject of step 103 may be device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include a plurality of operations in a specific order, but it should be clearly understood that these operations may be executed out of the order presented herein or in parallel. Sequence numbers such as 101 and 102 are merely used to distinguish different operations and do not by themselves represent any execution order. Further, the flows may include more or fewer operations, which may be executed sequentially or in parallel. It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, and the like; they neither represent a sequential order nor require that the "first" and "second" items be of different types.
Fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, and as shown in fig. 5, the electronic device includes: memory 501, processor 502, communication component 503, and display component 504.
The memory 501 is used for storing computer programs and may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 501 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 502, coupled to the memory 501, for executing computer programs in the memory 501 for: for a three-dimensional scene to be processed, acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction; determining color gradient values of the scene elements in the three-dimensional scene based on the position relation; and performing post-processing on the scene tone of the three-dimensional scene according to the color gradient value of each scene element in the three-dimensional scene to obtain the ink and wash stylized scene.
Further optionally, the object surface orientation of each scene element comprises normal direction information of each scene element.
When acquiring the position relationship between the object surface orientation of each scene element in the three-dimensional scene and the light source direction, the processor 502 is specifically configured to:
determining a light source direction set for a three-dimensional scene;
determining the normal direction of each pixel in each scene element under a world coordinate system from the normal direction information;
and performing dot multiplication on the light source direction and the normal direction of each pixel to obtain an included angle between the normal direction of each pixel and the light source direction.
Further optionally, when the processor 502 determines the color gradient value of each scene element in the three-dimensional scene based on the position relationship, it is specifically configured to:
converting an included angle between the normal direction of each pixel in each scene element and the light source direction into a brightness gradient value of each pixel;
taking the brightness gradient value of each pixel as a preset axis coordinate value in the texture coordinate of each pixel to obtain the texture coordinate value of each pixel;
and sampling the gray map according to the texture coordinate value of each pixel to obtain the gray color value of each pixel in the three-dimensional scene.
Wherein, further optionally, the processor 502, after sampling the grayscale map according to the texture coordinate value of each pixel, is further configured to:
and multiplying the gray color value of each pixel in the three-dimensional scene by the scene color of each pixel in the scene map to obtain the optimized gray color value of each pixel in the three-dimensional scene.
Optionally, if the preset axis coordinate value is the horizontal axis coordinate value in the texture coordinates, the black, white and gray color bands in the gray map are arranged in the horizontal direction from dark to light.
Further optionally, the processor 502 is further configured to:
determining whether the custom template item of each scene element in the scene map is a preset value;
and taking the scene element with the custom template item as a preset value as a specific scene element, and setting the scene tone of the specific scene element in the ink and wash stylized scene as a preset tone so as to enable the tone of the specific scene element in the ink and wash stylized scene to be different from the tones of other scene elements.
Wherein, further optionally, the preset hue comprises: bright surface color values facing the light source direction and dark surface color values facing away from the light source direction.
The processor 502 is specifically configured to, when setting the scene tone of the specific scene element in the ink and wash stylized scene as a preset tone:
judging whether each pixel in the specific scene element faces to the light source direction;
if the current pixel is judged to face the light source direction, setting the color value of the current pixel in the ink and wash stylized scene as a bright surface color value; or alternatively
And if the current pixel is judged to face away from the light source direction, setting the color value of the current pixel in the ink and wash stylized scene as the dark surface color value.
Further optionally, the processor 502 is further configured to: and controlling the moving track of the light source through the blueprint component so that the direction of the light source is within a preset range relative to the virtual camera.
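The constraint above (keeping the light source direction within a preset range relative to the virtual camera) can be sketched as an angular check; this is an assumption-laden illustration, not the Blueprint component's actual interface, and `MAX_ANGLE` is a hypothetical preset.

```python
import math

# Illustrative sketch (not the engine's Blueprint API): test whether the
# light direction stays inside a preset angular range around the camera.
MAX_ANGLE = math.radians(60)  # hypothetical preset range

def angle_between(a, b):
    la = math.sqrt(sum(x * x for x in a))
    lb = math.sqrt(sum(x * x for x in b))
    cos_t = sum(x * y for x, y in zip(a, b)) / (la * lb)
    return math.acos(max(-1.0, min(1.0, cos_t)))

def within_preset_range(light_dir, camera_dir, max_angle=MAX_ANGLE):
    """True when the light source direction lies inside the preset
    angular range around the virtual camera's view direction."""
    return angle_between(light_dir, camera_dir) <= max_angle

assert within_preset_range((0, 0, 1), (0, 0, 1))
assert not within_preset_range((1, 0, 0), (0, 0, 1))
```

The Blueprint component would then steer the light's moving track so this predicate stays true each frame.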
Further optionally, the processor 502, before determining the color gradient values of the respective scene elements in the three-dimensional scene based on the positional relationship, is further configured to:
acquiring respective corresponding distances between each scene element and the virtual camera;
if the distance corresponding to the current scene element is greater than a set distance threshold, determining that the current scene element is the sky sphere, and setting the sky sphere to a preset color value.
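The distance-based classification above can be sketched as follows; the threshold and preset color are illustrative placeholders, since the embodiment does not fix their values.

```python
import math

# Sketch: treat a scene element as the sky sphere when its distance from
# the virtual camera exceeds a threshold; the sky sphere then receives a
# fixed preset color. Threshold and color values are illustrative.
DISTANCE_THRESHOLD = 10000.0    # hypothetical engine units
SKY_COLOR = (0.92, 0.92, 0.95)  # hypothetical preset color value

def classify(element_pos, camera_pos, threshold=DISTANCE_THRESHOLD):
    """Label an element 'sky' or 'scene' from its camera distance."""
    dist = math.dist(element_pos, camera_pos)
    return "sky" if dist > threshold else "scene"

assert classify((0, 0, 20000), (0, 0, 0)) == "sky"
assert classify((0, 0, 50), (0, 0, 0)) == "scene"
```

Elements labeled "sky" skip the gray-map shading entirely and are flood-filled with `SKY_COLOR`.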
Further, as shown in fig. 5, the electronic device further includes: a power component 505, an audio component 506, and the like. Only some components are schematically shown in fig. 5, which does not mean that the electronic device includes only the components shown in fig. 5.
Wherein the communication component 503 is configured to facilitate communication between the device in which the communication component is located and other devices in a wired or wireless manner. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
Among other things, the display component 504 may be implemented as a display that includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply unit 505 provides power to various components of the device in which the power supply unit is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In this embodiment, through the positional relationship between the object surface orientation of each scene element and the light source direction, the gray color value of each scene element in the three-dimensional scene can be extracted, and the three-dimensional scene can then be rendered according to the gray color values as an ink-wash-style scene in black, white and gray tones, thereby realizing the ink and wash stylization of the three-dimensional scene. In the embodiment of the present application, the three-dimensional scene to be processed can be converted into the ink and wash style without redesigning or remodeling, which greatly improves scene rendering efficiency and the expandability of the game scene.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program which, when executed, implements the steps executable by the electronic device in the foregoing method embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer readable media does not include transitory computer readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (9)

1. A method of ink-wash style scene rendering, the method comprising:
for a three-dimensional scene to be processed, acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction;
determining color gradient values of the scene elements in the three-dimensional scene based on the position relations;
according to the color gradient value of each scene element in the three-dimensional scene, carrying out post-processing on the scene tone of the three-dimensional scene to obtain an ink and wash stylized scene;
the object surface orientation of each scene element comprises normal direction information of each scene element;
the acquiring the position relation between the object surface orientation of each scene element in the three-dimensional scene and the light source direction includes:
determining a light source direction set for a three-dimensional scene;
determining the normal direction of each pixel in each scene element under a world coordinate system from the normal direction information;
performing dot multiplication on the light source direction and the normal direction of each pixel to obtain an included angle between the normal direction of each pixel and the light source direction;
the determining the color gradient value of each scene element in the three-dimensional scene based on the position relation comprises:
converting an included angle between the normal direction of each pixel in each scene element and the light source direction into a brightness gradient value of each pixel;
taking the brightness gradient value of each pixel as a preset axis coordinate value in the texture coordinate of each pixel to obtain the texture coordinate value of each pixel;
and sampling the gray map according to the texture coordinate value of each pixel to obtain the gray color value of each pixel in the three-dimensional scene.
2. The method of claim 1, wherein after sampling the grayscale map according to the texture coordinate value of each pixel, further comprising:
and multiplying the gray color value of each pixel in the three-dimensional scene by the scene color of each pixel in the scene map to obtain the optimized gray color value of each pixel in the three-dimensional scene.
3. The method according to claim 1, wherein if the predetermined axis coordinate value is a horizontal axis coordinate value in texture coordinates, the black, white and gray color bands in the gray map are arranged in a horizontal direction from dark to light.
4. The method of claim 1, further comprising:
determining whether the custom template item of each scene element in the scene map is a preset value;
and taking the scene element with the custom template item as a preset value as a specific scene element, and setting the scene tone of the specific scene element in the ink and wash stylized scene as a preset tone so as to enable the tone of the specific scene element in the ink and wash stylized scene to be different from the tones of other scene elements.
5. The method of claim 4, wherein the preset hue comprises: bright surface color values facing the light source direction and dark surface color values facing away from the light source direction;
the setting the scene tone of the specific scene element in the ink and wash stylized scene to the preset tone comprises:
judging whether each pixel in the specific scene element faces to a light source direction;
if the current pixel is judged to face the light source direction, setting the color value of the current pixel in the ink and wash stylized scene as a bright surface color value; or
And if the current pixel is judged to face away from the light source direction, setting the color value of the current pixel in the ink and wash stylized scene as the dark surface color value.
6. The method of claim 1, further comprising:
and controlling the moving track of the light source through the blueprint component so that the direction of the light source is within a preset range relative to the virtual camera.
7. The method of claim 1, wherein determining the color gradient value of each scene element in the three-dimensional scene based on the positional relationship further comprises:
acquiring respective corresponding distances between each scene element and the virtual camera;
if the distance corresponding to the current scene element is greater than the set distance threshold, determining that the current scene element is the sky sphere, and setting the sky sphere to a preset color value.
8. An electronic device, comprising: a memory and a processor;
the memory is to store one or more computer instructions;
the processor is to execute the one or more computer instructions to: performing the steps in the method of any one of claims 1-7.
9. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium;
the computer program is arranged to carry out the steps of the method of any one of claims 1 to 7 when executed.
CN202111058018.3A 2021-09-09 2021-09-09 Ink and wash style scene rendering method and equipment and storage medium Active CN113935894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111058018.3A CN113935894B (en) 2021-09-09 2021-09-09 Ink and wash style scene rendering method and equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113935894A CN113935894A (en) 2022-01-14
CN113935894B true CN113935894B (en) 2022-08-26

Family

ID=79275212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111058018.3A Active CN113935894B (en) 2021-09-09 2021-09-09 Ink and wash style scene rendering method and equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113935894B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112652046A (en) * 2020-12-18 2021-04-13 完美世界(重庆)互动科技有限公司 Game picture generation method, device, equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016168378A1 (en) * 2015-04-13 2016-10-20 Gerard Dirk Smits Machine vision for ego-motion, segmenting, and classifying objects
CN105719327B (en) * 2016-02-29 2018-09-07 北京中邮云天科技有限公司 A kind of artistic style image processing method
CN109993822B (en) * 2019-04-10 2023-02-21 创新先进技术有限公司 Ink and wash style rendering method and device
CN110660112B (en) * 2019-09-29 2021-09-24 浙江大学 Drawing spectrum reconstruction method based on special color card and multispectral imaging
CN112070873B (en) * 2020-08-26 2021-08-20 完美世界(北京)软件科技发展有限公司 Model rendering method and device
CN113192168A (en) * 2021-04-01 2021-07-30 广州三七互娱科技有限公司 Game scene rendering method and device and electronic equipment
CN113177878B (en) * 2021-04-28 2023-09-22 广州光锥元信息科技有限公司 Method and device for realizing American cartoon style filter effect based on image transformation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant