CN113144611A - Scene rendering method and device, computer storage medium and electronic equipment - Google Patents


Info

Publication number
CN113144611A
CN113144611A (application CN202110281256.4A)
Authority
CN
China
Prior art keywords
illumination
light source
map
coordinate space
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110281256.4A
Other languages
Chinese (zh)
Inventor
李沉思
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110281256.4A
Publication of CN113144611A
Legal status: Pending


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • G06T15/506: Illumination models
    • G06T15/55: Radiosity

Abstract

The disclosure relates to the technical field of scene rendering and provides a scene rendering method, a scene rendering apparatus, a computer storage medium, and an electronic device. The scene rendering method includes: creating an environment map according to the base light source directions in a custom coordinate space, the custom coordinate space being obtained by rotating the coordinate axes of the world coordinate space; baking the decolored game scene according to the environment map to obtain an illumination map, the illumination map containing illumination values of different pixel points in the game scene for the different base light source directions; determining target illumination values of different pixel points for a target light source direction in the game scene according to the illumination map; and determining a final illumination color according to the target illumination values and input color information, and rendering the game scene according to the final illumination color. The scene rendering method simplifies the data processing flow, requires no customized baking tool, and reduces development cost.

Description

Scene rendering method and device, computer storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of scene rendering technologies, and in particular, to a scene rendering method, a scene rendering apparatus, a computer storage medium, and an electronic device.
Background
During game development, developers need to render virtual objects (such as characters and scenery) in the game. For static illumination, the intensity, color, and occlusion information of the lighting is baked in advance: it is projected into an illumination map and stored according to the UV layout of the static objects in the game, and the objects are then rendered using that illumination map.
In the related art, light sources in three directions generally need to be placed and baked three times, once from each direction; the brightness information is then separated and combined into a single illumination map recording brightness from the three directions.
In view of this, there is a need in the art to develop a new scene rendering method and apparatus.
It is to be noted that the information disclosed in the background section above is only used to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure is directed to a scene rendering method, a scene rendering apparatus, a computer storage medium, and an electronic device, so as to overcome, at least to some extent, the drawback of the related art that an additional customized baking tool is required.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a scene rendering method, including: creating an environment map according to the direction of a basic light source in a user-defined coordinate space; the user-defined coordinate space is obtained after rotation processing is carried out on the coordinate axis of the world coordinate space; baking the game scene after the color is removed according to the environment map to obtain an illumination map; the illumination map comprises illumination values of different pixel points of different basic light source directions in the game scene; determining target illumination values of different pixel points of the target light source direction in the game scene according to the illumination map; and determining a final illumination color according to the target illumination value and the input color information, and rendering the game scene according to the final illumination color.
In an exemplary embodiment of the present disclosure, the basic light source direction in the custom coordinate space is obtained by: obtaining a base vector of the user-defined coordinate space; and determining the direction of the base vector as the direction of the base light source in the custom coordinate space.
In an exemplary embodiment of the disclosure, the creating an environment map according to a basic light source direction in a custom coordinate space includes: mapping the base vectors to a spherical mapping to determine pixel points corresponding to each base vector; normalizing the basis vectors, and converting the obtained normalized basis vectors into color values; and generating the environment map according to the color value of each pixel point.
In an exemplary embodiment of the present disclosure, the mapping the basis vectors to a spherical map to determine a pixel point corresponding to each of the basis vectors includes: and mapping the base vectors to a three-dimensional rectangular coordinate system taking the center of the spherical mapping as a coordinate origin to determine pixel points corresponding to each base vector.
In an exemplary embodiment of the present disclosure, the determining, according to the illumination map, target illumination values of different pixel points of a target light source direction in the game scene includes: converting the target light source direction into a specified light source direction in the custom coordinate space; normalizing the specified light source direction to obtain a normalized direction; and determining the target illumination values of different pixel points of the target light source direction in the game scene according to the product of the normalization direction and the illumination value stored in the illumination map.
In an exemplary embodiment of the present disclosure, the converting the target light source direction into a specified light source direction in the custom coordinate space includes: and converting the target light source direction into a specified light source direction in the custom coordinate space according to the base vector of the custom coordinate space.
In an exemplary embodiment of the present disclosure, the determining a final illumination color according to the target illumination value and the input color information includes: and determining the product of the target illumination value and the input color information as the final illumination color.
According to a second aspect of the present disclosure, there is provided a scene rendering apparatus including: the environment map generating module is used for creating an environment map according to the direction of the basic light source in the user-defined coordinate space; the user-defined coordinate space is obtained after rotation processing is carried out on the coordinate axis of the world coordinate space; the baking processing module is used for baking the decolored game scene according to the environment map to obtain an illumination map; the illumination map comprises illumination values of different pixel points of different basic light source directions in the game scene; the illumination value determining module is used for determining target illumination values of different pixel points of a target light source direction in the game scene according to the illumination map; and the rendering module is used for determining the final illumination color according to the target illumination value and the input color information and rendering the game scene according to the final illumination color.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the scene rendering method of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the scene rendering method of the first aspect via execution of the executable instructions.
As can be seen from the foregoing technical solutions, the scene rendering method, the scene rendering apparatus, the computer storage medium, and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in some embodiments of the present disclosure, an environment map is created according to the base light source directions in a custom coordinate space (obtained by rotating the coordinate axes of the world coordinate space), so that different light source directions from the sky can be stored in the environment map simultaneously. The decolored game scene is then baked against this environment map to obtain an illumination map containing, for each pixel point, illumination values for the different base light source directions. The illumination distribution for different light source directions is therefore obtained with a single bake. This solves the technical problem in the related art that three separate bakes from three directions make the data processing flow complex: the flow is simplified, the amount of computed data is reduced, memory usage is lowered, and runtime performance is improved. Because the single bake can be performed directly with an engine's general-purpose baking tool, it also removes the related-art need for a customized baking tool and reduces development cost. Furthermore, the target illumination values of different pixel points for a target light source direction in the game scene are determined directly from the illumination map, avoiding the related-art procedure of first determining the weights of the map's three channels (RGB) from the current base light source direction and then computing that direction's contribution to the illumination, which further simplifies the data processing flow.
Finally, the final illumination color is determined from the target illumination value and the input color information, and the game scene is rendered accordingly, so that different illumination colors can be displayed to match the tone of the game scene, improving the game's immersion and fidelity.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a schematic flow diagram of a scene rendering method in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a custom coordinate space in an exemplary embodiment of the present disclosure;
FIG. 3 is a flow diagram illustrating the creation of an environment map from a base light source direction in a custom coordinate space in an exemplary embodiment of the present disclosure;
FIG. 4 illustrates a schematic diagram of an environment map generated in an exemplary embodiment of the present disclosure;
FIG. 5 is a flow diagram illustrating a process for determining a target illumination value for a target light source direction at a game scene from an illumination map in an exemplary embodiment of the present disclosure;
FIGS. 6A-6E illustrate display effects of a game scene when the direction of a target light source is changed in an exemplary embodiment of the disclosure;
fig. 7 shows a schematic structural diagram of a rendering apparatus in an exemplary embodiment of the present disclosure;
fig. 8 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
At present, the common global illumination implementation works as follows: light sources are placed in three directions and baked three times, once from each direction; their brightness information is separated and combined into a light map recording the brightness sources in the three directions; in the shader, the weights corresponding to the map's three channels are computed from the current light source's direction in the three-direction coordinate space, giving that direction's contribution to the illumination brightness; finally, the dark-part and bright-part illumination is tinted according to the tone of the scene environment to produce the final indirect illumination.
However, on the one hand, the data generation process of the above scheme is complex, requiring three separate bakes whose results must then be combined. On the other hand, because a general baking pipeline only supports a single bake with a single main light source direction, the scheme requires a customized baking tool (e.g. Houdini), which is costly.
In an embodiment of the present disclosure, a scene rendering method is first provided, which overcomes, at least to some extent, the defect that the scene rendering method provided in the prior art requires an additional customized baking tool.
Fig. 1 is a flowchart illustrating a scene rendering method according to an exemplary embodiment of the present disclosure, where an execution subject of the scene rendering method may be a server that renders a game scene.
Referring to fig. 1, a scene rendering method according to one embodiment of the present disclosure includes the steps of:
step S110, creating an environment map according to the direction of a basic light source in a user-defined coordinate space; the user-defined coordinate space is obtained by rotating the coordinate axis of the world coordinate space;
step S120, baking the game scene after the color is removed according to the environment map to obtain an illumination map; the illumination map comprises illumination values of different pixel points in the game scene in different directions of the basic light source;
step S130, determining target illumination values of different pixel points of the target light source direction in the game scene according to the illumination map;
and step S140, determining the final illumination color according to the target illumination value and the input color information, and rendering the game scene according to the final illumination color.
In the technical solution provided in the embodiment shown in fig. 1, on one hand, an environment map is created according to the base light source directions in a custom coordinate space (obtained by rotating the coordinate axes of the world coordinate space), so that different light source directions from the sky can be stored in the environment map simultaneously. The decolored game scene is then baked against this environment map to obtain an illumination map containing, for each pixel point, illumination values for the different base light source directions. The illumination distribution for different light source directions is therefore obtained with a single bake. This solves the technical problem in the related art that three separate bakes from three directions make the data processing flow complex: the flow is simplified, the amount of computed data is reduced, memory usage is lowered, and runtime performance is improved. Because the single bake can be performed directly with an engine's general-purpose baking tool, it also removes the related-art need for a customized baking tool and reduces development cost. On the other hand, the target illumination values of different pixel points for a target light source direction in the game scene are determined directly from the illumination map, avoiding the related-art procedure of first determining the weights of the map's three channels (RGB) from the current base light source direction and then computing that direction's contribution to the illumination, which further simplifies the data processing flow.
Finally, the final illumination color is determined from the target illumination value and the input color information, and the game scene is rendered accordingly, so that different illumination colors can be displayed to match the tone of the game scene, improving the game's immersion and fidelity.
The following describes the specific implementation of each step in fig. 1 in detail:
it should be noted that, at present, the light source direction in the world coordinate space is generally adopted to perform the processing procedure of the correlated light mapping, however, referring to fig. 2, fig. 2 shows a schematic diagram of the customized coordinate space in an exemplary embodiment of the present disclosure, since the main light source of GI (Global illumination) is always generated from the sky, and the sun is not illuminated from the ground, in the world coordinate space (three-dimensional space formed by X, Y, Z, O is the origin of coordinates), the main light direction does not use the direction information of the lower hemisphere, so the spatial conversion is performed on the direction of the baking light source in the present disclosure, and the customized coordinate space (three-dimensional space formed by V0, V1, and V2) is obtained. Compared with a world coordinate space, the positive directions of all coordinate axes in the user-defined coordinate space are distributed on an upper hemisphere, so that illumination information based on all sky directions can be recorded more directly.
Specifically, a set of basis vectors (V0, V1, V2) can be obtained that satisfies the conditions of being pairwise orthogonal, of length 1, and trisecting the circle corresponding to the XZ plane. For example, one set of basis vectors satisfying these conditions is:
V0 = (√6/3, 1/√3, 0), V1 = (-√6/6, 1/√3, -√2/2), V2 = (-√6/6, 1/√3, √2/2)
Taking approximations yields: V0 = (0.8165, 0.5774, 0), V1 = (-0.4082, 0.5774, -0.7071), V2 = (-0.4082, 0.5774, 0.7071). The three-dimensional space defined by these basis vectors can thus be determined as the custom coordinate space.
After the above basis vectors are obtained, the direction of the basis vectors may be determined as the direction of the base light source in the custom coordinate space.
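As a quick sanity check (a Python sketch, not part of the patent text), the approximate basis vectors above can be verified to have the stated properties: unit length, pairwise orthogonality, and positive directions all lying in the upper hemisphere.

```python
import math

# Approximate basis vectors of the custom coordinate space, as given in the text.
V0 = (0.8165, 0.5774, 0.0)
V1 = (-0.4082, 0.5774, -0.7071)
V2 = (-0.4082, 0.5774, 0.7071)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(v):
    return math.sqrt(dot(v, v))

basis = [V0, V1, V2]
# Unit length, to the 4-decimal precision of the published approximations.
assert all(abs(length(v) - 1.0) < 1e-3 for v in basis)
# Pairwise orthogonal.
assert all(abs(dot(basis[i], basis[j])) < 1e-3
           for i in range(3) for j in range(i + 1, 3))
# Every positive axis direction lies in the upper hemisphere (Y component > 0).
assert all(v[1] > 0 for v in basis)
```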
With continued reference to FIG. 1, in step S110, an environment map is created according to the basic illuminant directions in the custom coordinate space.
In this step, referring to fig. 3, fig. 3 is a schematic flowchart illustrating a process of creating an environment map according to a basic light source direction in a custom coordinate space in an exemplary embodiment of the present disclosure, including steps S301 to S303, where step S110 is explained below with reference to fig. 3:
in step S301, the basis vectors are mapped to the spherical map to determine the pixel points corresponding to each basis vector.
In this step, a three-dimensional rectangular coordinate system may be established with the center of the spherical map as the origin of coordinates, and then, the pixel point corresponding to each base vector may be determined according to the three-dimensional rectangular coordinate system.
In step S302, the basis vectors are normalized, and the obtained normalized basis vectors are converted into color values.
In this step, normalization processing may be performed to obtain normalized vectors; the program code for implementing this step may be: SpherePixPosNmlz = normalize(SpherePixelPos).
Wherein the normalization processing keeps the direction of a vector unchanged while making its length 1. Illustratively, the normalized basis vector obtained after normalizing vector V0 may be V'0 = (a1, b1, c1).
After the normalized vectors are obtained, they may be converted into color values and stored; the program code for implementing this step may be, for example: RGB = float3(saturate(dot(V0, SpherePixPosNmlz)), saturate(dot(V1, SpherePixPosNmlz)), saturate(dot(V2, SpherePixPosNmlz))).
In step S303, an environment map is generated according to the color value of each pixel point.
In this step, the color values of the pixels may be stored correspondingly to obtain an environment map. For example, when the color value is (0, 255, 0), the color of the pixel can be determined to be green. Exemplarily, referring to fig. 4, fig. 4 shows a schematic diagram of an environment map generated under HDR (High Dynamic Range) 32-bit/channel conditions in an exemplary embodiment of the present disclosure; these conditions may be changed according to the actual situation to obtain environment maps with different numerical accuracies.
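As an illustrative sketch (Python; the function names are mine, not the patent's), the color of each environment-map pixel can be computed as the saturated dot product of the normalized sphere direction with each basis vector, mirroring the shader pseudocode quoted above:

```python
import math

V0 = (0.8165, 0.5774, 0.0)
V1 = (-0.4082, 0.5774, -0.7071)
V2 = (-0.4082, 0.5774, 0.7071)

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def saturate(x):
    # Clamp to [0, 1], as HLSL's saturate() does.
    return max(0.0, min(1.0, x))

def env_color(sphere_pixel_pos):
    """RGB value of an environment-map pixel at the given sphere position."""
    d = normalize(sphere_pixel_pos)
    return tuple(saturate(sum(a * b for a, b in zip(basis, d)))
                 for basis in (V0, V1, V2))

# A pixel straight up (+Y) receives equal contributions from all three axes.
r, g, b = env_color((0.0, 1.0, 0.0))
assert abs(r - 0.5774) < 1e-3 and abs(g - 0.5774) < 1e-3 and abs(b - 0.5774) < 1e-3
```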
With continued reference to fig. 1, in step S120, the game scene after the color removal is baked according to the environment map, so as to obtain an illumination map.
In this step, the game scene may first be decolored; for example, the decoloring may be implemented in an engine shader, and the program code for implementing this step may be: DesaturatedTriGILightmap = dot(OriginalColor, float3(0.2126, 0.7152, 0.0722)), where OriginalColor represents the original light source color.
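A minimal sketch of this luminance-based decoloring (Python; the weights 0.2126/0.7152/0.0722 are the Rec. 709 luma coefficients quoted in the shader code above):

```python
def desaturate(rgb):
    """Reduce an RGB color (components in [0, 1]) to its Rec. 709 luminance."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Pure white keeps full brightness; green dominates the luminance of mixed colors.
assert abs(desaturate((1.0, 1.0, 1.0)) - 1.0) < 1e-9
assert abs(desaturate((0.0, 1.0, 0.0)) - 0.7152) < 1e-9
```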
And baking the decolored game scene according to the environment map to obtain an illumination map, wherein the illumination map comprises illumination values of different pixel points of different basic light source directions in the game scene.
For example, the R channel of the illumination map (lightmap) may store the illumination values of different pixel points in the game scene for base light source direction V0, the G channel may store the illumination distribution for base light source direction V1, and the B channel may store the illumination distribution for base light source direction V2. The method therefore only needs to bake once, with no separate bakes to combine afterwards, which simplifies the map production process; the bake can be performed with general-purpose baking tools of engines such as Messiah, Unity, and UE4, so no customized baking tool is needed and development cost is reduced.
Lightmap baking is a technique that renders illumination information into a map and then pastes the baked map back onto the scene. In this way the illumination information becomes a texture: no CPU (Central Processing Unit) time is needed to compute lighting at runtime, only an ordinary texture lookup, so it is extremely fast.
In step S130, target illumination values of different pixel points of the target light source direction in the game scene are determined according to the illumination map.
In this step, referring to fig. 5, fig. 5 is a schematic flowchart illustrating a process of determining target illumination values of different pixel points of a target light source direction in a game scene according to an illumination map in an exemplary embodiment of the present disclosure, including steps S501 to S503, where step S130 is explained below with reference to fig. 5:
in step S501, the target light source direction is converted into a specified light source direction in the custom coordinate space.
In this step, the matrix formed by the basis vectors (V0, V1, V2) of the custom coordinate space can be used as a conversion matrix between the world coordinate space and the custom coordinate space; the target light source direction can then be converted into the specified light source direction in the custom coordinate space based on this conversion matrix. Illustratively, the program code for implementing step S501 may be:
float3 V0=(0.8165,0.5774,0);
float3 V1=(-0.4082,0.5774,-0.7071);
float3 V2=(-0.4082,0.5774,0.7071);
float3 TriGISunDir=float3(dot(V0,SunDirWorld),dot(V1,SunDirWorld),dot(V2,SunDirWorld));
GILightingSimple=max(0,dot(normalize(TriGISunDir),GILightmap));
for example, when the scene displayed in the game is a 9 am scene, it may be determined that the target light source direction is a 9 am light source direction.
Illustratively, when the target light source direction is Vp and the conversion matrix is (V0, V1, V2), the above specified light source direction can be expressed as (Vp·V0, Vp·V1, Vp·V2).
In step S502, normalization processing is performed on the designated light source direction to obtain a normalized direction.
In this step, the specified light source direction may be normalized to obtain a normalized direction. Illustratively, the resulting normalized direction may be (a2, b2, c2).
In step S503, the target illumination values of different pixel points of the target light source direction in the game scene are determined according to the product of the normalized direction and the illumination value stored in the illumination map.
In this step, the target illumination values of different pixel points for the target light source direction in the game scene can be determined from the product of the normalized direction and the illumination values stored in the illumination map. For example, after the illumination value (R1, G1, B1) of pixel point A is read from the illumination map, it may be normalized to obtain a target value (a3, b3, c3).
Further, when the normalized direction is (a2, b2, c2), the target illumination value of the target light source direction at pixel point A can be determined as: a2·a3 + b2·b3 + c2·c3.
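Steps S501 to S503 can be sketched in Python as follows (the sun direction and lightmap value in the example are made up for illustration):

```python
import math

V0 = (0.8165, 0.5774, 0.0)
V1 = (-0.4082, 0.5774, -0.7071)
V2 = (-0.4082, 0.5774, 0.7071)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def gi_lighting(sun_dir_world, lightmap_rgb):
    """Project a world-space sun direction into the custom coordinate space,
    normalize it, and dot it with the baked per-channel illumination values."""
    tri_gi_sun_dir = (dot(V0, sun_dir_world),
                      dot(V1, sun_dir_world),
                      dot(V2, sun_dir_world))
    return max(0.0, dot(normalize(tri_gi_sun_dir), lightmap_rgb))

# A sun pointing along the custom axis V0 draws its illumination almost
# entirely from the R channel of the lightmap.
assert abs(gi_lighting(V0, (1.0, 0.0, 0.0)) - 1.0) < 1e-3
```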
In step S140, a final illumination color is determined according to the target illumination value and the input color information, and the game scene is rendered according to the final illumination color.
In this step, after the target illumination value is obtained, different color information can be input according to the actual scene requirements (for example, the illumination of a bright area may be tinted yellow and the illumination of a dark area tinted gray) to tint the illumination of different pixel points. Specifically, the product of the target illumination value and the input color information may be determined as the final illumination color of pixel point A. For example, when the input color value is (220, 180, 100) and the target illumination value a2·a3 + b2·b3 + c2·c3 is 0.5, the final illumination color of pixel point A may be (110, 90, 50).
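The tinting step can be sketched as (Python, reusing the example numbers from the text):

```python
def final_color(target_illumination, input_rgb):
    """Scale the input tint color by the scalar target illumination value."""
    return tuple(target_illumination * c for c in input_rgb)

# With the example values from the text: illumination 0.5, tint (220, 180, 100).
assert final_color(0.5, (220, 180, 100)) == (110.0, 90.0, 50.0)
```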
Furthermore, after the final illumination color of each pixel point is determined, the game scene can be rendered according to these final illumination colors, so that the game scene presents the corresponding picture effect.
For example, referring to fig. 6A to 6E, which show schematic diagrams of the display effect of a game scene as the target light source direction is changed in an exemplary embodiment of the present disclosure, it can be seen that the illumination effect in the present disclosure changes along with the light source direction.
Based on the above technical solution, the present disclosure solves the technical problem in the related art that three separate bakes from three directions are required and the data processing flow is complex: it simplifies the data processing flow, reduces the amount of data to be computed, lowers memory usage, and improves runtime performance. In addition, because only a single bake is needed, the scheme can be implemented directly with the general-purpose baking tool built into the engine, which solves the technical problem in the related art that a customized baking tool is required, and thus reduces development cost.
The present disclosure also provides a scene rendering apparatus. Fig. 7 shows a schematic structural diagram of the scene rendering apparatus in an exemplary embodiment of the present disclosure. As shown in fig. 7, the scene rendering apparatus 700 may include an environment map generating module 701, a baking processing module 702, an illumination value determining module 703, and a rendering module 704. Wherein:
The environment map generating module 701 is configured to create an environment map according to the basic light source directions in the custom coordinate space, where the custom coordinate space is obtained by rotating the coordinate axes of the world coordinate space.
In an exemplary embodiment of the present disclosure, the environment map generating module is configured to obtain the basis vectors of the custom coordinate space, and to determine the directions of the basis vectors as the basic light source directions in the custom coordinate space.
In an exemplary embodiment of the present disclosure, the environment map generating module is configured to map the basis vectors onto the spherical map to determine the pixel point corresponding to each basis vector; normalize the basis vectors and convert the resulting normalized basis vectors into color values; and generate the environment map according to the color value of each pixel point.
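The disclosure does not specify how a normalized basis vector is packed into a color value. A common convention (borrowed from normal-map encoding, and assumed here purely for illustration) remaps each component from [-1, 1] to an 8-bit channel in [0, 255]:

```python
def direction_to_color(normalized_vec):
    """Encode a normalized direction (components in [-1, 1]) as an 8-bit RGB
    color in [0, 255], using the remapping color = (v * 0.5 + 0.5) * 255."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in normalized_vec)

# A basis vector pointing along +Z encodes to a mostly blue color.
color = direction_to_color((0.0, 0.0, 1.0))  # → (128, 128, 255)
```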
In an exemplary embodiment of the present disclosure, the environment map generating module is configured to map the basis vectors into a three-dimensional rectangular coordinate system whose coordinate origin is the center of the spherical map, so as to determine the pixel point corresponding to each basis vector.
The baking processing module 702 is configured to bake the decolored game scene according to the environment map to obtain an illumination map, where the illumination map contains the illumination values of different pixel points in the game scene from different basic light source directions.
The illumination value determining module 703 is configured to determine, according to the illumination map, the target illumination values of different pixel points in the game scene for the target light source direction.
In an exemplary embodiment of the present disclosure, the illumination value determining module is configured to convert the target light source direction into a specified light source direction in the custom coordinate space; normalize the specified light source direction to obtain a normalized direction; and determine the target illumination values of different pixel points in the game scene for the target light source direction according to the product of the normalized direction and the illumination values stored in the illumination map.
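The conversion into the custom coordinate space can be sketched as a change of basis: projecting the world-space direction onto each basis vector. The basis below (a 90° rotation about the Z axis) is a hypothetical example, and the axes are assumed orthonormal:

```python
def to_custom_space(world_dir, basis):
    """Express a world-space direction in the custom coordinate space by
    projecting it onto each (assumed orthonormal) basis vector."""
    return tuple(
        sum(w * b for w, b in zip(world_dir, axis)) for axis in basis
    )

# Hypothetical custom space: world axes rotated 90° about the Z axis.
basis = ((0.0, 1.0, 0.0), (-1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
specified = to_custom_space((1.0, 2.0, 3.0), basis)  # → (2.0, -1.0, 3.0)
```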
In an exemplary embodiment of the present disclosure, the illumination value determination module is configured to convert the target light source direction into a specified light source direction in the custom coordinate space according to a basis vector of the custom coordinate space.
The rendering module 704 is configured to determine a final illumination color according to the target illumination value and the input color information, and to render the game scene according to the final illumination color.
In an exemplary embodiment of the present disclosure, the rendering module is configured to determine a product of the target illumination value and the input color information as a final illumination color.
The specific details of each module in the scene rendering apparatus have already been described in detail in the corresponding scene rendering method, and are therefore not repeated here.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
The present application also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the above embodiments.
In addition, the embodiment of the disclosure also provides an electronic device capable of implementing the method.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining software and hardware aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 800 according to this embodiment of the disclosure is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 8, electronic device 800 is in the form of a general purpose computing device. The components of the electronic device 800 may include, but are not limited to: the at least one processing unit 810, the at least one memory unit 820, a bus 830 connecting various system components (including the memory unit 820 and the processing unit 810), and a display unit 840.
Wherein the storage unit stores program code that is executable by the processing unit 810 to cause the processing unit 810 to perform steps according to various exemplary embodiments of the present disclosure as described in the "exemplary methods" section above in this specification. For example, the processing unit 810 may perform the following as shown in fig. 1: step S110, creating an environment map according to the direction of a basic light source in a user-defined coordinate space; the user-defined coordinate space is obtained by rotating the coordinate axis of the world coordinate space; step S120, baking the game scene after the color is removed according to the environment map to obtain an illumination map; the illumination map comprises illumination values of different pixel points in the game scene in different directions of the basic light source; step S130, determining target illumination values of different pixel points of the target light source direction in the game scene according to the illumination map; and step S140, determining the final illumination color according to the target illumination value and the input color information, and rendering the game scene according to the final illumination color.
The storage unit 820 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 8201 and/or a cache memory unit 8202, and may further include a read-only memory unit (ROM) 8203.
The storage unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 830 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 900 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 860. As shown, the network adapter 860 communicates with the other modules of the electronic device 800 via the bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A method of scene rendering, comprising:
creating an environment map according to a basic light source direction in a custom coordinate space, wherein the custom coordinate space is obtained by performing rotation processing on the coordinate axes of the world coordinate space;
baking the decolored game scene according to the environment map to obtain an illumination map, wherein the illumination map comprises illumination values of different pixel points in the game scene for different basic light source directions;
determining target illumination values of different pixel points in the game scene for a target light source direction according to the illumination map;
and determining a final illumination color according to the target illumination value and the input color information, and rendering the game scene according to the final illumination color.
2. The method of claim 1, wherein the basic light source direction in the custom coordinate space is obtained by:
obtaining basis vectors of the custom coordinate space;
and determining the directions of the basis vectors as the basic light source directions in the custom coordinate space.
3. The method of claim 2, wherein creating the environment map according to the basic light source direction in the custom coordinate space comprises:
mapping the basis vectors onto a spherical map to determine the pixel point corresponding to each basis vector;
normalizing the basis vectors, and converting the resulting normalized basis vectors into color values;
and generating the environment map according to the color value of each pixel point.
4. The method of claim 3, wherein mapping the basis vectors onto the spherical map to determine the pixel point corresponding to each basis vector comprises:
mapping the basis vectors into a three-dimensional rectangular coordinate system whose coordinate origin is the center of the spherical map, to determine the pixel point corresponding to each basis vector.
5. The method of claim 1, wherein determining the target illumination values of different pixel points in the game scene for the target light source direction according to the illumination map comprises:
converting the target light source direction into a specified light source direction in the custom coordinate space;
normalizing the specified light source direction to obtain a normalized direction;
and determining the target illumination values of different pixel points in the game scene for the target light source direction according to the product of the normalized direction and the illumination values stored in the illumination map.
6. The method of claim 5, wherein converting the target light source direction into the specified light source direction in the custom coordinate space comprises:
converting the target light source direction into the specified light source direction in the custom coordinate space according to the basis vectors of the custom coordinate space.
7. The method according to any one of claims 1 to 6, wherein determining a final illumination color based on the target illumination value and input color information comprises:
and determining the product of the target illumination value and the input color information as the final illumination color.
8. A scene rendering apparatus, comprising:
an environment map generating module, configured to create an environment map according to a basic light source direction in a custom coordinate space, wherein the custom coordinate space is obtained by performing rotation processing on the coordinate axes of the world coordinate space;
a baking processing module, configured to bake the decolored game scene according to the environment map to obtain an illumination map, wherein the illumination map comprises illumination values of different pixel points in the game scene for different basic light source directions;
an illumination value determining module, configured to determine, according to the illumination map, target illumination values of different pixel points in the game scene for a target light source direction;
and a rendering module, configured to determine a final illumination color according to the target illumination value and the input color information, and to render the game scene according to the final illumination color.
9. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the scene rendering method of any of claims 1-7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the scene rendering method of any one of claims 1 to 7 via execution of the executable instructions.
CN202110281256.4A 2021-03-16 2021-03-16 Scene rendering method and device, computer storage medium and electronic equipment Pending CN113144611A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110281256.4A CN113144611A (en) 2021-03-16 2021-03-16 Scene rendering method and device, computer storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN113144611A (en) 2021-07-23

Family

ID=76887305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110281256.4A Pending CN113144611A (en) 2021-03-16 2021-03-16 Scene rendering method and device, computer storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113144611A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023231215A1 (en) * 2022-06-01 2023-12-07 Scene rendering method and apparatus, electronic device, and storage medium
CN115063517A (en) * 2022-06-07 2022-09-16 网易(杭州)网络有限公司 Flash effect rendering method and device in game, storage medium and electronic equipment
CN116385612A (en) * 2023-03-16 2023-07-04 如你所视(北京)科技有限公司 Global illumination representation method and device under indoor scene and storage medium
CN116385612B (en) * 2023-03-16 2024-02-20 如你所视(北京)科技有限公司 Global illumination representation method and device under indoor scene and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0856815A2 (en) * 1997-01-31 1998-08-05 Microsoft Corporation Method and system for determining and/or using illumination maps in rendering images
US20050032574A1 (en) * 2003-07-23 2005-02-10 Nintendo Co., Ltd. Image processing program and image processing apparatus
CN104134230A (en) * 2014-01-22 2014-11-05 腾讯科技(深圳)有限公司 Image processing method, image processing device and computer equipment
CN108579082A (en) * 2018-04-27 2018-09-28 网易(杭州)网络有限公司 The method, apparatus and terminal of shadow are shown in game
CN109903385A (en) * 2019-04-29 2019-06-18 网易(杭州)网络有限公司 Rendering method, device, processor and the terminal of threedimensional model
CN111105491A (en) * 2019-11-25 2020-05-05 腾讯科技(深圳)有限公司 Scene rendering method and device, computer readable storage medium and computer equipment
CN111632378A (en) * 2020-06-08 2020-09-08 网易(杭州)网络有限公司 Illumination map making method, game model rendering method, illumination map making device, game model rendering device and electronic equipment
WO2021036395A1 (en) * 2019-08-27 2021-03-04 杭州群核信息技术有限公司 Pbr real-time rendering material conversion method, device, and system, and rendering method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination