CN112190936A - Game scene rendering method, device, equipment and storage medium - Google Patents
- Publication number
- CN112190936A (application CN202011075527.2A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- parameter
- rendered
- game scene
- direct light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
Abstract
The application provides a game scene rendering method, apparatus, device, and storage medium, relating to the technical field of scene configuration. The method comprises the following steps: acquiring a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene; performing, according to the first material rendering parameter, material rendering on the area of the object irradiated by direct light using a physically based rendering method; and performing, according to the second material rendering parameter, material rendering on the area of the object not irradiated by direct light, likewise using a physically based rendering method. Compared with the prior art, the effects of the directly lit area and the unlit area of a game scene can be adjusted in a simple way, so that the contrast between the two areas is balanced and the extra performance consumption otherwise incurred by processing dark-area detail is avoided.
Description
Technical Field
The present application relates to the field of scene configuration technologies, and in particular, to a method, an apparatus, a device, and a storage medium for rendering a game scene.
Background
When a game scene is made, its bright and dark areas are usually distinguished by whether they receive direct light: bright areas are directly lit, while dark areas are occluded and receive no direct light. As a result, the dark parts tend to present an almost uniform deep-gray appearance, and extra lighting work is usually required for them.
To bring out dark-part detail, the prior art generally has light generated by both direct and indirect light interact with the object surface, and the resulting surface shading is computed and displayed.
However, using light sources to process dark-part detail in this way increases performance consumption and makes it difficult to balance the contrast between the bright and dark parts.
Disclosure of Invention
An object of the present application is to provide a game scene rendering method, apparatus, device, and storage medium, so as to solve the prior-art problems that processing dark-part detail increases performance consumption and that the contrast between the bright and dark parts is difficult to balance.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a method for rendering a game scene, where the method includes:
acquiring a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene;
according to the first material rendering parameter, performing material rendering on the direct light irradiation area on the object to be rendered by adopting a physical rendering method;
and according to the second material rendering parameter, performing material rendering on the region which is not irradiated by the direct light on the object to be rendered by adopting a physical rendering method.
Optionally, the acquiring, through the material shader of the object to be rendered in the game scene, the first material rendering parameter and the second material rendering parameter of the object to be rendered includes:
obtaining the first material rendering parameter through a first configuration area on a configuration interface of the material shader;
and obtaining the second material rendering parameter through a second configuration area on the configuration interface of the material shader.
Optionally, the obtaining the first material rendering parameter through the first configuration area on the configuration interface of the material shader includes:
receiving a material map input through the first configuration area;
reading the first material rendering parameter from a preset channel in the material map, where the preset channel records the corresponding type of material rendering parameter.
Optionally, the reading the first material rendering parameter from a preset channel in the material map includes:
reading a first metal value from the R channel of the material map, and reading a first roughness value from the G channel of the material map.
Optionally, the obtaining, through the second configuration area on the configuration interface of the material shader, the second material rendering parameter includes:
receiving a preset type of material rendering parameter input through the second configuration area, where the second material rendering parameter comprises the preset type of material rendering parameter.
Optionally, the second configuration area contains an input box for at least one preset type of material rendering parameter;
and the receiving of the preset type of material rendering parameter input through the second configuration area includes:
receiving each preset type of material rendering parameter input through the input box for that preset type, where the second material rendering parameters include the at least one preset type of material rendering parameter.
Optionally, the at least one preset type of material rendering parameter includes: a second metal value, and/or a second roughness value.
Optionally, the performing, according to the second material rendering parameter, material rendering on the area of the object to be rendered that is not irradiated by direct light using a physically based rendering method includes:
configuring the highlight (specular) item of indirect light of a rendering pipeline of the physically based rendering method according to the second material rendering parameter;
and performing, based on the configured rendering pipeline, material rendering on the area of the object to be rendered that is not irradiated by direct light.
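The mechanism above, overriding only the parameters that feed the indirect-light specular (highlight) term, can be sketched as follows. This is an illustrative assumption of how such a configuration might look; the patent discloses the idea, not an API, and all names are hypothetical.

```python
# Hypothetical sketch: configure the highlight (specular) item of indirect
# light in a PBR rendering pipeline with the second material rendering
# parameters, leaving the direct-lighting path untouched.

def configure_pipeline(pipeline_config, second_params):
    """Return a copy of the pipeline config whose indirect-light specular
    term uses the second metal/roughness values."""
    cfg = dict(pipeline_config)  # shallow copy; original stays unchanged
    cfg["indirect_specular"] = {
        "metallic": second_params["metallic"],
        "roughness": second_params["roughness"],
    }
    return cfg

base = {"indirect_specular": None, "direct": "first-material-params"}
print(configure_pipeline(base, {"metallic": 0.0, "roughness": 0.6}))
```

Material rendering of the occluded areas would then proceed through this configured pipeline, while directly lit areas keep the first parameter set.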
In a second aspect, another embodiment of the present application provides a game scene rendering apparatus, including: an acquisition module and a rendering module, wherein:
the acquisition module is used for acquiring a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene;
the rendering module is used for rendering the material of the direct light irradiation area on the object to be rendered by adopting a physical rendering method according to the first material rendering parameter; and according to the second material rendering parameter, performing material rendering on the region which is not irradiated by the direct light on the object to be rendered by adopting a physical rendering method.
Optionally, the obtaining module is specifically configured to obtain the first material rendering parameter through a first configuration area on a configuration interface of the material shader; and acquiring the second material rendering parameter through a second configuration area on a configuration interface of the material shader.
Optionally, the apparatus further comprises: a receiving module and a reading module, wherein:
the receiving module is used for receiving the texture mapping input through the first configuration area;
the reading module is used for reading the first material rendering parameter from a preset channel in the material map; and the preset channel in the material mapping records material rendering parameters of corresponding types.
Optionally, the reading module is specifically configured to read a first metal value from an R channel of the material map, and read a first roughness value from a G channel of the material map.
Optionally, the receiving module is specifically configured to receive a preset type of material rendering parameter input through the second configuration area, where the second material rendering parameter includes: and the preset type of material rendering parameters.
Optionally, the receiving module is specifically configured to receive each preset type of material rendering parameter input through the input box for that preset type, where the second material rendering parameters include the at least one preset type of material rendering parameter.
Optionally, the apparatus further comprises: a configuration module configured to configure the highlight (specular) item of indirect light of a rendering pipeline of the physically based rendering method according to the second material rendering parameter;
the rendering module is specifically configured to perform material rendering on an area, which is not irradiated by direct light, on the object to be rendered based on the configured rendering pipeline.
In a third aspect, another embodiment of the present application provides a game scene rendering apparatus, including: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, the processor and the storage medium communicate via the bus when the game scene rendering device is running, and the processor executes the machine-readable instructions to perform the steps of the method according to any one of the first aspect.
In a fourth aspect, another embodiment of the present application provides a storage medium having a computer program stored thereon, where the computer program is executed by a processor to perform the steps of the method according to any one of the above first aspects.
The beneficial effects of this application are as follows. With the game scene rendering method provided herein, the first and second material rendering parameters are obtained directly from the material shader of the object to be rendered, and material rendering is performed separately on the directly lit area and the area not irradiated by direct light according to those parameters. The effects of the two areas can therefore be adjusted in a simple way, without extra illumination processing for the unlit area. This balances the contrast between the directly lit and unlit areas, avoids a dead-black appearance in the unlit area, improves the rendering effect of the game scene, preserves the sense of layering between the lit and unlit areas of the rendered scene, and makes the rendering effect more realistic.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic flowchart of a game scene rendering method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application;
fig. 3 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application;
fig. 4 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application;
fig. 5 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application;
FIG. 6 is a schematic view of a configuration interface provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of a game scene rendering apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a game scene rendering apparatus according to another embodiment of the present application;
fig. 9 is a schematic structural diagram of a game scene rendering device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments.
The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Additionally, the flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
To enable those skilled in the art to use the present disclosure, the following embodiments are given in connection with a specific application scenario, namely the rendering of a game scene. It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the application. Although the present application is described primarily in the context of game scene rendering, it should be understood that this is merely one exemplary embodiment.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
For the sake of facilitating an understanding of the present application, some terms referred to in the present application are explained below:
shader (Shader): the Shader stores execution codes of the GPU, is used for informing the GPU of how to draw the color of the target pixel, and is widely applied to the field of three-dimensional graphics. Various three-dimensional graphic effects can be obtained through the shader code.
Direct light (Direct light): light that is emitted from a light source directly onto an object is called direct light.
Indirect light (Indirect light): in the real world, objects can also be lit by indirect lighting, that is, by light that is reflected between objects one or more times before it finally enters the camera.
Physically Based Rendering (PBR): a real-time rendering method widely used in recent years. The material system corresponding to PBR is called the PBR material system, and it includes parameters such as Base Color, Metallic (metal degree), and Roughness.
Image-based lighting (IBL): in the PBR system, specular environment reflection is achieved through IBL. The effect of ambient light in real-time rendering is reproduced by using an environment map (which may be a real-world camera shot or a real-time render from an in-game camera).
Rendering pipeline: the rendering pipeline is a core component of real-time rendering. Its function is to render a two-dimensional image given scene elements such as a virtual camera, three-dimensional scene objects, and light sources.
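The PBR material parameters named above (Base Color, Metallic, Roughness) can be illustrated with a minimal sketch. The class and the metallic-workflow reflectance formula below are common PBR conventions used for illustration only; they are not taken from the patent text.

```python
# Minimal illustrative sketch of a PBR material. In the common "metallic
# workflow", the reflectance at normal incidence (F0) is interpolated
# between a fixed dielectric value (~0.04) and the base color, driven by
# the metallic parameter. Names here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class PBRMaterial:
    base_color: tuple  # linear RGB, each component in [0, 1]
    metallic: float    # 0 = dielectric, 1 = metal
    roughness: float   # 0 = mirror-smooth, 1 = fully rough

def specular_f0(mat: PBRMaterial, dielectric_f0: float = 0.04) -> tuple:
    """Lerp from the dielectric F0 toward the base color as metallic rises."""
    return tuple(
        dielectric_f0 * (1.0 - mat.metallic) + c * mat.metallic
        for c in mat.base_color
    )

gold_like = PBRMaterial(base_color=(1.0, 0.78, 0.34), metallic=1.0, roughness=0.3)
print(specular_f0(gold_like))  # a pure metal reflects its own base color
```

This is why a shader needs both a metal value and a roughness value per area: together they determine how sharp and how tinted the specular response is.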
It should be noted that, before the present application, the prior art usually lit a scene with both direct and indirect light in the game engine. To prevent parts not reached by direct light from appearing "dead black", that is, to bring out detail in dark areas, a light source or environment reflection was made to interact with the object: light generated by direct and indirect light had to interact with the object surface to obtain the shading result, which was then displayed.
With the game scene rendering method provided by the application, the first and second material rendering parameters are obtained directly from the material shader of the object to be rendered, and material rendering is performed separately on the directly lit area and the unlit area according to those parameters. The effects of both areas can thus be adjusted in a simple way, without extra illumination processing for the unlit area. This balances the contrast between the directly lit and unlit areas, avoids a dead-black appearance in the unlit area, and improves the rendering effect of the game scene. It also solves the prior-art problem of high performance consumption caused by having light generated by direct and indirect light interact with the object surface to obtain and display the final shading result, thereby simplifying the rendering process of the game scene.
The game scene rendering method provided by the embodiment of the present application is explained below with reference to a plurality of specific application examples. Fig. 1 is a schematic flow chart of a game scene rendering method according to an embodiment of the present application, and as shown in fig. 1, the method includes:
S101: acquire a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in the game scene.
In the method provided by the application, the material shader contains both the first and the second material rendering parameter. Based on this shader, two sets of material rendering parameters can therefore be acquired for an object to be rendered, and the materials of the bright-area and dark-area parts are rendered separately according to the two sets.
S102: perform, according to the first material rendering parameter, material rendering on the directly lit area of the object to be rendered using a physically based rendering method.
Optionally, material rendering may be performed, according to the first material rendering parameter, on the objects in all directly lit areas of the game scene; or, combined with a user's selection operation, only on several selected objects in the directly lit areas; or only on objects of a selected target type among the several types of objects in the directly lit areas. How the objects to be rendered are determined can be adjusted flexibly according to user needs and is not limited to the embodiments above.
Material rendering of the directly lit area based on the first material rendering parameter guarantees the rendering effect of the bright area; the bright-area effect can therefore be controlled by configuring or changing the first material rendering parameter.
S103: perform, according to the second material rendering parameter, material rendering on the area of the object to be rendered that is not irradiated by direct light, using a physically based rendering method.
Optionally, in some possible embodiments, material rendering of the unlit area may proceed, for example, as follows: rendering the objects in all areas of the game scene not reached by direct light; or, according to a user's selection operation, rendering only several selected objects in those areas; or rendering only objects of a selected target type among the several types of objects in those areas. As above, how the objects to be rendered are determined can be adjusted flexibly according to user needs and is not limited to the embodiments described.
Areas of the game scene that direct light cannot reach because of occlusion are the dark areas of the current scene: for example, areas shadowed by buildings or plants, or caves and rooms without a light source. The second material rendering parameter is used to control the rendering effect of these dark areas.
With the game scene rendering method provided by the application, the first and second material rendering parameters are obtained directly from the material shader of the object to be rendered, and material rendering is performed separately on the directly lit area and the unlit area of the game scene according to those parameters. The effects of both areas can therefore be adjusted in a simple way, without extra illumination processing for the unlit area. This balances the contrast between the directly lit and unlit areas, avoids a dead-black appearance in the unlit area, improves the rendering effect of the game scene, preserves the sense of layering between the lit and unlit areas of the rendered scene, and makes the rendering effect more realistic.
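Steps S101 to S103 above amount, per texel, to a simple parameter selection driven by direct-light visibility. The sketch below is a hedged illustration of that idea only; the patent does not disclose shader code, and all names here are assumptions.

```python
# Illustrative sketch of S101-S103: one material shader carries two
# parameter sets, and a direct-light visibility flag decides which set the
# physically based shading uses. Names and values are hypothetical.

def pick_render_params(lit_by_direct_light, first_params, second_params):
    """S102 uses the first set for directly lit texels; S103 uses the
    second set for occluded texels, with no extra fill lights needed."""
    return first_params if lit_by_direct_light else second_params

first = {"metallic": 0.8, "roughness": 0.25}   # bright-area look
second = {"metallic": 0.1, "roughness": 0.7}   # dark-area look, lifts "dead black"

for lit in (True, False):
    print(lit, pick_render_params(lit, first, second))
```

The key point is that dark-area appearance is tuned by swapping parameters, not by adding light sources, which is where the performance saving comes from.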
Optionally, on the basis of the foregoing embodiment, an embodiment of the present application may further provide a game scene rendering method, and an implementation process of obtaining the material rendering parameter in the foregoing method is described as follows with reference to the accompanying drawings. Fig. 2 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application, and as shown in fig. 2, S101 may include:
S104: obtain the first material rendering parameter through a first configuration area on a configuration interface of the material shader.
The first configuration area is used to receive the first material rendering parameter. For example, it may be displayed at a preset position on a settings interface, and a parameter that the user enters in the first configuration area is taken as the first material rendering parameter.
S105: obtain the second material rendering parameter through a second configuration area on the configuration interface of the material shader.
In some possible embodiments, the second configuration area is likewise displayed at a preset position on the settings interface (the first and second configuration areas being different areas of that interface), and a parameter that the user enters in the second configuration area is taken as the second material rendering parameter.
Optionally, the first and second material rendering parameters may be configured manually by an artist, or determined by the artist through a selection among preset configuration options; the way the material rendering parameters are determined can be adjusted flexibly according to user needs.
Optionally, on the basis of the foregoing embodiment, an embodiment of the present application may further provide a game scene rendering method, and an implementation process of obtaining the first material rendering parameter in the foregoing method is described as follows with reference to the accompanying drawings. Fig. 3 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application, and as shown in fig. 3, S104 may include:
S106: receive a material map input through the first configuration area.
In some possible embodiments, the material map may be selected from a preset material map library, drawn by an artist, or crawled from the network; the way the material map is obtained can be adjusted flexibly according to user needs and is not limited to the embodiments above.
S107: a first texture rendering parameter is read from a channel preset in the texture map.
Material rendering parameters of corresponding types are recorded in the preset channel of the material map and are used to perform material rendering on the direct light irradiation area. Therefore, if the material of the direct light irradiation area needs to be replaced, only the corresponding material map needs to be replaced.
In an embodiment of the present application, the first metal value may be read from the R channel of the material map and the first roughness value from the G channel; that is, the metallicity of the direct light irradiation area is rendered according to the read first metal value, and its roughness according to the read first roughness value.
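The channel-reading step above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the material map is modelled as a nested list of 8-bit RGBA texels, and the function name is hypothetical.

```python
# Illustrative sketch: extract the first material rendering parameters
# (metal value from the R channel, roughness value from the G channel)
# of a material map, here modelled as rows of RGBA texels in 0-255.

def read_first_material_params(material_map):
    """Return per-texel (metallic, roughness) pairs normalised to 0-1."""
    params = []
    for row in material_map:
        for r, g, b, a in row:
            metallic = r / 255.0   # first metal value from the R channel
            roughness = g / 255.0  # first roughness value from the G channel
            params.append((metallic, roughness))
    return params

# A 1x2 "material map": fully metallic and smooth, then non-metal and rough.
tiny_map = [[(255, 0, 0, 255), (0, 255, 0, 255)]]
print(read_first_material_params(tiny_map))
# → [(1.0, 0.0), (0.0, 1.0)]
```

Because the parameters live in the map itself, swapping the material of the bright area only requires swapping this map, as the paragraph above notes.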
Optionally, on the basis of the foregoing embodiment, an embodiment of the present application may further provide a game scene rendering method, and an implementation process of obtaining the second material rendering parameter in the foregoing method is described as follows with reference to the accompanying drawings. Fig. 4 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application, and as shown in Fig. 4, S105 may include:
S108: receiving preset-type material rendering parameters input through the second configuration area.
Wherein the second material rendering parameters include the preset-type material rendering parameters.
Optionally, in an embodiment of the present application, the second configuration area includes an input box for each of at least one preset type of material rendering parameter. S108 then consists in receiving each preset type of material rendering parameter through its corresponding input box, and the second material rendering parameters include the at least one preset type of material rendering parameter.
For example, in an embodiment of the present application, the at least one preset type of material rendering parameter includes a second metal value and a second roughness value. The second metal value may take any value in the range 0-1: when it is 0, the current material has no metallic effect at all, and the closer it is to 1, the stronger the metallic effect. Likewise, the closer the second roughness value is to 0, the lower the roughness of the current material's effect, i.e. the smoother it appears; the closer it is to 1, the higher the roughness, i.e. the rougher it appears. It should be understood, however, that the preset types of material rendering parameters and their specific content may be adjusted flexibly according to user requirements and are not limited to the above embodiments.
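Receiving the two preset-type parameters from the input boxes can be sketched as follows. The function name and the string-based inputs are assumptions for illustration; the sketch simply parses the two values and enforces the 0-1 range described above.

```python
# Illustrative sketch: accept the second metal value and second roughness
# value from the second configuration area's input boxes, rejecting any
# value outside the permitted 0-1 range.

def read_second_material_params(metallic_text, roughness_text):
    """Parse the two input-box strings into validated floats in [0, 1]."""
    values = {}
    for name, text in (("metallic", metallic_text),
                       ("roughness", roughness_text)):
        value = float(text)
        if not 0.0 <= value <= 1.0:
            raise ValueError(f"{name} must lie in [0, 1], got {value}")
        values[name] = value
    return values["metallic"], values["roughness"]

print(read_second_material_params("0.8", "0.25"))  # → (0.8, 0.25)
```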
Compared with the prior-art approach of reading only the first material rendering parameter, the method provided by the present application additionally introduces the second material rendering parameter, so that the effects of the bright and dark parts of a scene can be configured through simple settings, without complex processes such as computing the interaction of direct and indirect light. If the bright-part or dark-part effect needs to be modified later, only a simple modification is required, which improves the working efficiency of artists.
Optionally, on the basis of the foregoing embodiment, an embodiment of the present application may further provide a game scene rendering method, and an implementation process of performing material rendering on the region not irradiated by direct light in the foregoing method is described as follows with reference to the accompanying drawings. Fig. 5 is a schematic flowchart of a game scene rendering method according to another embodiment of the present application, and as shown in Fig. 5, S103 may include:
S109: configuring the highlight item of indirect light of the rendering pipeline of the physically based rendering method according to the second material rendering parameter.
In the method provided by the present application, the PBR part of the rendering pipeline is improved, and the IBL part read by the PBR is processed, so that the material shader of the object to be rendered can receive an independent first material rendering parameter and an independent second material rendering parameter. That is, the PBR in this method receives two different sets of material rendering parameters and then performs material rendering on different areas of the game scene accordingly; the two sets are used to render the material effects of the bright and dark areas of the current game scene, respectively. The highlight item of the indirect-light IBL part of the PBR rendering pipeline may be configured according to the second material rendering parameter, for use in the subsequent IBL highlight calculation.
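The pipeline configuration step can be sketched as a small container object that keeps the two parameter sets independent: the direct-light term holds the first parameters while the IBL highlight term holds the second. All class and method names here are illustrative, not taken from the patent.

```python
# Illustrative sketch: a PBR pipeline configuration in which the
# indirect-light (IBL) highlight term is configured from the second
# material rendering parameters, independently of the direct-light term.

class PBRPipelineConfig:
    def __init__(self):
        # Neutral defaults before configuration.
        self.direct = {"metallic": 0.0, "roughness": 1.0}
        self.ibl_specular = {"metallic": 0.0, "roughness": 1.0}

    def set_direct_params(self, metallic, roughness):
        # First material rendering parameters: bright (directly lit) areas.
        self.direct = {"metallic": metallic, "roughness": roughness}

    def configure_ibl_specular(self, metallic, roughness):
        # Second material rendering parameters: the IBL highlight term
        # used for areas not irradiated by direct light.
        self.ibl_specular = {"metallic": metallic, "roughness": roughness}

config = PBRPipelineConfig()
config.set_direct_params(metallic=1.0, roughness=0.2)
config.configure_ibl_specular(metallic=0.3, roughness=0.7)
print(config.direct, config.ibl_specular)
```

Keeping the two dictionaries separate mirrors the point of the paragraph above: the shader receives two independent parameter sets rather than deriving the dark-part response from the bright-part one.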
S110: performing material rendering on the region of the object to be rendered that is not irradiated by direct light, based on the configured rendering pipeline.
In the embodiment of the present application, material rendering of the direct light irradiation area (bright part) and of the area not irradiated by direct light (dark part) is performed separately: the bright-part area of the object is rendered according to the first material rendering parameter, and the dark-part area according to the second material rendering parameter. The final coloring effect of the scene comprises both the result of rendering the area irradiated by direct light and the result of rendering the area not irradiated by direct light.
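The separate bright/dark rendering described above can be sketched per texel as follows. Here `shade()` is a toy stand-in for the real PBR evaluation (it is not the patent's shading model), and the function names are assumptions for illustration only.

```python
# Illustrative sketch: shade each texel with the first parameters where
# direct light reaches it and with the second parameters where it does
# not, then combine both results into the final scene coloring.

def shade(metallic, roughness):
    # Toy stand-in for a PBR BRDF evaluation: brighter when smoother.
    return round(metallic * (1.0 - roughness), 3)

def render(lit_mask, first_params, second_params):
    """lit_mask[i] is True where direct light reaches texel i."""
    return [
        shade(*first_params) if lit else shade(*second_params)
        for lit in lit_mask
    ]

mask = [True, True, False, True]
print(render(mask, first_params=(1.0, 0.2), second_params=(0.3, 0.7)))
# → [0.8, 0.8, 0.09, 0.8]
```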
The following illustrates the method in connection with rendering a virtual snow layer in a game scene. Fig. 6 is a schematic configuration interface diagram provided in an embodiment of the present application. As shown in Fig. 6, the current game scene rendering method is explained using the rendering of a snow layer in a game scene as an example. The material shader configuration interface of the object may be, for example, a terrain mosaic material shader configuration interface, and may include parameter configuration areas such as terrain standard parameters and snow base color parameters; in addition, a second configuration area 130 may be provided on the interface for receiving the second material rendering parameters. In the Snow Layer, besides the configuration areas for standard and basic parameters, two further configuration areas are provided: the first configuration area 120 and the second configuration area 130, used respectively to read the metallicity and roughness corresponding to the bright-part area and the dark-part area, and subsequent material rendering of the object to be rendered in the current game scene is performed according to the read parameters.
The first layer reads the first material rendering parameter through the first configuration area 120, i.e. the snow layer mix parameter SnowLayer Mix interface. This interface reads a material map in which the R channel stores a metal value and the G channel stores a roughness value; the first material rendering parameter is determined from these two values, and material rendering of the bright-part area directly irradiated by light is then performed according to it. If an artist needs to modify the material effect of the bright-part area in the current game scene, the first material rendering parameter corresponding to this interface can be modified directly by modifying the material map, thereby adjusting the material effect of the bright-part area.
The second layer reads the second material rendering parameter through the second configuration area 130, i.e. the snow layer metallicity specular parameter SnowLayerSpecCube Metallic interface and the snow layer roughness specular parameter SnowLayerSpecCube Roughness interface, and then performs material rendering of the dark-part area not directly irradiated by light according to the second material rendering parameter. The two interfaces in the second configuration area 130 each accept a value in the range 0-1, with different values corresponding to different roughness or metallicity; the metallicity and roughness of the dark-part area are subsequently processed according to the values read from these two interfaces. During configuration of the game scene, if an artist needs to modify the material effect of the dark-part area in the current game scene, the material rendering parameters corresponding to the two interfaces can be modified directly in the second configuration area 130 of the parameter configuration interface, thereby adjusting the material effect of the dark-part area.
In the method provided by the present application, the PBR computes two different pairs of metallicity and roughness values, used to process the bright-part and dark-part material effects of the current game scene respectively. In the IBL part, the metallicity and roughness values required by the IBL are supplied through the two interfaces added in the second configuration area (SnowLayerSpecCube Metallic and SnowLayerSpecCube Roughness).
With the game scene rendering method provided by the present application, the target bright-part area and the target dark-part area are rendered separately according to the first and second material rendering parameters configured in the first and second configuration areas. Compared with the conventional approach of rendering the dark-part area according to the interaction of direct and indirect light, this rendering method is simple, easy to operate and modify, and allows an artist to control the bright-part and dark-part effects of any material in the game scene by adjusting the first material rendering parameter and the second material rendering parameter.
The following explains a game scene rendering device provided in the present application with reference to the drawings, where the game scene rendering device can execute any one of the game scene rendering methods shown in fig. 1 to 6, and specific implementation and beneficial effects thereof are referred to above, and are not described again below.
Fig. 7 is a schematic structural diagram of a game scene rendering apparatus according to an embodiment of the present application, and as shown in fig. 7, the apparatus includes: an acquisition module 201 and a rendering module 202, wherein:
the obtaining module 201 is configured to obtain a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene.
The rendering module 202 is configured to perform material rendering on a direct light irradiation region on an object to be rendered by using a physics-based rendering method according to the first material rendering parameter; and according to the second material rendering parameter, performing material rendering on the region which is not irradiated by the direct light on the object to be rendered by adopting a physical rendering method.
Optionally, the obtaining module 201 is specifically configured to obtain the first material rendering parameter through a first configuration area on a configuration interface of the material shader, and to obtain the second material rendering parameter through a second configuration area on the configuration interface of the material shader.
Fig. 8 is a schematic structural diagram of a game scene rendering apparatus according to an embodiment of the present application, and as shown in fig. 8, the apparatus further includes: a receiving module 203 and a reading module 204, wherein:
a receiving module 203, configured to receive a material map input through the first configuration area;
a reading module 204, configured to read the first material rendering parameter from a preset channel in the material map, where material rendering parameters of corresponding types are recorded in the preset channel of the material map.
Optionally, the reading module 204 is specifically configured to read the first metal value from an R channel of the material map, and read the first roughness value from a G channel of the material map.
Optionally, the receiving module 203 is specifically configured to receive preset-type material rendering parameters input through the second configuration area, where the second material rendering parameters include the preset-type material rendering parameters.
Optionally, the receiving module 203 is specifically configured to receive each preset type of material rendering parameter input through the input box of that preset type of material rendering parameter, where the second material rendering parameters include the at least one preset type of material rendering parameter.
As shown in fig. 8, the apparatus further includes: a configuration module 205, configured to configure highlight items of indirect light of a rendering pipeline of the physics-based rendering method according to the second material rendering parameter;
the rendering module 202 is specifically configured to perform material rendering on a region, which is not directly irradiated by light, of an object to be rendered based on the configured rendering pipeline.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU), or another processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 9 is a schematic structural diagram of a game scene rendering device according to an embodiment of the present application, where the game scene rendering device may be integrated in a terminal device or a chip of the terminal device.
As shown in fig. 9, the game scene rendering apparatus includes: a processor 501, a storage medium 502, and a bus 503.
The storage medium 502 is used for storing a program, and the processor 501 calls the program stored in the storage medium 502 to execute the method embodiments corresponding to Figs. 1-6. The specific implementation and technical effects are similar and are not described herein again.
Optionally, the present application also provides a program product, such as a storage medium, on which a computer program is stored; when executed by a processor, the program performs the embodiments corresponding to the above-described method.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to perform some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Claims (11)
1. A method for rendering a game scene, the method comprising:
acquiring a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene;
according to the first material rendering parameter, performing material rendering on the direct light irradiation area on the object to be rendered by adopting a physical rendering method;
and according to the second material rendering parameter, performing material rendering on the region which is not irradiated by the direct light on the object to be rendered by adopting a physical rendering method.
2. The method of claim 1, wherein the obtaining a first material rendering parameter and a second material rendering parameter of an object to be rendered by a material shader of the object to be rendered in the game scene comprises:
obtaining the first material rendering parameter through a first configuration area on a configuration interface of the material shader;
and acquiring the second material rendering parameter through a second configuration area on a configuration interface of the material shader.
3. The method of claim 2, wherein obtaining the first texture rendering parameter via a first configuration area on a configuration interface of the texture shader comprises:
receiving a material mapping input through the first configuration area;
reading the first material rendering parameter from a preset channel in the material map; and the preset channel in the material mapping records material rendering parameters of corresponding types.
4. The method of claim 3, wherein reading the first material rendering parameter from a predetermined channel in the material map comprises:
a first metallization value is read from an R channel of the material map, and a first roughness value is read from a G channel of the material map.
5. The method of claim 2, wherein obtaining the second texture rendering parameters through a second configuration area on a configuration interface of the texture shader comprises:
receiving preset type material rendering parameters input through the second configuration area, wherein the second material rendering parameters comprise: and the preset type of material rendering parameters.
6. The method of claim 5, wherein the second configuration region has therein: at least one input box of preset type material rendering parameters;
the receiving of the preset type of material rendering parameters input through the second configuration area includes:
receiving the material rendering parameters of each preset type input through the input box of the material rendering parameters of each preset type: the second material rendering parameters include: the at least one preset type of material rendering parameter.
7. The method of claim 6, wherein the at least one preset type of material rendering parameter comprises: a second metal value, and/or a second roughness value.
8. The method according to any one of claims 1 to 7, wherein performing material rendering on the region of the object to be rendered that is not irradiated by the direct light by using a physics-based rendering method according to the second material rendering parameter comprises:
configuring highlight items of indirect light of a rendering pipeline of the physics-based rendering method according to the second material rendering parameters;
and performing material rendering on the region which is not directly irradiated by the direct light on the object to be rendered based on the configured rendering pipeline.
9. A game scene rendering apparatus, the apparatus comprising: an acquisition module and a rendering module, wherein:
the acquisition module is used for acquiring a first material rendering parameter and a second material rendering parameter of an object to be rendered through a material shader of the object to be rendered in a game scene;
the rendering module is used for rendering the material of the direct light irradiation area on the object to be rendered by adopting a physical rendering method according to the first material rendering parameter; and according to the second material rendering parameter, performing material rendering on the region which is not irradiated by the direct light on the object to be rendered by adopting a physical rendering method.
10. A game scene rendering apparatus, characterized in that the apparatus comprises: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the game scene rendering device is running, the processor executing the machine-readable instructions to perform the method of any one of claims 1-8.
11. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, performs the method of any of the preceding claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011075527.2A CN112190936B (en) | 2020-10-09 | 2020-10-09 | Game scene rendering method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112190936A true CN112190936A (en) | 2021-01-08 |
CN112190936B CN112190936B (en) | 2024-10-29 |
Family
ID=74013229
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011075527.2A Active CN112190936B (en) | 2020-10-09 | 2020-10-09 | Game scene rendering method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112190936B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113052947A (en) * | 2021-03-08 | 2021-06-29 | 网易(杭州)网络有限公司 | Rendering method, rendering device, electronic equipment and storage medium |
CN114581575A (en) * | 2021-12-13 | 2022-06-03 | 北京市建筑设计研究院有限公司 | Model rendering processing method and device and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150130805A1 (en) * | 2013-11-11 | 2015-05-14 | Oxide Interactive, LLC | Method and system of anti-aliasing shading decoupled from rasterization |
US20170236322A1 (en) * | 2016-02-16 | 2017-08-17 | Nvidia Corporation | Method and a production renderer for accelerating image rendering |
CN107103638A (en) * | 2017-05-27 | 2017-08-29 | 杭州万维镜像科技有限公司 | A kind of Fast rendering method of virtual scene and model |
US20180227568A1 (en) * | 2017-02-07 | 2018-08-09 | Siemens Healthcare Gmbh | Lightfield rendering based on depths from physically-based volume rendering |
CN108434742A (en) * | 2018-02-02 | 2018-08-24 | 网易(杭州)网络有限公司 | The treating method and apparatus of virtual resource in scene of game |
CN110193193A (en) * | 2019-06-10 | 2019-09-03 | 网易(杭州)网络有限公司 | The rendering method and device of scene of game |
CN111068312A (en) * | 2019-12-02 | 2020-04-28 | 网易(杭州)网络有限公司 | Game picture rendering method and device, storage medium and electronic equipment |
CN111127624A (en) * | 2019-12-27 | 2020-05-08 | 珠海金山网络游戏科技有限公司 | Illumination rendering method and device based on AR scene |
CN111210497A (en) * | 2020-01-16 | 2020-05-29 | 网易(杭州)网络有限公司 | Model rendering method and device, computer readable medium and electronic equipment |
CN111467807A (en) * | 2020-05-18 | 2020-07-31 | 网易(杭州)网络有限公司 | Snow melting effect rendering method and device, electronic equipment and storage medium |
- 2020-10-09 CN CN202011075527.2A patent/CN112190936B/en active Active
Non-Patent Citations (1)
Title |
---|
张帆: "《Unity5.X游戏开发基础》", 31 May 2017, 浙江工业大学出版社, pages: 215 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113052947A (en) * | 2021-03-08 | 2021-06-29 | 网易(杭州)网络有限公司 | Rendering method, rendering device, electronic equipment and storage medium |
CN113052947B (en) * | 2021-03-08 | 2022-08-16 | 网易(杭州)网络有限公司 | Rendering method, rendering device, electronic equipment and storage medium |
CN114581575A (en) * | 2021-12-13 | 2022-06-03 | 北京市建筑设计研究院有限公司 | Model rendering processing method and device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN112190936B (en) | 2024-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112316420B (en) | Model rendering method, device, equipment and storage medium | |
US8803879B1 (en) | Omnidirectional shadow texture mapping | |
CN107749077B (en) | Card ventilation grid shadow rendering method, device, equipment and medium | |
CN112215934A (en) | Rendering method and device of game model, storage medium and electronic device | |
CN112884874B (en) | Method, device, equipment and medium for applying applique on virtual model | |
CN111760277B (en) | Illumination rendering method and device | |
CN111768473B (en) | Image rendering method, device and equipment | |
CN113674389A (en) | Scene rendering method and device, electronic equipment and storage medium | |
CN112190936A (en) | Game scene rendering method, device, equipment and storage medium | |
US6791544B1 (en) | Shadow rendering system and method | |
JP4430678B2 (en) | Programmable filtering method and apparatus for texture map data in a three-dimensional graphics subsystem | |
CN112274934B (en) | Model rendering method, device, equipment and storage medium | |
WO2023098358A1 (en) | Model rendering method and apparatus, computer device, and storage medium | |
CN111199573A (en) | Virtual-real mutual reflection method, device, medium and equipment based on augmented reality | |
CN114742931A (en) | Method and device for rendering image, electronic equipment and storage medium | |
CN111383320A (en) | Virtual model processing method, device, equipment and storage medium | |
US11804008B2 (en) | Systems and methods of texture super sampling for low-rate shading | |
KR20100075351A (en) | Method and system for rendering mobile computer graphic | |
CN115845369A (en) | Cartoon style rendering method and device, electronic equipment and storage medium | |
CN114596403B (en) | Image processing method, device, storage medium and terminal | |
WO2022042003A1 (en) | Three-dimensional coloring method and apparatus, and computing device and storage medium | |
CN114367105A (en) | Model coloring method, device, apparatus, medium, and program product | |
CN109074673A (en) | Pass through the constant multiplication of the texture cell of graphics processing unit | |
CN117333598B (en) | 3D model rendering system and method based on digital scene | |
CN115761087A (en) | Model rendering method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |