CN115526977B - Game picture rendering method and device - Google Patents

Game picture rendering method and device

Info

Publication number
CN115526977B
CN115526977B
Authority
CN
China
Prior art keywords
rendering
information
light
screen
game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211285899.7A
Other languages
Chinese (zh)
Other versions
CN115526977A (en)
Inventor
洪晓健
靖超
何允恒
胡佳
贺智艺
张健
田国刚
李小航
滕泽榞
姜凯伦
张继生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Changyou Chuangxiang Software Technology Co ltd
Original Assignee
Beijing Changyou Chuangxiang Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Changyou Chuangxiang Software Technology Co ltd filed Critical Beijing Changyou Chuangxiang Software Technology Co ltd
Priority to CN202211285899.7A priority Critical patent/CN115526977B/en
Publication of CN115526977A publication Critical patent/CN115526977A/en
Application granted granted Critical
Publication of CN115526977B publication Critical patent/CN115526977B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)

Abstract

The application provides a game picture rendering method and device. The method includes: obtaining picture data to be rendered in a game; constructing, based on a plurality of first tiles into which the screen is divided and on light source data, a first light list for each first tile using a light culling algorithm; extracting material information of each object from the object's rendering attribute information; if the object is determined to be a first-type object based on its material information, rendering the object in a forward rendering mode based on the first light list of each first tile and the depth, normal, and material information of the object; if the object is determined to be a second-type object based on its material information, caching the rendering attribute information of the object in a plurality of geometry buffers, and rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers. This scheme improves the rendering quality of the game picture while maintaining rendering performance.

Description

Game picture rendering method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a game picture rendering method and device.
Background
With the continuous development of game technology, the variety of games running on mobile terminals such as mobile phones is increasing.
For some game applications running on a mobile terminal, the terminal needs to render the game pictures of the application and output them to its display screen. However, rendering game pictures places high demands on hardware such as video memory, and the hardware performance of mobile terminals is relatively limited and may not meet high-performance rendering requirements, so rendering quality is often sacrificed to preserve performance.
Disclosure of Invention
The application provides a game picture rendering method and device, which can improve the rendering quality of a game picture while maintaining rendering performance.
In one aspect, the present application provides a game screen rendering method, applied to a mobile terminal, including:
obtaining picture data to be rendered in a game, wherein the picture data comprises: light source data of a light source and rendering attribute information of at least one non-transparent object in a game screen;
constructing, based on a plurality of first tiles into which the screen is divided and the light source data, a first light list for each first tile using a light culling algorithm;
extracting material information of the object from the rendering attribute information of the object;
if the object is determined to be a first-type object based on its material information, extracting depth information and normal information of the object from its rendering attribute information, and rendering the object in a forward rendering mode based on the first light list of each first tile and the depth, normal, and material information of the object;
and if the object is determined to be a second-type object based on its material information, caching the rendering attribute information of the object in a plurality of geometry buffers, and rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers.
In one possible implementation, before constructing the first light list of each first tile, the method further includes:
determining the type of the graphics processor in the mobile terminal;
determining, from configured light culling algorithms suited to different types of graphics processors, a target light culling algorithm suited to the graphics processor in the mobile terminal;
the constructing of the first light list of each first tile based on the plurality of first tiles into which the screen is divided and the light source data, using a light culling algorithm, includes:
constructing, in the graphics processor of the mobile terminal, the first light list of each first tile using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
In yet another possible implementation, rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the plurality of geometry buffers includes:
determining, based on the object materials to which each of a plurality of sets of deferred rendering code applies, in combination with the material information of the object, a target deferred rendering code applicable to the object, where the target deferred rendering code belongs to the plurality of sets of deferred rendering code;
and invoking the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
In yet another possible implementation, the picture data further includes: volumetric fog data, corresponding to each pixel in the game picture, for volumetric fog rendering;
after obtaining the picture data to be rendered in the game, the method further includes:
constructing, based on a plurality of second tiles into which the screen is divided and the light source data, a second light list for each second tile using a light culling algorithm, where the area of a second tile is larger than that of a first tile;
generating an initial volume texture image based on the volumetric fog data of each pixel in the game picture, where the volumetric fog data is associated with each pixel in the initial volume texture image;
and determining the volumetric light effect of the screen based on the initial volume texture image, the second light lists, and the light source data.
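To make the two tile granularities above concrete, the following sketch (an illustrative assumption; the actual tile sizes are implementation choices not fixed by this application) shows how a coarser second-tile grid yields far fewer light lists for the volumetric fog pass than the fine first-tile grid used for object shading:

```python
# Illustrative sketch: the second tiles used for volumetric fog are larger
# than the first tiles used for object shading, so the fog pass maintains
# far fewer per-tile light lists. The tile sizes below are hypothetical.

def tile_count(screen_w, screen_h, tile_size):
    # Ceiling division: partial tiles at the screen edge still count.
    tiles_x = -(-screen_w // tile_size)
    tiles_y = -(-screen_h // tile_size)
    return tiles_x * tiles_y

first_tiles = tile_count(1920, 1080, 16)   # fine grid: first light lists
second_tiles = tile_count(1920, 1080, 64)  # coarse grid: second light lists
```

With these hypothetical sizes, one second tile covers sixteen first tiles, so the volumetric-fog pass traverses roughly one sixteenth as many light lists.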
In yet another possible implementation, the method further includes:
determining the volumetric fog color of the screen based on the volumetric fog data of each pixel in the game picture;
and superimposing the volumetric fog color on the volumetric light effect of the screen.
In yet another possible implementation, the picture data further includes: depth information and normal information of the game picture;
before determining the volumetric light effect of the screen, the method further includes:
constructing a plurality of depth maps with different resolutions based on the depth information of the game picture;
determining the screen-space reflection effect of the screen by combining the plurality of depth maps with the normal information of the game picture;
and determining the ambient occlusion effect of each pixel in the screen based on the depth information of the game picture.
In yet another possible implementation, the picture data further includes: sky data of the sky in the game picture and rendering attribute information of at least one transparent object;
the method further includes:
rendering the sky of the game scene based on the sky data;
and rendering the transparent object based on the rendering attribute information of the transparent object.
In still another aspect, the present application further provides a game screen rendering device, applied to a mobile terminal, including:
a data obtaining unit configured to obtain picture data to be rendered in a game, the picture data including: light source data of a light source and rendering attribute information of at least one non-transparent object in a game picture;
a first construction unit configured to construct, based on a plurality of first tiles into which the screen is divided and the light source data, a first light list for each first tile using a light culling algorithm;
a material extraction unit configured to extract material information of the object from the rendering attribute information of the object;
a first object rendering unit configured to, if the object is determined to be a first-type object based on its material information, extract depth information and normal information of the object from its rendering attribute information, and render the object in a forward rendering mode based on the first light list of each first tile and the depth, normal, and material information of the object;
and a second object rendering unit configured to, if the object is determined to be a second-type object based on its material information, cache the rendering attribute information of the object in a plurality of geometry buffers, and render the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers.
In one possible implementation, the device further includes:
a type determining unit configured to determine the type of the graphics processor in the mobile terminal before the first construction unit constructs the first light list of each first tile;
a culling algorithm determining unit configured to determine, from configured light culling algorithms suited to different types of graphics processors, a target light culling algorithm suited to the graphics processor in the mobile terminal;
the first construction unit includes:
a first construction subunit configured to construct, in the graphics processor of the mobile terminal, the first light list of each first tile using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
In yet another possible implementation, the second object rendering unit includes:
a code determining unit configured to determine, based on the object materials to which each of a plurality of sets of deferred rendering code applies, in combination with the material information of the object, a target deferred rendering code applicable to the object, where the target deferred rendering code belongs to the plurality of sets of deferred rendering code;
and a rendering code invoking unit configured to invoke the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
As can be seen from the above, after obtaining the picture data to be rendered in a game, the application constructs a light list for each tile, using a light culling algorithm, based on the plurality of tiles into which the screen is divided and the light source data in the picture data. On this basis, for a non-transparent object in the game picture to be rendered, if the object is determined, based on its material information, to be a first-type object suited to forward rendering, it is rendered in a forward rendering mode in combination with the light lists; if it is determined to be a second-type object suited to deferred rendering, it is rendered in a deferred rendering mode in combination with the light lists. Different rendering modes are thus chosen according to the rendering requirements of different objects, effectively ensuring the rendering quality of objects in the game picture. Moreover, compared with rendering all objects with only tile-based forward rendering or only deferred rendering, using the two rendering pipelines, tile-based forward rendering and tile-based deferred rendering, to render different objects on the mobile terminal exploits the respective advantages of the two rendering modes, maintaining rendering performance while helping to ensure the rendering quality of the game picture.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings described below show only embodiments of the present application, and that a person skilled in the art can obtain other drawings from the provided drawings without inventive effort.
Fig. 1 is a schematic flow chart of a game screen rendering method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a game screen rendering method according to an embodiment of the present application;
fig. 3 is a schematic diagram showing a composition structure of a game screen rendering device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of the present application.
Fig. 1 is a schematic flow chart of a game picture rendering method provided in an embodiment of the present application. The method of this embodiment is applied to a mobile terminal; the mobile terminal may be a mobile phone, a tablet computer, or the like, which is not limited here.
The method of the embodiment can comprise the following steps:
s101, obtaining picture data to be rendered in the game.
It is understood that the picture data to be rendered refers to picture data of a game picture to be output. The screen data may include various data required to render a game screen.
For example, the screen data of the game screen may be model data of a game model, or may include information about a game scene, etc., without limitation.
It will be appreciated that the game screen includes objects such as characters, animals, or articles in the game, and therefore the screen data includes at least rendering attribute information of the objects in the game. The rendering attribute information of the object is basic information required for rendering the object, for example, the rendering attribute information of the object may include some or all of information such as a type, a name, depth information, normal information, and material information of the object, which is not limited.
It can be understood that rendering the non-transparent objects in a game picture involves more considerations than rendering transparent objects, so this embodiment focuses mainly on the non-transparent objects. Accordingly, in this embodiment the picture data includes at least rendering attribute information of at least one non-transparent object in the game picture. The rendering attribute information of a non-transparent object includes material information of the object, such as a material identifier, which characterizes the material of the object. Of course, the rendering attribute information may also include the other information mentioned above, which is not limited here.
In addition, the picture data further includes the light source data of the light sources in the game picture, such as the number of light sources and the kind and position of each light source, which is not limited here.
Of course, the picture data also includes other information of the corresponding scene of the game picture, which is not described herein.
S102, constructing a first light list for each first tile using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and the light source data.
The screen is logically divided into a plurality of tiles, each tile being a small rectangular region of the screen.
In this application, for ease of distinction, the tiles involved in rendering non-transparent objects in this embodiment are referred to as first tiles, and correspondingly, the light list constructed for a first tile is referred to as a first light list.
The first light list of each first tile may include the light information of the light sources that affect that tile; the light information may be some or all of the information in the light source data of those light sources.
By constructing the first light list of each first tile with a light culling algorithm, there is no need to blindly run shading and other render-pass processing for every light source (i.e., every light); only the lights that actually affect the game scene are traversed, and the effect of each such light is accumulated on the tiles it affects. This reduces the number of lights that must be processed during shading, and thus the amount of data to process.
Many light culling algorithms are possible in this application; for example, the Fine Pruned Tiled Lighting (FPTL) algorithm, or a cluster-based light culling algorithm, among others, which is not limited here.
It will be appreciated that performing light culling and building the light lists on the central processing unit (CPU) offers a low degree of parallelism and relatively low efficiency compared with the graphics processing unit (GPU). For this reason, the light culling algorithm may be executed in the GPU of the mobile terminal to construct the first light list of each first tile.
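As a concrete illustration of per-tile light culling, the following sketch (an assumption for illustration, not the patent's actual GPU implementation; in practice this would run as a compute shader) builds a light list for each screen tile by testing which lights' screen-space spheres of influence overlap each tile. The function and field names are hypothetical:

```python
# Illustrative sketch of tile-based light culling. Lights are modeled as
# screen-space circles (center + radius of influence); each tile keeps only
# the indices of the lights whose circle overlaps that tile.

def build_tile_light_lists(screen_w, screen_h, tile_size, lights):
    """lights: list of dicts with 'x', 'y' (screen position) and 'radius'."""
    tiles_x = (screen_w + tile_size - 1) // tile_size
    tiles_y = (screen_h + tile_size - 1) // tile_size
    light_lists = {}
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            x0, y0 = tx * tile_size, ty * tile_size
            x1 = min(x0 + tile_size, screen_w)
            y1 = min(y0 + tile_size, screen_h)
            tile_list = []
            for i, light in enumerate(lights):
                # Distance from light center to the tile rectangle: clamp the
                # center into the rectangle, then measure to the clamped point.
                cx = min(max(light["x"], x0), x1)
                cy = min(max(light["y"], y0), y1)
                dist_sq = (light["x"] - cx) ** 2 + (light["y"] - cy) ** 2
                if dist_sq <= light["radius"] ** 2:
                    tile_list.append(i)  # this light affects the tile
            light_lists[(tx, ty)] = tile_list
    return light_lists
```

During shading, each fragment then iterates only over the (usually short) light list of its own tile instead of over every light in the scene.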
Through research, the inventors of the present application found that building a light list requires multiple threads and thread groups to construct buffers for the shader, which raises compatibility issues between the GPU, as a parallel processor, and its ability to handle complex data computation.
Through further extensive research, the inventors found that different GPUs suit different light culling algorithms, and that choosing a suitable light culling algorithm resolves the incompatibility between the mobile terminal's parallel processor and complex data computation. On this basis, the present application determines, through extensive testing, the light culling algorithms suited to different types of GPU.
For example, the GPU of vendor A may be better suited to the FPTL algorithm, while the GPU of vendor B may be better suited to cluster-based light culling.
Based on this, the application may first determine the type of the graphics processor in the mobile terminal, and then determine, from the configured light culling algorithms suited to different types of graphics processors, a target light culling algorithm suited to the GPU in the mobile terminal. Correspondingly, the first light list of each first tile can be constructed in the GPU of the mobile terminal using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
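The per-GPU selection above amounts to a preconfigured lookup. The following sketch shows one way it could be organized; the vendor names and the mapping are hypothetical examples, not claims about real hardware:

```python
# Illustrative sketch: choosing a target light culling algorithm from a
# preconfigured table keyed by GPU type. Vendor names are hypothetical.

CULLING_ALGORITHM_BY_GPU = {
    "vendor_a_gpu": "FPTL",     # fine pruned tiled lighting
    "vendor_b_gpu": "Cluster",  # cluster-based light culling
}

def select_culling_algorithm(gpu_type, default="FPTL"):
    # Fall back to a default algorithm for GPU types not in the table.
    return CULLING_ALGORITHM_BY_GPU.get(gpu_type, default)
```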
S103, for each non-transparent object to be rendered, extracting the material information of the object from the rendering attribute information of the object.
The material information of an object may include information characterizing the object's material; for example, it may be a material identifier, such as a unique name or number of the material. Of course, the material information may further include the material characteristics of the object and the like, which is not limited here.
S104, if the object is determined to be a first-type object based on its material information, extracting the depth information and normal information of the object from its rendering attribute information, and rendering the object in a forward rendering mode based on the first light list of each first tile and the depth, normal, and material information of the object.
In this application, considering that objects of different materials produce different effects and have different special requirements when rendered, the rendering modes suited to different objects naturally also differ. To balance hardware performance against rendering quality, the rendering mode for each object must therefore be chosen reasonably according to its material. Unlike existing rendering pipelines that deploy only a single rendering mode, deferred rendering and forward rendering can here be used side by side.
On this basis, the application can determine in advance the rendering modes suited to objects of different materials. For a non-transparent object to be rendered, the type of the object can then be determined from its material information, where a first-type object is one suited to forward rendering and a second-type object is one suited to deferred rendering.
If the material information indicates that the object is a first-type object, the object can be forward-rendered based on its depth, normal, and material information in combination with the first light list of each first tile. Rendering a first-type object is essentially tile-based forward rendering, which combines forward rendering with per-tile light culling to reduce the number of lights in the shading process. Since light culling reduces the lighting each object must iterate over, this mode both preserves rendering quality and improves performance.
It can be appreciated that after extracting the depth information and normal information of a first-type object, the application caches the depth, normal, and material information of the object in a buffer associated with the game picture to be rendered, so that forward rendering can be performed from this information when the object needs to be rendered.
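The material-based routing of each opaque object to one of the two pipelines can be sketched as follows. The material classification table is a hypothetical example; the patent leaves the concrete mapping between materials and pipelines to the implementation:

```python
# Illustrative sketch: routing each non-transparent object to a rendering
# pipeline by its material. The material sets below are hypothetical.

FORWARD_MATERIALS = {"skin", "hair", "anisotropic_cloth"}  # first-type objects
DEFERRED_MATERIALS = {"standard_lit", "rock", "metal"}     # second-type objects

def choose_pipeline(material_id):
    if material_id in FORWARD_MATERIALS:
        return "forward"    # first-type: tile-based forward rendering
    if material_id in DEFERRED_MATERIALS:
        return "deferred"   # second-type: tile-based deferred rendering
    return "deferred"       # default for materials with no special needs
```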
S105, if the object is determined to be a second-type object based on its material information, caching the rendering attribute information of the object in a plurality of geometry buffers, and rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers.
Different geometry buffers (G-buffers) store different rendering attribute information of the object.
For example, in one possible implementation, for a second-type object the application may construct four geometry buffers: a first, a second, a third, and a fourth geometry buffer. The first geometry buffer caches the base color and specular occlusion information of the object; the second caches its normal information and roughness information; the third caches its metallic information, rendering-layer information, shadow mask information, and material identification code; and the fourth caches its baked diffuse lighting information or self-illumination (emissive) color information.
The information cached in the four geometry buffers all belongs to the rendering attribute information of the object.
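The four-buffer layout described above can be sketched as follows. The field names are assumptions for illustration; a real implementation packs these channels into render-target textures written by the G-buffer pass, not Python objects:

```python
# Illustrative sketch of the four-G-buffer layout. Field names are assumed.
from dataclasses import dataclass

@dataclass
class GBuffer0:
    base_color: tuple         # primary (base) color
    specular_occlusion: float

@dataclass
class GBuffer1:
    normal: tuple
    roughness: float

@dataclass
class GBuffer2:
    metallic: float
    render_layer: int
    shadow_mask: int
    material_id: int          # material identification code

@dataclass
class GBuffer3:
    baked_diffuse_or_emissive: tuple  # baked diffuse lighting OR emissive color

def cache_object_attributes(attrs):
    """Split one object's rendering attribute dict across the four buffers."""
    return (
        GBuffer0(attrs["base_color"], attrs["specular_occlusion"]),
        GBuffer1(attrs["normal"], attrs["roughness"]),
        GBuffer2(attrs["metallic"], attrs["render_layer"],
                 attrs["shadow_mask"], attrs["material_id"]),
        GBuffer3(attrs.get("emissive") or attrs.get("baked_diffuse")),
    )
```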
It is understood that if the material information indicates that the object is a second-type object, the application may perform deferred rendering based on the light list of each first tile. Deferred rendering effectively reduces lighting computation and can thus greatly improve object rendering performance; combining deferred rendering with per-tile light lists reduces lighting computation further, and computing one tile at a time also cuts wasted input and output.
On this basis, for objects with low rendering requirements, or whose materials need no special rendering treatment, the application uses deferred rendering rather than forward rendering, which helps improve rendering performance.
It will be appreciated that, among second-type objects, the materials of different objects still differ, and objects of different materials differ in how deferred rendering should treat them. To perform deferred rendering more efficiently and reasonably, the application may therefore preconfigure a plurality of sets of deferred rendering code, and preconfigure at least one object material to which each set applies.
For each object belonging to the second type, the application can then determine the target deferred rendering code applicable to the object from the object materials to which each set of deferred rendering code applies, in combination with the material information of the object; the target deferred rendering code belongs to the plurality of sets. Correspondingly, the target deferred rendering code may be invoked to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
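The selection of a deferred-rendering code set by material can be sketched as follows. The code-set names and their material lists are hypothetical examples, not part of the patent:

```python
# Illustrative sketch: each preconfigured deferred-rendering code set declares
# the object materials it handles; an object's material picks the target set.

DEFERRED_CODE_SETS = {
    "deferred_default": {"standard_lit", "rock"},
    "deferred_foliage": {"grass", "leaves"},
}

def select_deferred_code(material_id):
    for code_name, materials in DEFERRED_CODE_SETS.items():
        if material_id in materials:
            return code_name  # target deferred rendering code for this object
    raise KeyError(f"no deferred code set configured for material {material_id!r}")
```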
As can be seen from the above, after obtaining the picture data to be rendered in a game, the application constructs a light list for each tile, using a light culling algorithm, based on the plurality of tiles into which the screen is divided and the light source data in the picture data. On this basis, for a non-transparent object in the game picture to be rendered, if the object is determined, based on its material information, to be a first-type object suited to forward rendering, it is rendered in a forward rendering mode in combination with the light lists; if it is determined to be a second-type object suited to deferred rendering, it is rendered in a deferred rendering mode in combination with the light lists. Different rendering modes are thus chosen according to the rendering requirements of different objects, effectively ensuring the rendering quality of objects in the game picture. Moreover, compared with rendering all objects with only tile-based forward rendering or only deferred rendering, running the two rendering pipelines, tile-based forward rendering and tile-based deferred rendering, in parallel on the mobile terminal to render different objects exploits the respective advantages of the two rendering modes, maintaining rendering performance while enhancing the rendering quality of the game picture.
It will be appreciated that, in addition to non-transparent objects, a game picture may involve transparent objects. The picture data therefore also includes the rendering attribute information of the transparent objects. Accordingly, after rendering the non-transparent objects, the method may further include: rendering each transparent object based on its rendering attribute information.
It will also be appreciated that, in addition to objects such as items and characters, the game picture involves rendering the game scene environment, such as the sky and the background.
Based on this, the picture data in the application may further include the volumetric fog data required for volumetric fog rendering, and correspondingly, rendering the game picture may further include rendering the volumetric fog effect of the screen and the like. Further, rendering the game picture may also include rendering the screen-space reflection effect, the ambient occlusion effect, and so on.
The rendering of a game picture according to the present application is described below in connection with one implementation.
Fig. 2 is a schematic flow chart of a game picture rendering method according to an embodiment of the present application. The method of this embodiment may include:
S201, obtaining picture data to be rendered in the game.
The picture data includes light source data of light sources in the game picture, as well as rendering attribute information of transparent objects and opaque objects in the game picture.
The picture data further includes: volume fog data for volume fog rendering corresponding to each pixel point in the game picture, depth information and normal information of the game picture, and sky data of the sky in the game picture.
The volume fog data is the data required for rendering volume fog into the game picture. The sky data refers to the data required for rendering the sky in the scene corresponding to the game picture, such as the size and number of clouds in the sky and the sky color mode, which is not limited here.
S202, extracting the material information of the object from the rendering attribute information of the object.
S203, if the material information of the object represents that the object belongs to the non-transparent first type object, extracting the depth information and the normal information of the object from the rendering attribute information of the object, and caching the depth information, the normal information and the material information of the object.
S204, if the material information of the object indicates that the object belongs to a non-transparent second type object, caching the rendering attribute information of the object into a plurality of geometric buffers (G-buffers).
This embodiment is described taking as an example that the material information of the object is extracted and whether the object is a transparent object is determined based on that material information. The embodiment applies equally, however, if whether the object is transparent is determined in some other way.
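As an illustrative sketch only (the field names and routing criteria below are assumptions for illustration, not taken from the application), the material-based routing of steps S202 to S204 can be expressed as:

```python
# Route each object, based on its material info, to the forward path (cache
# depth/normal/material), the deferred path (write into G-buffers), or the
# later transparent pass. All keys here are hypothetical placeholders.

forward_cache = []
gbuffers = {"albedo": [], "normal_rough": [], "material": []}

def route_object(obj):
    mat = obj["material"]           # S202: extract material information
    if mat.get("transparent"):
        return "transparent"        # handled after the opaque passes
    if mat.get("complex_shading"):  # first type object: forward rendering
        forward_cache.append({k: obj[k] for k in ("depth", "normal", "material")})
        return "forward"
    for key in gbuffers:            # second type object: deferred rendering
        gbuffers[key].append(obj.get(key))
    return "deferred"

kinds = [route_object(o) for o in (
    {"material": {"complex_shading": True}, "depth": 0.3, "normal": (0, 0, 1)},
    {"material": {}, "albedo": (1, 0, 0), "normal_rough": (0, 0, 1, 0.5)},
    {"material": {"transparent": True}},
)]
```

The point of the routing is only that each opaque object lands in exactly one of the two pipelines, while transparent objects are deferred to a later pass.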
The above steps S203 and S204 may be referred to the related description of the previous embodiments, and will not be repeated here.
S205, constructing a plurality of depth maps with different resolutions based on the depth information of the game picture.
There are many possibilities for constructing the depth map here, and there is no limitation to this.
For example, in one possible implementation, at least one Hierarchical Z-buffer (HIZ) depth map is generated based on depth information of a game frame.
In this embodiment, the purpose of constructing the depth maps is the subsequent determination of the screen-space reflection effect and the ambient occlusion effect of the screen; if neither effect is enabled in the game, step S205 may be skipped.
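For illustration, a minimal sketch of a Hierarchical-Z (HiZ) mip chain as in step S205 follows. The max-reduction convention is an assumption (a min-reduction is equally common), and the pure-Python nested lists stand in for GPU depth textures:

```python
# Build a HiZ chain: each level halves the resolution and stores, per texel,
# the farthest depth of the 2x2 block below it, so coarse levels can be used
# for conservative rejection in SSR / occlusion queries.

def build_hiz(depth, levels):
    """depth: square 2D list of depths in [0, 1]; returns [level0, level1, ...]."""
    mips = [depth]
    for _ in range(levels):
        prev = mips[-1]
        n = len(prev) // 2
        mips.append([[max(prev[2*y][2*x],   prev[2*y][2*x+1],
                          prev[2*y+1][2*x], prev[2*y+1][2*x+1])
                      for x in range(n)] for y in range(n)])
    return mips

base = [[0.1, 0.2, 0.3, 0.4],
        [0.5, 0.6, 0.7, 0.8],
        [0.9, 0.1, 0.2, 0.3],
        [0.4, 0.5, 0.6, 0.7]]
mips = build_hiz(base, 2)  # 4x4 -> 2x2 -> 1x1
```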
S206, constructing a first light list for each first block, using a light culling algorithm, based on the first blocks into which the screen is divided and the light source data.
This step S206 may be referred to in the related description of the previous embodiment, and will not be described herein.
S207, constructing a second light list for each second block, using a light culling algorithm, based on the second blocks into which the screen is divided and the light source data.
The area of the second block is larger than that of the first block; that is, the second block is the larger of the two block sizes.
For example, a second block may be composed of four or more first blocks; the first block may be 32×32 pixels in size, and the second block 64×64 pixels.
It is to be understood that, in step S207, the process of constructing the second light list of each second block is the same as that of constructing the first light list of each first block except for the block size; reference is made to the foregoing description, which is not repeated here.
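A minimal sketch of the per-block light-list construction in steps S206 and S207 follows, assuming point lights reduced to screen-space bounding circles; a real light culling pass would instead test per-tile frusta against light volumes on the GPU, so this is illustrative only:

```python
# Tile-based light culling sketch: assign each light to every screen tile its
# screen-space bounding box overlaps. Lights are (cx, cy, radius) in pixels.

def build_tile_light_lists(screen_w, screen_h, tile_size, lights):
    """Returns {tile_index: [light ids]} over a tile_size x tile_size grid."""
    tiles_x = (screen_w + tile_size - 1) // tile_size
    tiles_y = (screen_h + tile_size - 1) // tile_size
    tile_lists = {t: [] for t in range(tiles_x * tiles_y)}
    for light_id, (cx, cy, r) in enumerate(lights):
        # Clamp the light's bounding box to the tile grid.
        x0 = max(0, int((cx - r) // tile_size))
        x1 = min(tiles_x - 1, int((cx + r) // tile_size))
        y0 = max(0, int((cy - r) // tile_size))
        y1 = min(tiles_y - 1, int((cy + r) // tile_size))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lists[ty * tiles_x + tx].append(light_id)
    return tile_lists

# 128x128 screen, 32x32 tiles (a 4x4 grid): a small light near the origin
# lands only in the top-left tile; a large central light reaches every tile.
lists = build_tile_light_lists(128, 128, 32, [(10, 10, 5), (64, 64, 40)])
```

The second light list of step S207 is obtained by calling the same routine with a larger `tile_size` (e.g. 64), which is why the construction differs only in block size.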
Optionally, the present application may further determine a plurality of three-dimensional units into which the view frustum corresponding to the screen is divided, each three-dimensional unit being a cluster, and construct, based on the light source data in the picture data, a per-cluster light source list for the subsequent determination of the volume fog effect and the like.
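As a hedged sketch of such a cluster division, the common approach of an X×Y screen grid combined with exponential depth slices might look as follows; the slicing formula below is one conventional choice and an assumption here, not something specified by the application:

```python
# Clustered (3D) light binning sketch: the frustum is split into screen tiles
# plus exponential depth slices, and each (tile, depth) pair maps to a cluster.

import math

def depth_slice(view_z, near, far, num_slices):
    """Exponential slicing: slice i starts at near * (far/near) ** (i / num_slices)."""
    s = math.log(view_z / near) / math.log(far / near) * num_slices
    return min(num_slices - 1, max(0, int(s)))

def cluster_index(tile_x, tile_y, view_z, tiles_x, tiles_y, near, far, slices):
    z = depth_slice(view_z, near, far, slices)
    return (z * tiles_y + tile_y) * tiles_x + tile_x

# With near=0.1, far=100 and 16 slices: a point at the near plane falls in
# slice 0, a point just inside the far plane falls in the last slice.
s_near = depth_slice(0.1, 0.1, 100.0, 16)
s_far = depth_slice(99.9, 0.1, 100.0, 16)
```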
S208, generating an initial volume texture image based on the volume fog data of each pixel point in the game picture.
Wherein, each pixel point in the initial volume texture image is associated with volume fog data.
Each pixel point in the initial volume texture image has a one-to-one correspondence with each pixel point in the game picture (or screen), so that when the initial volume texture image is generated, the volume fog data of each pixel point in the game picture can be associated with it.
Wherein the initial volume texture image is generated for subsequent volume fog effect rendering.
It will be appreciated that in practical applications, it may be detected whether the game has turned on the volumetric light effect or the volumetric fog effect, and if the volumetric fog effect or the volumetric light effect is not turned on in the game, the relevant operations such as this step S208 and the subsequent determination of the volumetric light effect need not be performed.
Steps S202 to S208 are basic preparatory work for the subsequent rendering of objects, volume fog, screen-space reflection and the like; their order is not limited in practical applications.
S209, determining the screen-space reflection effect of the screen in combination with the plurality of depth maps and the normal information of the game picture.
The screen-space reflection effect of the screen may include the reflection effect corresponding to each pixel point in the screen (i.e., each pixel point corresponding to the game picture).
For example, a ray-stepping process may be performed in combination with the plurality of depth maps and the normal information of the game picture, to determine the sampled color of the reflected light at each pixel in the screen, and so on.
There are many different algorithms for determining the spatial reflection effect of the screen, and there is no limitation to this.
For example, the spatial reflection effect of the screen may be determined based on a screen space reflection (Screen Space Reflection, SSR) algorithm, or alternatively, based on a screen space plane reflection (Screen Space Planar Reflections, SSPR) algorithm.
It will be appreciated that, if it is detected that the screen-space reflection effect is not enabled in the game, step S209 may be skipped; step S209 and the preceding construction of the depth maps need only be performed when it is determined that the screen-space reflection effect is enabled in the game.
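To illustrate the ray-stepping idea behind SSR, a deliberately simplified one-dimensional sketch follows; a real implementation marches a reflected ray in screen space against the HiZ depth maps, so the 1D depth buffer, fixed step and names here are illustrative assumptions:

```python
# March a ray across a 1D "depth buffer" and report the first pixel where the
# ray passes behind the stored surface depth (a hit), or None on a miss.

def ray_march(depth_buf, start_x, start_z, dir_x, dir_z, steps):
    x, z = float(start_x), start_z
    for _ in range(steps):
        x += dir_x
        z += dir_z
        px = int(x)
        if px < 0 or px >= len(depth_buf):
            return None              # ray left the screen
        if z >= depth_buf[px]:
            return px                # ray is behind the surface: sample here
    return None

# A "wall" at pixel 5 (depth 0.4): a ray rising 0.1 in depth per pixel first
# exceeds the stored depth at that wall; a shallower ray never hits it.
depth_buf = [1.0, 1.0, 1.0, 1.0, 1.0, 0.4, 0.4, 0.4]
hit = ray_march(depth_buf, 0, 0.0, 1.0, 0.1, 16)
miss = ray_march(depth_buf, 0, 0.0, 1.0, 0.05, 16)
```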
S210, determining the ambient light shielding effect of each pixel point in the screen based on the depth information of the game picture.
The ambient light shielding (ambient occlusion) effect represents the shielding of ambient light by the surrounding environment in the game picture.
In the present application, there are many possible specific implementations for determining the ambient light shielding effect of each pixel point in the screen, which is not limited.
To make the game picture loaded with the ambient occlusion effect more realistic, the present application may calculate the ambient occlusion of each pixel point in the screen based on the Ground-Truth Ambient Occlusion (GTAO) algorithm. GTAO adds a cosine weight on top of the Horizon-Based Ambient Occlusion (HBAO) algorithm (cosine-weighted sampling is a technique for sampling a hemisphere, commonly used, for example, to sample incident directions in path tracing), thereby achieving a more realistic ambient occlusion effect.
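The cosine weighting mentioned above can be illustrated by cosine-weighted hemisphere sampling via Malley's method (sample the unit disk uniformly, then project onto the hemisphere, giving directions with pdf cos(θ)/π). This is a generic sketch of the sampling technique, not the GTAO implementation of the application:

```python
# Cosine-weighted hemisphere sampling: u1, u2 are uniform random numbers in
# [0, 1). The returned direction is a unit vector in the +z hemisphere.

import math, random

def cosine_weighted_sample(u1, u2):
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    x, y = r * math.cos(phi), r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))   # cos(theta), always >= 0
    return (x, y, z)

random.seed(0)
samples = [cosine_weighted_sample(random.random(), random.random())
           for _ in range(2000)]
# Sanity check: the mean z of cosine-weighted samples approaches E[cos theta] = 2/3.
mean_z = sum(s[2] for s in samples) / len(samples)
```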
S211, determining the volume light effect of the screen based on the initial volume texture image, the second light list and the light source data.
Volume light is a very common lighting effect in games, mainly used to represent the shafts of light that leak through the light-transmitting parts of an object when light strikes an occluder. It is called volume light because it conveys a strong visual sense of volume.
In an alternative way, if it is confirmed that the volume light or the volume fog effect is turned on in the game, this step S211 is performed; if the volumetric light or volumetric fog effect is not turned on in the game, this step S211 need not be performed.
S212, determining the volume fog color of the screen based on the volume fog data of each pixel point in the game picture, and superposing the volume fog color on the volume light effect of the screen.
It will be appreciated that the volume light effect itself carries no fog color; by superimposing the volume fog color on the volume light effect, a volume fog effect with a fog color is obtained.
Superimposing the volume fog color on the volume light effect is, in effect, rendering the volume fog effect. Its benefit is that clouds and fog in the game show a more natural appearance than texture maps would, making both the light and the clouds look more realistic.
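The superposition of step S212 can be sketched as a transmittance-based blend of a per-pixel fog color over the lit pixel. A Beer-Lambert transmittance is assumed here for illustration; the application does not specify the blend:

```python
# Blend fog over a pixel: out = scene * T + fog_color * (1 - T), where the
# transmittance T = exp(-density * distance) (Beer-Lambert, assumed).

import math

def composite_fog(scene_rgb, fog_rgb, fog_density, distance):
    t = math.exp(-fog_density * distance)   # transmittance in (0, 1]
    return tuple(s * t + f * (1.0 - t) for s, f in zip(scene_rgb, fog_rgb))

# Zero density leaves the pixel untouched; dense fog converges to the fog color.
clear = composite_fog((1.0, 0.5, 0.2), (0.7, 0.7, 0.8), 0.0, 50.0)
dense = composite_fog((1.0, 0.5, 0.2), (0.7, 0.7, 0.8), 5.0, 50.0)
```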
S213, for a non-transparent object belonging to the first type, rendering the object in a forward rendering manner based on the first light list of each first block and the cached depth information, normal information and material information of the object.
S214, for a non-transparent object belonging to the second type, rendering the object in a deferred rendering manner based on the first light list of each first block and the information cached in the plurality of geometric buffers.
Steps S213 and S214 may be understood with reference to the description of the previous embodiment, and are not repeated here.
It is understood that the order of steps S213 and S214 is not limited. Likewise, the order of steps S213 and S214 relative to the preceding steps S209 to S212 is not limited to that shown in fig. 2; in practical applications, steps S213 and S214 may be performed first, followed by steps S209 to S212.
And S215, rendering the sky of the game picture based on the sky data.
It will be appreciated that, in practice, rendering the parts of a game picture other than the objects may also involve determining a color level map of the game picture for use in refraction and reflection, to ensure that brightness is not lost at low resolution; other rendering processes may of course also be involved, which is not limited here.
S216, for a transparent object determined based on the material information of the object, rendering the transparent object based on its rendering attribute information.
The specific rendering of the transparent object can be set according to the requirement, for example, conventional forward rendering can be adopted, and the specific rendering is not limited.
It will be appreciated that post-processing rendering may also be performed after this step S216, and the above rendering results are finally loaded to present a game screen.
It can be understood that in the embodiment of the present application, the spatial reflection effect, the volumetric fog effect, and the like of the screen may be turned on or not in the game according to the needs, so as to flexibly configure the corresponding rendering pipeline, and the corresponding rendering effect may be presented or not according to the actual needs.
Corresponding to the game picture rendering method, the application also provides a game picture rendering device. Fig. 3 is a schematic diagram showing a composition structure of a game screen rendering device according to an embodiment of the present application, where the device is applied to a mobile terminal. The apparatus of this embodiment may include:
a data obtaining unit 301, configured to obtain picture data to be rendered in a game, where the picture data includes: light source data of a light source and rendering attribute information of at least one non-transparent object in a game screen;
a first construction unit 302, configured to construct a first light list for each first block, using a light culling algorithm, based on a plurality of first blocks into which the screen is divided and the light source data;
a material extraction unit 303, configured to extract material information of the object from rendering attribute information of the object;
a first object rendering unit 304, configured to, if the object is determined to be a first type object based on its material information, extract depth information and normal information of the object from its rendering attribute information, and render the object in a forward rendering manner based on the first light list of each first block and the depth information, normal information and material information of the object;
and a second object rendering unit 305, configured to, if the object is determined to be a second type object based on its material information, cache the rendering attribute information of the object into a plurality of geometric buffers, and render the object in a deferred rendering manner based on the first light list of each first block and the information cached in the plurality of geometric buffers.
In one possible implementation, the apparatus further includes:
a type determining unit, configured to determine the type of the graphics processor in the mobile terminal before the first construction unit constructs the first light list of each first block;
a culling algorithm determining unit, configured to determine, from the configured light culling algorithms applicable to different types of graphics processors, a target light culling algorithm applicable to the graphics processor in the mobile terminal;
the first building unit includes:
the first construction subunit is configured to construct, in the graphics processor of the mobile terminal, a first light list of each first partition by using the target light rejection algorithm based on a plurality of first partitions divided by a screen and the light source data.
In yet another possible implementation manner, the second object rendering unit includes:
a code determining unit, configured to determine a target deferred rendering code applicable to the object, based on the materials to which each set of deferred rendering code applies in combination with the material information of the object, the target deferred rendering code belonging to the multiple sets of deferred rendering code;
and a rendering code calling unit, configured to call the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first block and the information cached in the plurality of geometric buffers.
In still another possible implementation manner, the picture data obtained by the data obtaining unit further includes: volume fog data corresponding to each pixel point in the game picture and used for volume fog rendering;
the apparatus further comprises:
a second construction unit, configured to, after the data obtaining unit obtains the picture data to be rendered in the game, construct a second light list for each second block, using a light culling algorithm, based on a plurality of second blocks into which the screen is divided and the light source data, the area of the second block being larger than that of the first block;
the texture generation unit is used for generating an initial volume texture image based on the volume fog data of each pixel point in the game picture, wherein the volume fog data is associated with each pixel point in the initial volume texture image;
and a volume light determining unit, configured to determine the volume light effect of the screen based on the initial volume texture image, the second light list and the light source data.
Further, the apparatus may further include:
a fog color determining unit, configured to determine a volume fog color of a screen based on volume fog data of each pixel point in the game screen;
and the volume fog determining unit is used for superposing the volume fog color on the volume light effect of the screen.
In still another possible implementation manner, the picture data obtained by the data obtaining unit further includes: depth information and normal information of the game picture;
the apparatus further comprises:
a depth map construction unit, configured to construct a plurality of depth maps of different resolutions based on the depth information of the game picture, before the volume light determining unit determines the volume light effect of the screen;
a reflection determining unit, configured to determine the screen-space reflection effect of the screen in combination with the plurality of depth maps and the normal information of the game picture;
and an occlusion determining unit, configured to determine the ambient occlusion effect of each pixel point in the screen based on the depth information of the game picture.
Further, the picture data obtained by the data obtaining unit further includes: sky data of sky in the game picture and rendering attribute information of at least one transparent object;
The apparatus further comprises:
a sky rendering unit for rendering sky of the game picture based on the sky data;
and a third object rendering unit for rendering the transparent object based on the rendering attribute information of the transparent object.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described as different from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other. Meanwhile, the features described in the embodiments of the present specification may be replaced with or combined with each other to enable those skilled in the art to make or use the present application. For the apparatus class embodiments, the description is relatively simple as it is substantially similar to the method embodiments, and reference is made to the description of the method embodiments for relevant points.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises an element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. A game screen rendering method, which is applied to a mobile terminal, comprising:
obtaining picture data to be rendered in a game, wherein the picture data comprises: light source data of a light source and rendering attribute information of at least one non-transparent object in a game screen;
based on a plurality of first blocks into which a screen is divided and the light source data, constructing a first light list for each first block using a light culling algorithm, wherein the first blocks refer to the blocks involved in rendering the non-transparent object;
Extracting material information of the object from rendering attribute information of the object;
if the object is determined to be a first type object based on the material information of the object, extracting depth information and normal information of the object from the rendering attribute information of the object, and rendering the object in a forward rendering manner based on the first light list of each first block and the depth information, normal information and material information of the object;
and if the object is determined to be a second type object based on the material information of the object, caching the rendering attribute information of the object into a plurality of geometric buffers, and rendering the object in a deferred rendering manner based on the first light list of each first block and the information cached in the plurality of geometric buffers.
2. The method of claim 1, further comprising, prior to said constructing the first light list for each of said first blocks:
determining the type of a graphics processor in the mobile terminal;
determining, from configured light culling algorithms applicable to different types of graphics processors, a target light culling algorithm applicable to the graphics processor in the mobile terminal;
wherein said constructing a first light list for each first block using a light culling algorithm, based on the plurality of first blocks into which the screen is divided and the light source data, comprises:
constructing, in the graphics processor of the mobile terminal, a first light list for each first block using the target light culling algorithm, based on the plurality of first blocks into which the screen is divided and the light source data.
3. The method of claim 1, wherein said rendering the object in a deferred rendering manner based on the first light list of each of the first blocks and the information cached in the plurality of geometric buffers comprises:
determining a target deferred rendering code applicable to the object, based on the materials to which each set of deferred rendering code applies in combination with the material information of the object, the target deferred rendering code belonging to multiple sets of deferred rendering code;
and calling the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first block and the information cached in the plurality of geometric buffers.
4. The method of claim 1, wherein the picture data further comprises: volume fog data corresponding to each pixel point in the game picture and used for volume fog rendering;
After the picture data to be rendered in the game is obtained, the method further comprises the following steps:
based on a plurality of second blocks into which the screen is divided and the light source data, constructing a second light list for each second block using a light culling algorithm, wherein the area of the second block is larger than that of the first block;
generating an initial volume texture image based on the volume fog data of each pixel point in the game picture, wherein the volume fog data is associated with each pixel point in the initial volume texture image;
and determining the volume light effect of the screen based on the initial volume texture image, the second light list and the light source data.
5. The method as recited in claim 4, further comprising:
determining the volume fog color of a screen based on the volume fog data of each pixel point in the game picture;
and superposing the volume fog color on the volume light effect of the screen.
6. The method of claim 4, wherein the picture data further comprises: depth information and normal information of the game picture;
before said determining the volumetric light effect of the screen, further comprising:
constructing a plurality of depth maps with different resolutions based on the depth information of the game picture;
Combining the multiple depth maps and normal information of the game picture to determine the space reflection effect of the screen;
and determining the ambient light shielding effect of each pixel point in the screen based on the depth information of the game picture.
7. The method of claim 1 or 4, wherein the picture data further comprises: sky data of sky in the game picture and rendering attribute information of at least one transparent object;
the method further comprises the steps of:
rendering the sky of the game picture based on the sky data;
rendering the transparent object based on rendering attribute information of the transparent object.
8. A game screen rendering apparatus, which is applied to a mobile terminal, comprising:
a data obtaining unit configured to obtain picture data to be rendered in a game, the picture data including: light source data of a light source and rendering attribute information of at least one non-transparent object in a game screen;
a first construction unit, configured to construct a first light list for each first block, using a light culling algorithm, based on a plurality of first blocks into which a screen is divided and the light source data, wherein the first blocks refer to the blocks involved in rendering the non-transparent object;
A material extraction unit, configured to extract material information of the object from rendering attribute information of the object;
a first object rendering unit, configured to, if the object is determined to be a first type object based on the material information of the object, extract depth information and normal information of the object from the rendering attribute information of the object, and render the object in a forward rendering manner based on the first light list of each first block and the depth information, normal information and material information of the object;
and a second object rendering unit, configured to, if the object is determined to be a second type object based on the material information of the object, cache the rendering attribute information of the object into a plurality of geometric buffers, and render the object in a deferred rendering manner based on the first light list of each first block and the information cached in the plurality of geometric buffers.
9. The apparatus as recited in claim 8, further comprising:
a type determining unit, configured to determine the type of the graphics processor in the mobile terminal before the first construction unit constructs the first light list of each first block;
a culling algorithm determining unit, configured to determine, from configured light culling algorithms applicable to different types of graphics processors, a target light culling algorithm applicable to the graphics processor in the mobile terminal;
wherein the first construction unit comprises:
a first construction subunit, configured to construct, in the graphics processor of the mobile terminal, a first light list for each first block using the target light culling algorithm, based on the plurality of first blocks into which the screen is divided and the light source data.
10. The apparatus of claim 8, wherein the second object rendering unit comprises:
a code determining unit, configured to determine a target deferred rendering code applicable to the object, based on the materials to which each set of deferred rendering code applies in combination with the material information of the object, the target deferred rendering code belonging to multiple sets of deferred rendering code;
and a rendering code calling unit, configured to call the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first block and the information cached in the plurality of geometric buffers.
CN202211285899.7A 2022-10-20 2022-10-20 Game picture rendering method and device Active CN115526977B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211285899.7A CN115526977B (en) 2022-10-20 2022-10-20 Game picture rendering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211285899.7A CN115526977B (en) 2022-10-20 2022-10-20 Game picture rendering method and device

Publications (2)

Publication Number Publication Date
CN115526977A CN115526977A (en) 2022-12-27
CN115526977B true CN115526977B (en) 2023-07-21

Family

ID=84703071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211285899.7A Active CN115526977B (en) 2022-10-20 2022-10-20 Game picture rendering method and device

Country Status (1)

Country Link
CN (1) CN115526977B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10803655B2 (en) * 2012-06-08 2020-10-13 Advanced Micro Devices, Inc. Forward rendering pipeline with light culling
CN108236783B (en) * 2018-01-09 2020-10-23 网易(杭州)网络有限公司 Method and device for simulating illumination in game scene, terminal equipment and storage medium
CN108564646B (en) * 2018-03-28 2021-02-26 腾讯科技(深圳)有限公司 Object rendering method and device, storage medium and electronic device
CN114782613A (en) * 2022-04-29 2022-07-22 北京字跳网络技术有限公司 Image rendering method, device and equipment and storage medium

Also Published As

Publication number Publication date
CN115526977A (en) 2022-12-27

Similar Documents

Publication Publication Date Title
US11645810B2 (en) Method for continued bounding volume hierarchy traversal on intersection without shader intervention
US20210027525A1 (en) Forward rendering pipeline with light culling
WO2022111619A1 (en) Image processing method and related apparatus
US8115763B2 (en) Device for the photorealistic representation of dynamic, complex, three-dimensional scenes by means of ray tracing
Szirmay-Kalos et al. Approximate ray-tracing on the gpu with distance impostors
US10049486B2 (en) Sparse rasterization
US10055883B2 (en) Frustum tests for sub-pixel shadows
KR20010085424A (en) Graphics processor with deferred shading
CN113900797B (en) Three-dimensional oblique photography data processing method, device and equipment based on illusion engine
US10600232B2 (en) Creating a ray differential by accessing a G-buffer
WO2022143367A1 (en) Image rendering method and related device therefor
WO2023142607A1 (en) Image rendering method and apparatus, and device and medium
Harada et al. Forward+: A step toward film-style shading in real time
CN112041894B (en) Enhancing realism of a scene involving a water surface during rendering
Harada A 2.5 D culling for forward+
CN115526977B (en) Game picture rendering method and device
Di Koa et al. Interactive rendering of translucent materials under area lights using voxels and Poisson disk samples
WO2024027237A1 (en) Rendering optimization method, and electronic device and computer-readable storage medium
Ruff et al. Dynamic per Object Ray Caching Textures for Real-Time Ray Tracing
CN117671104A (en) Rendering method, rendering device, electronic equipment and computer readable storage medium
Li et al. Stage Lighting Simulation Based on Epipolar Sampling
WO2021185771A1 (en) System and method for real-time ray tracing in a 3d environment
WO2024086382A1 (en) Methods and systems for rendering video graphics using scene segmentation
WO2022131949A1 (en) A device for performing a recursive rasterization
WO2023285161A1 (en) System and method for real-time ray tracing in a 3d environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant