CN115526977A - Game picture rendering method and device - Google Patents


Info

Publication number
CN115526977A
CN115526977A (application CN202211285899.7A)
Authority
CN
China
Prior art keywords
rendering
information
light
list
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211285899.7A
Other languages
Chinese (zh)
Other versions
CN115526977B (en)
Inventor
洪晓健
靖超
何允恒
胡佳
贺智艺
张健
田国刚
李小航
滕泽榞
姜凯伦
张继生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Changyou Chuangxiang Software Technology Co ltd
Original Assignee
Beijing Changyou Chuangxiang Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Changyou Chuangxiang Software Technology Co ltd filed Critical Beijing Changyou Chuangxiang Software Technology Co ltd
Priority to CN202211285899.7A
Publication of CN115526977A
Application granted
Publication of CN115526977B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 1/00 General purpose image data processing
    • G06T 1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The application provides a game picture rendering method and device, the method comprising: acquiring picture data to be rendered in a game; constructing a first light list for each first tile using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and on light source data; extracting material information of an object from the object's rendering attribute information; if the object is determined to be a first-type object based on its material information, rendering the object in a forward rendering mode based on the first light list of each first tile and on the object's depth information, normal information, and material information; and if the object is determined to be a second-type object based on its material information, caching the object's rendering attribute information into a plurality of geometry buffers and rendering the object in a deferred rendering mode based on the first light list of each first tile and on the information cached in the geometry buffers. With this scheme, the rendering quality of the game picture can be improved while rendering performance is preserved.

Description

Game picture rendering method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for rendering a game screen.
Background
With the development of game technology, the variety of games played on mobile terminals such as mobile phones keeps growing.
For game applications running on a mobile terminal, the terminal must render the game pictures and finally output them to its display screen. Rendering a game picture, however, places heavy demands on hardware such as video memory, and the hardware of a mobile terminal is comparatively limited and may not meet high-quality rendering requirements.
Disclosure of Invention
The application provides a game picture rendering method and device, which aim to improve the rendering quality of a game picture while preserving rendering performance.
In one aspect, the present application provides a game picture rendering method applied to a mobile terminal, including:
obtaining picture data to be rendered in a game, wherein the picture data comprises: light source data of the light sources in the game picture and rendering attribute information of at least one opaque object;
constructing a first light list for each first tile using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and on the light source data;
extracting material information of the object from the rendering attribute information of the object;
if the object is determined to be a first-type object based on the material information of the object, extracting depth information and normal information of the object from the rendering attribute information of the object, and rendering the object in a forward rendering mode based on the first light list of each first tile and on the depth information, normal information, and material information of the object;
and if the object is determined to be a second-type object based on the material information of the object, caching the rendering attribute information of the object into a plurality of geometry buffers, and rendering the object in a deferred rendering mode based on the first light list of each first tile and on the information cached in the geometry buffers.
In a possible implementation manner, before constructing the first light list of each first tile, the method further includes:
determining the type of graphics processor in the mobile terminal;
determining, from the light culling algorithms configured for different types of graphics processors, a target light culling algorithm suited to the graphics processor in the mobile terminal;
wherein constructing a first light list for each first tile using a light culling algorithm, based on the plurality of first tiles into which the screen is divided and on the light source data, comprises:
constructing, in the graphics processor of the mobile terminal, a first light list for each first tile using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and on the light source data.
In another possible implementation manner, rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the plurality of geometry buffers includes:
determining a target deferred rendering code applicable to the object, based on the object materials to which each of multiple sets of deferred rendering code is suited, in combination with the material information of the object, wherein the target deferred rendering code belongs to the multiple sets of deferred rendering code;
and calling the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
In yet another possible implementation manner, the picture data further includes: volume fog data, corresponding to each pixel in the game picture, used for volume fog rendering;
after obtaining the picture data to be rendered in the game, the method further includes:
constructing a second light list for each second tile using a light culling algorithm, based on a plurality of second tiles into which the screen is divided and on the light source data, wherein the area of each second tile is larger than that of each first tile;
generating an initial volume texture image based on the volume fog data of each pixel in the game picture, wherein volume fog data are associated with each pixel of the initial volume texture image;
determining a volumetric light effect of the screen based on the initial volume texture image, the second light lists, and the light source data.
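As a rough illustration of the volumetric-light setup above, the sketch below re-divides the screen into the coarser second tiles and sizes the initial volume texture accordingly. The tile edge length and the number of depth slices are invented for illustration; the patent only requires the second tiles to be larger than the first tiles.

```python
# Hypothetical sizes: a coarser "second" tile and an assumed slice count.
SECOND_TILE = 64   # coarser than the first tiles, per the text (assumed value)
DEPTH_SLICES = 32  # assumed number of depth slices in the volume texture

def volume_texture_dims(screen_w, screen_h):
    """Return (x, y, z) cell counts of the initial volume texture,
    one cell per second tile in x/y plus assumed depth slices in z."""
    vx = (screen_w + SECOND_TILE - 1) // SECOND_TILE  # ceil division
    vy = (screen_h + SECOND_TILE - 1) // SECOND_TILE
    return vx, vy, DEPTH_SLICES

dims = volume_texture_dims(1920, 1080)
```

For a 1920x1080 screen this yields a 30x17x32 volume, small enough that per-cell fog data and the second light lists stay cheap to evaluate on a mobile GPU.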
In another possible implementation manner, the method further includes:
determining the volume fog color of the screen based on the volume fog data of each pixel in the game picture;
and superimposing the volume fog color on the volumetric light effect of the screen.
In yet another possible implementation manner, the picture data further includes: depth information and normal information of the game picture;
before determining the volumetric light effect of the screen, the method further comprises:
constructing a plurality of depth maps of different resolutions based on the depth information of the game picture;
determining a screen-space reflection effect by combining the plurality of depth maps with the normal information of the game picture;
and determining an ambient occlusion effect for each pixel of the screen based on the depth information of the game picture.
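The plurality of depth maps of different resolutions can be pictured as a mip-style pyramid built from the full-resolution depth buffer. The sketch below is an assumption about how such a pyramid might be built, not the patent's implementation: each level halves the resolution and keeps, per 2x2 block, the maximum depth; whether max, min, or average is appropriate depends on the ray-march direction of the reflection trace.

```python
def build_depth_pyramid(depth, levels):
    """depth: 2D list (rows of floats) at full resolution; dimensions are
    assumed to be divisible by 2 at every level. Returns a list of levels,
    level 0 being the input."""
    pyramid = [depth]
    for _ in range(levels - 1):
        src = pyramid[-1]
        h, w = len(src), len(src[0])
        # Conservative 2x2 max-reduce (an assumption; min/avg also common).
        dst = [[max(src[y][x], src[y][x + 1],
                    src[y + 1][x], src[y + 1][x + 1])
                for x in range(0, w, 2)]
               for y in range(0, h, 2)]
        pyramid.append(dst)
    return pyramid

pyr = build_depth_pyramid([[1.0, 2.0], [3.0, 4.0]], 2)
```

Coarse levels let a screen-space reflection ray skip large empty regions quickly, refining to finer levels only near a potential hit.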
In yet another possible implementation manner, the picture data further includes: sky data of the sky in the game picture and rendering attribute information of at least one transparent object;
the method further comprises:
rendering the sky of the game picture based on the sky data;
and rendering the transparent object based on the rendering attribute information of the transparent object.
In another aspect, the present application further provides a game picture rendering apparatus applied to a mobile terminal, including:
a data obtaining unit, configured to obtain picture data to be rendered in a game, where the picture data includes: light source data of the light sources in the game picture and rendering attribute information of at least one opaque object;
a first building unit, configured to construct a first light list for each first tile using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and on the light source data;
a material extraction unit, configured to extract material information of the object from the rendering attribute information of the object;
a first object rendering unit, configured to, if the object is determined to be a first-type object based on the material information of the object, extract depth information and normal information of the object from the rendering attribute information of the object, and render the object in a forward rendering mode based on the first light list of each first tile and on the depth information, normal information, and material information of the object;
and a second object rendering unit, configured to, if the object is determined to be a second-type object based on the material information of the object, cache the rendering attribute information of the object into a plurality of geometry buffers, and render the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers.
In one possible implementation manner, the apparatus further includes:
a type determining unit, configured to determine the type of graphics processor in the mobile terminal before the first building unit builds the first light list of each first tile;
a culling algorithm determining unit, configured to determine, from the light culling algorithms configured for different types of graphics processors, a target light culling algorithm suited to the graphics processor in the mobile terminal;
the first building unit comprising:
a first construction subunit, configured to construct, in the graphics processor of the mobile terminal, a first light list for each first tile using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and on the light source data.
In another possible implementation manner, the second object rendering unit includes:
a code determination unit, configured to determine a target deferred rendering code applicable to the object, based on the object materials to which each of multiple sets of deferred rendering code is suited, in combination with the material information of the object, wherein the target deferred rendering code belongs to the multiple sets of deferred rendering code;
and a rendering code calling unit, configured to call the target deferred rendering code to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
Thus, after the picture data to be rendered in the game is obtained, a light list is constructed for each tile using a light culling algorithm, based on the tiles into which the screen is divided and on the light source data in the picture data. On this basis, for an opaque object in the game picture to be rendered, if the object is determined, from its material information, to be a first-type object suited to forward rendering, it is rendered in a forward rendering mode in combination with the light lists; if it is determined to be a second-type object suited to deferred rendering, it is rendered in a deferred rendering mode in combination with the light lists. Different rendering modes can therefore be chosen sensibly according to the rendering requirements of different objects, effectively ensuring the rendering quality of the objects in the game picture. Moreover, rather than rendering all objects with a single pipeline, whether tile-based forward rendering or deferred rendering, the application deploys both rendering pipelines on the mobile terminal to render different objects, exploiting the respective advantages of the two rendering modes: rendering performance is preserved while the rendering quality of the game picture is maintained.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only embodiments of the present application; other drawings can be derived from them by those skilled in the art without creative effort.
FIG. 1 is a schematic flow chart illustrating a game screen rendering method according to an embodiment of the present disclosure;
FIG. 2 is another schematic flow chart of a game picture rendering method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a composition structure of a game screen rendering apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
Fig. 1 is a schematic flow chart of a game picture rendering method provided in an embodiment of the present application. The method of this embodiment is applied to a mobile terminal, which may be, without limitation, a mobile phone or a tablet computer.
The method of the embodiment can comprise the following steps:
s101, obtaining picture data to be rendered in the game.
It is understood that the picture data to be rendered is the picture data of a game picture to be output. The picture data may include the various data required for rendering the game picture.
For example, the picture data of the game picture may be model data of a game model and may include information related to the game scene, without limitation.
It is understood that the game picture contains objects such as characters, animals, or items in the game, so the picture data includes at least the rendering attribute information of the objects in the game. The rendering attribute information of an object is the basic information required to render it; for example, it may include some or all of the object's type, name, depth information, normal information, and material information, without limitation here.
It can be understood that rendering the opaque objects in a game picture involves more factors than rendering the transparent ones; for this reason, the present embodiment focuses on the opaque objects in the game picture. Accordingly, in this embodiment, the picture data includes at least the rendering attribute information of at least one opaque object in the game picture. The rendering attribute information of an opaque object includes its material information, such as a material identifier, which represents information about the object's material. Of course, the rendering attribute information may also include the other information mentioned above, without limitation.
In addition, the picture data further includes the light source data of the light sources in the game picture, such as the number of light sources and the type and position of each, without limitation.
Of course, the picture data may further include other information about the scene corresponding to the game picture, which is not described again here.
S102, constructing a first light list for each first tile using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and on the light source data.
The tiles are a plurality of small blocks into which the screen is logically divided.
In this application, for ease of distinction, the tiles involved in rendering opaque objects in this embodiment are called first tiles, and accordingly, the light list constructed for a first tile is called a first light list.
The first light list of each first tile may include the light information of the light sources affecting that tile; the light information may be some or all of the light source data of those sources.
It can be understood that, by constructing the first light list of each first tile with a light culling algorithm, the shading passes no longer have to process every light source blindly: only the lights that actually affect the game scene are traversed, and their contributions are accumulated per tile. This reduces the number of lights processed during shading and thus the amount of data processing.
In the present application, the light culling algorithm may take various forms; for example, it may be the Fine Pruned Tiled Lighting (FPTL) algorithm or a cluster-based light culling algorithm, without limitation.
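The core of any per-tile culling scheme is the same: intersect each light's bounding volume with each screen tile and record the survivors. The sketch below is a deliberately simplified CPU-side illustration, not the patent's FPTL or cluster code; the tile size and the screen-space circle representation of a light are assumptions.

```python
TILE = 16  # tile edge length in pixels (assumed)

def build_tile_light_lists(screen_w, screen_h, lights):
    """lights: list of dicts with screen-space center (x, y) and radius r.
    Returns {tile_index: [indices of lights affecting that tile]}."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    lists = {t: [] for t in range(tiles_x * tiles_y)}
    for li, light in enumerate(lights):
        # Conservative bound: clamp the light's screen-space AABB to tiles.
        x0 = max(0, int((light["x"] - light["r"]) // TILE))
        x1 = min(tiles_x - 1, int((light["x"] + light["r"]) // TILE))
        y0 = max(0, int((light["y"] - light["r"]) // TILE))
        y1 = min(tiles_y - 1, int((light["y"] + light["r"]) // TILE))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                lists[ty * tiles_x + tx].append(li)
    return lists

lights = [{"x": 8.0, "y": 8.0, "r": 4.0},     # small light inside tile 0
          {"x": 100.0, "y": 8.0, "r": 40.0}]  # large light spanning tiles
lists = build_tile_light_lists(64, 32, lights)
```

At shading time, a pixel in a given tile only iterates over that tile's list instead of over every light in the scene, which is the data-reduction effect the paragraph above describes.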
It can be understood that performing light culling and light-list construction on a Central Processing Unit (CPU) offers a low degree of parallelism and relatively low efficiency compared with a Graphics Processing Unit (GPU). For this reason, the light culling algorithm can be executed, and the first light list of each first tile constructed, on the GPU of the mobile terminal.
Through research, the inventors of the application found that building the light lists requires multiple threads and thread groups writing into buffers for the shaders, which raises compatibility concerns between the GPU as a parallel processor and its ability to handle complex data computation.
The inventors further found, through extensive research, that different GPUs suit different light culling algorithms, and that choosing a suitable algorithm reduces the mismatch between complex data computation and the mobile terminal's parallel processor. On this basis, the light culling algorithms suited to different types of GPU were determined through extensive experiments.
For example, a GPU from manufacturer A may be better suited to the FPTL algorithm, while a GPU from manufacturer B may be better suited to cluster-based light culling.
On this basis, the application can determine the type of graphics processor in the mobile terminal, and then, from the light culling algorithms configured for the different GPU types, select a target light culling algorithm suited to that processor. Accordingly, based on the first tiles into which the screen is divided and on the light source data, the first light list of each first tile can be constructed on the mobile terminal's GPU using the target light culling algorithm.
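The GPU-type-to-algorithm selection amounts to a preconfigured lookup table plus a fallback. The vendor keys and the mapping below are purely illustrative assumptions; the patent only states that different GPUs suit different culling algorithms and that the table is configured in advance.

```python
# Hypothetical configuration table: GPU type -> light culling algorithm.
CULLING_BY_GPU = {
    "vendor_a": "fptl",     # e.g. a GPU family assumed to favor FPTL
    "vendor_b": "cluster",  # e.g. a family assumed to favor clustered culling
}

def pick_culling_algorithm(gpu_type, default="fptl"):
    # Fall back to an assumed default when the GPU type is not configured.
    return CULLING_BY_GPU.get(gpu_type, default)
```

The selected name would then be used to dispatch the corresponding compute pass on the terminal's GPU.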
S103, for each opaque object to be rendered, extracting the material information of the object from the rendering attribute information of the object.
The material information of an object may include information representing its material; for example, it may be a material identifier, such as a unique name or number of the object's material. Of course, the material information may further include information such as the material's characteristics, without limitation.
S104, if the object is determined to be a first-type object based on the material information of the object, extracting the depth information and normal information of the object from the rendering attribute information of the object, and rendering the object in a forward rendering mode based on the first light list of each first tile and on the depth information, normal information, and material information of the object.
In the application, since objects of different materials must achieve different effects, and some carry special requirements, the rendering modes suited to different objects naturally differ. Therefore, to balance hardware performance against rendering quality, the rendering mode for an object should be chosen sensibly according to its material. Unlike existing rendering pipelines that deploy only a single rendering mode, the application can render in parallel with pipelines for both deferred rendering and forward rendering.
On this basis, the rendering mode suited to objects of each material can be determined in advance. Then, for an opaque object to be rendered, the type of the object can be determined from its material information, where first-type objects are those suited to forward rendering and second-type objects are those suited to deferred rendering.
If the material information indicates that the object is a first-type object, the object may be forward rendered, based on its depth, normal, and material information, in combination with the first light list of each first tile. Rendering of first-type objects is essentially tile-based forward rendering, a technique that combines forward rendering with per-tile light culling to reduce the number of lights processed during shading.
It can be understood that, after extracting the depth information and normal information of a first-type object, the application caches the depth, normal, and material information of the object in a buffer associated with the game picture to be rendered, so that when the object needs to be rendered, forward rendering is performed from this information.
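The material-driven choice between the two paths can be sketched as a preconfigured lookup. The material names below are invented for illustration; the patent leaves the concrete assignment of materials to the first or second type to prior configuration.

```python
# Hypothetical set of materials assumed to need forward-only treatment
# (e.g. effects that are awkward to express in a fixed G-buffer layout).
FORWARD_MATERIALS = {"hair", "eye", "cloth_anisotropic"}

def rendering_path(material_id):
    """Return 'tiled_forward' for first-type objects, 'deferred' otherwise."""
    return "tiled_forward" if material_id in FORWARD_MATERIALS else "deferred"
```

Each opaque object is routed once, up front, so both pipelines can run over their respective object sets in the same frame.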
S105, if the object is determined to be a second-type object based on the material information of the object, caching the rendering attribute information of the object into a plurality of geometry buffers, and rendering the object in a deferred rendering mode based on the first light list of each first tile and the information cached in the geometry buffers.
Here, different geometry buffers (G-buffers) store different kinds of rendering attribute information of the object.
For example, in one possible implementation, for second-type objects the application may construct four geometry buffers: a first, second, third, and fourth geometry buffer. The first geometry buffer caches the base color and specular occlusion information of the object; the second caches its normal and roughness information; the third caches its metallic information, rendering layer information, shadow mask information, and material identification code; and the fourth caches its baked diffuse lighting or self-emissive color information.
The information cached in the four geometry buffers all belongs to the rendering attribute information of the object.
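The four-buffer layout described above can be written down as a simple scatter of attributes into named buffers. The field names and grouping below restate the patent's description; the channel packing and any widths are left out, since the patent does not specify them.

```python
# Field grouping taken from the four-G-buffer description above;
# names are English renderings of the listed attributes.
GBUFFER_LAYOUT = {
    "gbuffer0": ["base_color", "specular_occlusion"],
    "gbuffer1": ["normal", "roughness"],
    "gbuffer2": ["metallic", "render_layer", "shadow_mask", "material_id"],
    "gbuffer3": ["baked_diffuse_or_emissive"],
}

def write_gbuffers(attrs):
    """Scatter an object's rendering attributes into the four buffers."""
    return {buf: {k: attrs[k] for k in keys if k in attrs}
            for buf, keys in GBUFFER_LAYOUT.items()}

gb = write_gbuffers({"base_color": (1.0, 0.0, 0.0),
                     "roughness": 0.5,
                     "metallic": 1.0})
```

The deferred lighting pass later reads these buffers back per pixel instead of re-rasterizing the geometry, which is what makes the lighting cost independent of scene complexity.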
It can be understood that if the material information indicates that the object is a second-type object, the application can perform deferred rendering based on the light list of each first tile. Because deferred rendering substantially reduces lighting computation, the rendering performance for the object can be greatly improved; combining deferred rendering with the per-tile light lists reduces lighting computation further, and processing one tile at a time reduces wasted input and output.
On this basis, objects with modest rendering requirements, or without special rendering requirements, are rendered with deferred rendering rather than forward rendering, which helps improve rendering performance.
It will be appreciated that, among second-type objects, the materials of different objects may differ, and objects of different materials may suit different forms of deferred rendering. To perform deferred rendering more efficiently and sensibly, the application may therefore preconfigure multiple sets of deferred rendering code, together with at least one object material to which each set is suited.
On this basis, for each object belonging to the second type, the target deferred rendering code applicable to the object may be determined from the object materials suited to the respective sets of deferred rendering code, in combination with the material information of the object; the target deferred rendering code belongs to those sets. Accordingly, the target deferred rendering code may be called to perform deferred rendering of the object based on the first light list of each first tile and the information cached in the geometry buffers.
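The selection of the target deferred rendering code can be pictured as another material lookup, this time over the preconfigured code sets. The set names and material groupings below are hypothetical; the patent requires only the preconfigured association and a lookup at render time.

```python
# Hypothetical deferred code sets, each suited to a group of materials.
DEFERRED_CODE_SETS = {
    "standard_lit": {"default", "stone", "metal"},
    "skin":         {"skin"},
    "foliage":      {"leaf", "grass"},
}

def pick_deferred_code(material_id):
    """Return the name of the deferred code set suited to this material."""
    for code_set, materials in DEFERRED_CODE_SETS.items():
        if material_id in materials:
            return code_set
    return "standard_lit"  # assumed fallback set
```

The chosen set would then be invoked with the per-tile light lists and the G-buffer contents as inputs.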
With the method above, after the picture data to be rendered in the game is obtained, a light list is constructed for each tile using a light culling algorithm, based on the tiles into which the screen is divided and on the light source data in the picture data. On this basis, for an opaque object in the game picture to be rendered, if the object is determined, from its material information, to be a first-type object suited to forward rendering, it is rendered in a forward rendering mode in combination with the light lists; if it is determined to be a second-type object suited to deferred rendering, it is rendered in a deferred rendering mode in combination with the light lists. Different rendering modes can thus be chosen sensibly according to the rendering requirements of different objects, effectively ensuring the rendering quality of the objects in the game picture. Moreover, rather than rendering all objects with a single pipeline, whether tile-based forward rendering or deferred rendering, the application renders different objects in parallel on the mobile terminal with both pipelines, exploiting the respective advantages of the two rendering modes to preserve rendering performance while improving the rendering quality of the game picture.
It is understood that a game picture may involve transparent objects in addition to opaque ones. Accordingly, the picture data also includes the rendering attribute information of the transparent objects, and after the opaque objects are rendered, the method may further include: rendering the transparent objects based on their rendering attribute information.
It can be understood that, besides objects such as items and characters, a game picture also involves rendering parts of the game scene environment, such as the sky and the background.
Accordingly, the picture data may further include the volume fog data required for volume fog rendering, in which case rendering the game picture may further include rendering the screen's volume fog effect. Rendering the game picture may further include rendering the screen's screen-space reflection effect, ambient occlusion effect, and so on.
The rendering of the game screen of the present application is described below with reference to one implementation.
Fig. 2 shows another schematic flowchart of the game picture rendering method in an embodiment of the present application. The method in this embodiment may include:
S201: obtain the picture data to be rendered in the game.
The picture data includes light source data of the light sources in the game picture, and rendering attribute information of the transparent and non-transparent objects in the game picture.
The picture data further includes: volumetric fog data corresponding to each pixel of the game picture and used for volumetric fog rendering, depth information and normal information of the game picture, and sky data of the sky in the game picture.
The volumetric fog data are the data required for rendering volumetric fog into the game picture. The sky data are the data required for completing the sky in the scene corresponding to the game picture, such as the size and number of clouds and the sky color mode, which is not limited herein.
S202: extract the material information of each object from its rendering attribute information.
S203: if the material information indicates that the object is a non-transparent object of the first type, extract the depth information and normal information of the object from its rendering attribute information, and cache the depth information, normal information, and material information of the object.
S204: if the material information indicates that the object is a non-transparent object of the second type, cache the rendering attribute information of the object into a plurality of geometry buffers (G-buffers).
This embodiment is described with the example of determining, after the material information of an object has been extracted, whether the object is transparent based on that material information; the same applies if whether the object is transparent is determined by other means.
The above steps S203 and S204 can refer to the related description of the previous embodiment, and are not described herein again.
S205: construct a plurality of depth maps of different resolutions based on the depth information of the game picture.
There are many possible ways to construct the depth maps, which is not limited herein.
For example, in one possible implementation, at least one Hierarchical Z-buffer (HiZ) depth map is generated based on the depth information of the game picture.
In this embodiment, the depth maps are constructed in order to subsequently determine the screen-space reflection effect and the ambient occlusion effect of the screen; if neither of these two effects is turned on in the game, step S205 need not be executed.
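The HiZ construction mentioned above can be sketched as a mip-style reduction, where each coarser level stores one depth value per 2x2 block of the finer level. The choice of a max-reduction (rather than min) is an assumption for illustration.

```python
# Sketch of a Hierarchical-Z (HiZ) chain: each level halves the resolution
# and stores the maximum depth of the 2x2 texels it covers. Whether the
# application reduces with max or min is an assumption; both are common.

def build_hiz_chain(depth):
    """depth: square 2D list with power-of-two side. Returns all mip levels."""
    levels = [depth]
    size = len(depth)
    while size > 1:
        prev = levels[-1]
        size //= 2
        levels.append([[max(prev[2*y][2*x],     prev[2*y][2*x + 1],
                            prev[2*y + 1][2*x], prev[2*y + 1][2*x + 1])
                        for x in range(size)] for y in range(size)])
    return levels

chain = build_hiz_chain([[0.1, 0.2], [0.3, 0.4]])
# the coarsest level holds a single texel: the farthest depth in the frame
```

The coarser levels let a ray-march or occlusion query skip large empty regions with a single depth comparison.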
S206: construct a first light list for each first tile by using a light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
For step S206, reference may be made to the related description of the previous embodiment, which is not repeated here.
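Since the culling algorithm itself is described in the earlier embodiment, the following non-limiting sketch only illustrates the general idea of per-tile light culling: each screen tile keeps the point lights whose area of influence overlaps it. Treating lights as screen-space circles is a simplification; real implementations typically cull against per-tile frusta in view space.

```python
# Sketch of screen-space light culling into 32x32-pixel tiles.
# lights: list of (cx, cy, radius) in pixel coordinates (an assumed layout).

TILE = 32

def cull_lights(screen_w, screen_h, lights):
    """Return {(tile_x, tile_y): [light indices overlapping that tile]}."""
    light_lists = {}
    for ty in range(screen_h // TILE):
        for tx in range(screen_w // TILE):
            x0, y0 = tx * TILE, ty * TILE
            x1, y1 = x0 + TILE, y0 + TILE
            hits = []
            for i, (cx, cy, r) in enumerate(lights):
                # closest point on the tile rectangle to the light centre
                nx = min(max(cx, x0), x1)
                ny = min(max(cy, y0), y1)
                if (cx - nx) ** 2 + (cy - ny) ** 2 <= r * r:
                    hits.append(i)
            light_lists[(tx, ty)] = hits
    return light_lists

lists = cull_lights(64, 64, [(16, 16, 10), (200, 200, 5)])
```

When shading, each pixel then iterates only over its own tile's list instead of over every light in the scene.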
S207: construct a second light list for each second tile by using a light culling algorithm, based on the plurality of second tiles into which the screen is divided and the light source data.
The area (that is, the size) of a second tile is larger than that of a first tile.
For example, a second tile may be composed of four or more first tiles: a first tile may be 32 x 32 pixels, while a second tile may be 64 x 64 pixels.
It can be understood that, apart from the tile size, the process of constructing the second light list of each second tile in step S207 is similar to that of constructing the first light list of each first tile, for which reference may be made to the foregoing description; it is not repeated here.
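One non-limiting way to obtain the coarse lists is to merge the fine ones: a light overlapping a 64 x 64 tile must overlap at least one of its four 32 x 32 sub-tiles, so the coarse list is the union of the fine lists. Whether the application reuses the first lists this way, rather than re-culling, is an assumption.

```python
# Sketch: derive each coarse (64x64) tile's light list as the union of the
# light lists of the 2x2 group of fine (32x32) tiles it contains.

def coarsen(fine_lists, fine_tiles_x, fine_tiles_y):
    coarse = {}
    for ty in range(fine_tiles_y // 2):
        for tx in range(fine_tiles_x // 2):
            merged = set()
            for dy in (0, 1):
                for dx in (0, 1):
                    merged.update(fine_lists[(2 * tx + dx, 2 * ty + dy)])
            coarse[(tx, ty)] = sorted(merged)
    return coarse

fine = {(0, 0): [0], (1, 0): [0, 1], (0, 1): [], (1, 1): [1]}
coarse = coarsen(fine, 2, 2)
```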
Optionally, the present application may also determine a plurality of three-dimensional units, each a cluster, into which the view frustum corresponding to the screen is divided and, on this basis, construct a per-cluster (voxel) light list based on the light source data in the picture data, for use in subsequently determining the volumetric fog effect and the like.
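Clustered culling extends the 2-D tiles with depth slices, so each cluster is a tile times a depth range. The exponential depth slicing below is a common choice and an assumption here, not a detail stated in the patent.

```python
# Sketch: map a view-space depth to an exponential depth-slice index, the
# third coordinate of a cluster (the first two being the screen tile).

import math

def depth_slice(z, near, far, slices):
    """Exponential slicing: equal ratios of depth per slice."""
    z = min(max(z, near), far)
    return min(int(slices * math.log(z / near) / math.log(far / near)),
               slices - 1)

s_near = depth_slice(1.0, 1.0, 100.0, 16)    # nearest depth -> slice 0
s_far = depth_slice(100.0, 1.0, 100.0, 16)   # farthest depth -> last slice
```

A light is then inserted into every cluster whose tile and depth range it overlaps, which is what the volumetric passes later sample.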
S208: generate an initial volume texture image based on the volumetric fog data of each pixel of the game picture.
Each texel of the initial volume texture image is associated with volumetric fog data.
The texels of the initial volume texture image correspond one-to-one with the pixels of the game picture (or screen), so that the volumetric fog data of each pixel of the game picture can be associated with the generated initial volume texture image.
The initial volume texture image is generated for the subsequent volumetric fog effect rendering.
It can be understood that, in practical applications, it may first be detected whether the volumetric light or volumetric fog effect is turned on in the game; if neither is turned on, step S208 and the subsequent operations for determining the volumetric light effect need not be performed.
Steps S202 to S208 are preparatory work for the subsequent rendering of objects, volumetric fog, screen-space reflection, and so on; in practical applications, the order of these steps is not limited.
S209: determine the screen-space reflection effect of the screen by combining the plurality of depth maps with the normal information of the game picture.
The screen-space reflection effect of the screen may include the reflection effect corresponding to each pixel of the screen (that is, each pixel of the game picture).
For example, ray marching may be performed by combining the plurality of depth maps with the normal information of the game picture, sampling the color reflected by the light within the screen.
Many different algorithms may be used to determine the screen-space reflection effect, which is not limited herein.
For example, the reflection effect may be determined based on a Screen Space Reflection (SSR) algorithm, or based on a Screen Space Planar Reflection (SSPR) algorithm.
It can be understood that if it is detected that the screen reflection effect is not turned on in the game, step S209 need not be executed; step S209 and the earlier depth-map construction are performed only when it is determined that the screen reflection effect is turned on in the game.
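The ray-marching idea mentioned above can be illustrated with a deliberately simplified 1-D march over a depth buffer: step along the reflected ray until a stored surface depth lies in front of the ray. Real SSR marches in 2-D screen space, typically accelerated by the HiZ chain; this sketch is an illustration, not the patent's algorithm.

```python
# Minimal 1-D screen-space ray march: advance the ray one pixel at a time
# and report the first pixel whose stored depth occludes the ray.

def ray_march(depth_buffer, start_x, start_depth, dir_x, dir_z, max_steps=64):
    x, z = float(start_x), start_depth
    for _ in range(max_steps):
        x += dir_x
        z += dir_z
        px = int(x)
        if px < 0 or px >= len(depth_buffer):
            return None              # ray left the screen: no reflection hit
        if depth_buffer[px] <= z:
            return px                # stored surface is in front: hit
    return None

# A "wall" at pixel 5 with depth 0.5; the ray starts at pixel 0, depth 0.1,
# stepping one pixel and +0.1 depth per step.
hit = ray_march([1.0] * 5 + [0.5] + [1.0] * 4, 0, 0.1, 1.0, 0.1)
```

The color at the hit pixel is then sampled as the reflected color; a miss falls back to, for example, an environment probe.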
S210: determine the ambient occlusion effect of each pixel of the screen based on the depth information of the game picture.
The ambient occlusion effect represents how the surrounding environment in the game picture occludes ambient light.
In the present application, there may be many specific implementations for determining the per-pixel ambient occlusion effect of the screen, which is not limited.
To make a game picture loaded with ambient occlusion look more realistic, the per-pixel ambient occlusion of the screen can be computed based on the Ground-Truth Ambient Occlusion (GTAO) algorithm. On the basis of the Horizon-Based Ambient Occlusion (HBAO) algorithm, GTAO adds a cosine weight (cosine-weighted sampling over the sphere, commonly used when sampling incident directions in path tracing and the like) to achieve a more realistic ambient occlusion effect.
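The effect of the cosine weight can be illustrated with a toy hemisphere integral: visibility samples near the normal contribute more than grazing ones. This is a simplified illustration of the weighting idea only, not the GTAO algorithm itself.

```python
# Sketch of cosine-weighted visibility: integrate visibility over elevation
# bands of the hemisphere, weighting each band by cos(theta)*sin(theta)
# (cosine times the solid-angle factor). 1.0 means fully unoccluded.

import math

def cosine_weighted_ao(occluded):
    """occluded[i] says whether the i-th elevation band is blocked."""
    n = len(occluded)
    num = den = 0.0
    for i, blocked in enumerate(occluded):
        theta = (i + 0.5) * (math.pi / 2) / n   # band's elevation angle
        w = math.cos(theta) * math.sin(theta)   # cosine-weighted solid angle
        den += w
        if not blocked:
            num += w
    return num / den

open_sky = cosine_weighted_ao([False] * 8)              # nothing blocked
half_blocked = cosine_weighted_ao([True] * 4 + [False] * 4)
```

Because of the weighting, blocking the low-elevation (grazing) bands darkens the result less than blocking the bands near the normal would.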
S211: determine the volumetric light effect of the screen based on the initial volume texture image, the second light lists, and the light source data.
Volumetric light is a very common lighting effect in games, mainly used to represent the shafts of light that leak through the transmissive parts of an object when light strikes an occluder. It is called volumetric light because it conveys a strong visual sense of volume.
Optionally, step S211 is performed if it is confirmed that the volumetric light or volumetric fog effect is turned on in the game; if neither is turned on, step S211 need not be performed.
S212: determine the volumetric fog color of the screen based on the volumetric fog data of each pixel of the game picture, and superimpose the fog color on the volumetric light effect of the screen.
It can be understood that the fog color is not reflected in the volumetric light effect itself; superimposing the volumetric fog color on the volumetric light effect yields a volumetric fog effect that carries the fog color.
The volumetric fog effect is thus actually rendered by superimposing the fog color on the volumetric light effect. The purpose of volumetric fog is to give clouds and fog in the game a natural appearance, rather than that of a pasted-on texture, so that both the light and the fog look more realistic.
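The superimposition in step S212 can be sketched as a per-pixel blend of the fog color over the volumetric-light result. Using the fog density as a simple lerp factor is an assumption for illustration; engines often use an extinction (transmittance) term instead.

```python
# Sketch: blend the volumetric-fog color over the volumetric-light color,
# with the fog density (clamped to [0, 1]) as the blend factor.

def composite_fog(light_rgb, fog_rgb, fog_density):
    d = min(max(fog_density, 0.0), 1.0)
    return tuple(l * (1.0 - d) + f * d for l, f in zip(light_rgb, fog_rgb))

pixel = composite_fog((1.0, 0.9, 0.8), (0.5, 0.5, 0.6), 0.5)
```

At density 0 the volumetric-light color passes through unchanged; at density 1 the pixel takes the fog color entirely.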
S213: for non-transparent objects of the first type, render the object in a forward rendering manner, based on the first light list of each first tile and the cached depth information, normal information, and material information of the object.
S214: for non-transparent objects of the second type, render the object in a deferred rendering manner, based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
For steps S213 and S214, reference may be made to the related description of the previous embodiments, which is not limited here.
It can be understood that the order of steps S213 and S214 is not limited. Likewise, the order of steps S213 and S214 relative to steps S209 to S212 is not limited to that shown in Fig. 2; in practical applications, steps S213 and S214 may be executed first, followed by steps S209 to S212.
S215: render the sky of the game picture based on the sky data.
It can be understood that, in practical applications, rendering the parts of the game scene other than objects may also involve determining a color pyramid (a mip chain of the scene color) for the scene, used for refraction and reflection to ensure that brightness is not lost at low resolutions; other rendering processes may of course also be involved, without limitation.
S216: for the transparent objects determined based on the material information of the objects, render each transparent object based on its object data.
The specific rendering of transparent objects may be configured as needed; for example, conventional forward rendering may be used, which is not limited here.
It can be understood that post-processing rendering may also be performed after step S216, and the above rendering results are finally loaded to present the game picture.
It can also be understood that, in the embodiment of the present application, the screen-space reflection effect and the volumetric fog effect can be turned on or off in the game as needed, so that the corresponding rendering pipeline is configured flexibly and the corresponding rendering effect is presented, or not, as required.
The application also provides a game picture rendering device. Fig. 3 is a schematic diagram illustrating a composition structure of a game screen rendering device according to an embodiment of the present disclosure, where the device is applied to a mobile terminal. The apparatus of this embodiment may include:
a data obtaining unit 301, configured to obtain picture data to be rendered in a game, where the picture data includes: light source data of the light sources in the game picture and rendering attribute information of at least one non-transparent object;
a first constructing unit 302, configured to construct a first light list for each first tile by using a light culling algorithm, based on a plurality of first tiles into which the screen is divided and the light source data;
a material extracting unit 303, configured to extract material information of an object from the rendering attribute information of the object;
a first object rendering unit 304, configured to, if the object is determined to be a first-type object based on its material information, extract depth information and normal information of the object from its rendering attribute information, and render the object in a forward rendering manner based on the first light list of each first tile and the depth information, normal information, and material information of the object;
a second object rendering unit 305, configured to, if the object is determined to be a second-type object based on its material information, cache the rendering attribute information of the object into a plurality of geometry buffers, and render the object in a deferred rendering manner based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
In one possible implementation, the apparatus further includes:
a type determining unit, configured to determine the type of the graphics processor in the mobile terminal before the first constructing unit constructs the first light list of each first tile;
a culling algorithm determining unit, configured to determine, according to light culling algorithms configured for different types of graphics processors, the target light culling algorithm applicable to the graphics processor in the mobile terminal;
the first constructing unit including:
a first constructing subunit, configured to construct, in the graphics processor of the mobile terminal, a first light list for each first tile by using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
In another possible implementation, the second object rendering unit includes:
a code determining unit, configured to determine the target deferred-rendering code applicable to the object, based on the object materials to which each of a plurality of sets of deferred-rendering code applies and in combination with the material information of the object, the target deferred-rendering code belonging to the plurality of sets of deferred-rendering code;
a rendering code calling unit, configured to call the target deferred-rendering code to perform deferred rendering of the object, based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
In yet another possible implementation, the picture data obtained by the data obtaining unit further includes: volumetric fog data corresponding to each pixel of the game picture and used for volumetric fog rendering;
the device further includes:
a second constructing unit, configured to construct, after the data obtaining unit obtains the picture data to be rendered in the game, a second light list for each second tile by using a light culling algorithm, based on a plurality of second tiles into which the screen is divided and the light source data, where the area of a second tile is larger than that of a first tile;
a texture generating unit, configured to generate an initial volume texture image based on the volumetric fog data of each pixel of the game picture, each texel of the initial volume texture image being associated with volumetric fog data;
a volumetric light determining unit, configured to determine the volumetric light effect of the screen based on the initial volume texture image, the second light lists, and the light source data.
Further, the device may further include:
a fog color determining unit, configured to determine the volumetric fog color of the screen based on the volumetric fog data of each pixel of the game picture;
a volumetric fog determining unit, configured to superimpose the volumetric fog color on the volumetric light effect of the screen.
In yet another possible implementation, the picture data obtained by the data obtaining unit further includes: depth information and normal information of the game picture;
the device further includes:
a depth map constructing unit, configured to construct a plurality of depth maps of different resolutions based on the depth information of the game picture before the volumetric light determining unit determines the volumetric light effect of the screen;
a reflection determining unit, configured to determine the screen-space reflection effect of the screen by combining the plurality of depth maps with the normal information of the game picture;
an occlusion determining unit, configured to determine the ambient occlusion effect of each pixel of the screen based on the depth information of the game picture.
Further, the picture data obtained by the data obtaining unit may further include: sky data of the sky in the game picture and rendering attribute information of at least one transparent object;
the device further includes:
a sky rendering unit, configured to render the sky of the game picture based on the sky data;
a third object rendering unit, configured to render the transparent object based on the rendering attribute information of the transparent object.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. Also, the features described in the embodiments of the present specification may be replaced or combined with each other to enable one skilled in the art to make or use the present application. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Finally, it should also be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (10)

1. A game picture rendering method, applied to a mobile terminal, comprising:
obtaining picture data to be rendered in a game, the picture data comprising: light source data of a light source and rendering attribute information of at least one non-transparent object in a game picture;
constructing a first light list for each first tile by using a light culling algorithm, based on a plurality of first tiles into which a screen is divided and the light source data;
extracting material information of the object from the rendering attribute information of the object;
if the object is determined to be a first-type object based on the material information of the object, extracting depth information and normal information of the object from the rendering attribute information of the object, and rendering the object in a forward rendering manner based on the first light list of each first tile and the depth information, normal information, and material information of the object; and
if the object is determined to be a second-type object based on the material information of the object, caching the rendering attribute information of the object into a plurality of geometry buffers, and rendering the object in a deferred rendering manner based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
2. The method of claim 1, further comprising, before the constructing of the first light list of each first tile:
determining the type of a graphics processor in the mobile terminal; and
determining, according to light culling algorithms configured for different types of graphics processors, a target light culling algorithm applicable to the graphics processor in the mobile terminal;
wherein the constructing of the first light list of each first tile by using the light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data, comprises:
constructing, in the graphics processor of the mobile terminal, the first light list of each first tile by using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
3. The method of claim 1, wherein the rendering of the object in the deferred rendering manner based on the first light list of each first tile and the information cached in the plurality of geometry buffers comprises:
determining a target deferred-rendering code applicable to the object, based on object materials to which each of a plurality of sets of deferred-rendering code applies and in combination with the material information of the object, the target deferred-rendering code belonging to the plurality of sets of deferred-rendering code; and
calling the target deferred-rendering code to perform deferred rendering of the object, based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
4. The method of claim 1, wherein the picture data further comprises: volumetric fog data corresponding to each pixel of the game picture and used for volumetric fog rendering;
after the obtaining of the picture data to be rendered in the game, the method further comprises:
constructing a second light list for each second tile by using a light culling algorithm, based on a plurality of second tiles into which the screen is divided and the light source data, wherein an area of the second tile is larger than an area of the first tile;
generating an initial volume texture image based on the volumetric fog data of each pixel of the game picture, wherein each texel of the initial volume texture image is associated with volumetric fog data; and
determining a volumetric light effect of the screen based on the initial volume texture image, the second light list, and the light source data.
5. The method of claim 4, further comprising:
determining a volumetric fog color of the screen based on the volumetric fog data of each pixel of the game picture; and
superimposing the volumetric fog color on the volumetric light effect of the screen.
6. The method of claim 4, wherein the picture data further comprises: depth information and normal information of the game picture;
before the determining of the volumetric light effect of the screen, the method further comprises:
constructing a plurality of depth maps of different resolutions based on the depth information of the game picture;
determining a screen-space reflection effect of the screen by combining the plurality of depth maps with the normal information of the game picture; and
determining an ambient occlusion effect of each pixel of the screen based on the depth information of the game picture.
7. The method of claim 1 or 4, wherein the picture data further comprises: sky data of the sky in the game picture and rendering attribute information of at least one transparent object;
the method further comprising:
rendering the sky of the game picture based on the sky data; and
rendering the transparent object based on the rendering attribute information of the transparent object.
8. A game picture rendering device, applied to a mobile terminal, comprising:
a data obtaining unit, configured to obtain picture data to be rendered in a game, the picture data comprising: light source data of a light source and rendering attribute information of at least one non-transparent object in a game picture;
a first constructing unit, configured to construct a first light list for each first tile by using a light culling algorithm, based on a plurality of first tiles into which a screen is divided and the light source data;
a material extracting unit, configured to extract material information of the object from the rendering attribute information of the object;
a first object rendering unit, configured to, if the object is determined to be a first-type object based on the material information of the object, extract depth information and normal information of the object from the rendering attribute information of the object, and render the object in a forward rendering manner based on the first light list of each first tile and the depth information, normal information, and material information of the object; and
a second object rendering unit, configured to, if the object is determined to be a second-type object based on the material information of the object, cache the rendering attribute information of the object into a plurality of geometry buffers, and render the object in a deferred rendering manner based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
9. The device of claim 8, further comprising:
a type determining unit, configured to determine the type of a graphics processor in the mobile terminal before the first constructing unit constructs the first light list of each first tile; and
a culling algorithm determining unit, configured to determine, according to light culling algorithms configured for different types of graphics processors, a target light culling algorithm applicable to the graphics processor in the mobile terminal;
the first constructing unit comprising:
a first constructing subunit, configured to construct, in the graphics processor of the mobile terminal, the first light list of each first tile by using the target light culling algorithm, based on the plurality of first tiles into which the screen is divided and the light source data.
10. The device of claim 8, wherein the second object rendering unit comprises:
a code determining unit, configured to determine a target deferred-rendering code applicable to the object, based on object materials to which each of a plurality of sets of deferred-rendering code applies and in combination with the material information of the object, the target deferred-rendering code belonging to the plurality of sets of deferred-rendering code; and
a rendering code calling unit, configured to call the target deferred-rendering code to perform deferred rendering of the object, based on the first light list of each first tile and the information cached in the plurality of geometry buffers.
CN202211285899.7A 2022-10-20 2022-10-20 Game picture rendering method and device Active CN115526977B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211285899.7A CN115526977B (en) 2022-10-20 2022-10-20 Game picture rendering method and device


Publications (2)

Publication Number Publication Date
CN115526977A true CN115526977A (en) 2022-12-27
CN115526977B CN115526977B (en) 2023-07-21

Family

ID=84703071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211285899.7A Active CN115526977B (en) 2022-10-20 2022-10-20 Game picture rendering method and device

Country Status (1)

Country Link
CN (1) CN115526977B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130328873A1 (en) * 2012-06-08 2013-12-12 Advanced Micro Devices, Inc. Forward rendering pipeline with light culling
CN108236783A (en) * 2018-01-09 2018-07-03 网易(杭州)网络有限公司 The method, apparatus of illumination simulation, terminal device and storage medium in scene of game
CN108564646A (en) * 2018-03-28 2018-09-21 腾讯科技(深圳)有限公司 Rendering intent and device, storage medium, the electronic device of object
CN114782613A (en) * 2022-04-29 2022-07-22 北京字跳网络技术有限公司 Image rendering method, device and equipment and storage medium


Also Published As

Publication number Publication date
CN115526977B (en) 2023-07-21

Similar Documents

Publication Publication Date Title
CN109509138B (en) Reduced acceleration structure for ray tracing system
US10706608B2 (en) Tree traversal with backtracking in constant time
CN109603155B (en) Method and device for acquiring merged map, storage medium, processor and terminal
US7463261B1 (en) Three-dimensional image compositing on a GPU utilizing multiple transformations
EP4242973A1 (en) Image processing method and related apparatus
US10803655B2 (en) Forward rendering pipeline with light culling
US10049486B2 (en) Sparse rasterization
CN113900797B (en) Three-dimensional oblique photography data processing method, device and equipment based on illusion engine
KR101681056B1 (en) Method and Apparatus for Processing Vertex
JP2004103039A (en) Texture treatment and shading method for 3d image
WO2021249091A1 (en) Image processing method and apparatus, computer storage medium, and electronic device
CN112055216B (en) Method and device for rapidly loading mass of oblique photography based on Unity
CN110675480A (en) Method and device for acquiring sampling position of texture operation
CN111754381A (en) Graphics rendering method, apparatus, and computer-readable storage medium
CN114758051A (en) Image rendering method and related equipment thereof
CN112041894B (en) Enhancing realism of a scene involving a water surface during rendering
US9092907B2 (en) Image shader using two-tiered lookup table for implementing style attribute references
CN115526977B (en) Game picture rendering method and device
Rohmer et al. Tiled frustum culling for differential rendering on mobile devices
CN116188552B (en) Region-based depth test method, device, equipment and storage medium
CN116883572B (en) Rendering method, device, equipment and computer readable storage medium
CN116824028B (en) Image coloring method, apparatus, electronic device, storage medium, and program product
CN116993894B (en) Virtual picture generation method, device, equipment, storage medium and program product
CN116630516B (en) 3D characteristic-based 2D rendering ordering method, device, equipment and medium
CN108404412A (en) The light source management system of a kind of rendering engine of playing from generation to generation, devices and methods therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant