WO2022127242A1 - Game image processing method, apparatus, program and readable medium - Google Patents
Game image processing method, apparatus, program and readable medium
- Publication number
- WO2022127242A1 (application PCT/CN2021/119152, priority document CN2021119152W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- reflection
- information
- pixel
- map
- texture
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
Definitions
- the present application relates to the technical field of image processing, and in particular, to a game image processing method, apparatus, program and readable medium.
- the reflection effect is an indispensable part of the game picture.
- it can be used to simulate the mirror-like reflection of the surrounding environment by materials such as metal or glass, or the reflection of the sky and surrounding scenery on a water surface.
- Fake Reflection technology can be used by making the reflection plane transparent and placing mirrored copies of objects directly at the symmetrical positions, so that the reflection result is rendered in a single pass of the main camera.
- a game image processing method comprising: acquiring a reflection plane defined in the game scene; acquiring structural data of the reflection plane; performing reflection calculation according to the structural data, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space, to obtain texture information including the reflection result; and
- performing image rendering of the game scene by using the texture information.
- a game image processing device comprising:
- an acquisition module, used to acquire the reflection plane defined in the game scene;
- the acquisition module is also used to acquire the structural data of the reflection plane;
- a calculation module configured to perform reflection calculation according to the structural data, the projection data of the current camera, the pixel information of the current screen and the depth map information of the current screen space, and obtain the texture information including the reflection result;
- a rendering module configured to perform image rendering of the game scene by using the texture information.
- a computer device/equipment/system comprising a memory, a processor, and a computer program/instruction stored on the memory; when executing the computer program/instruction, the processor implements the steps of the method in the first aspect above.
- a computer-readable medium on which computer programs/instructions are stored, and when the computer programs/instructions are executed by a processor, implement the steps of the method in the first aspect.
- a computer program product comprising computer programs/instructions, when the computer program/instructions are executed by a processor, the steps of the method of the first aspect above are implemented.
- the present application provides a game image processing method, device, program and readable medium.
- the present application proposes a new reflection processing scheme, which performs the reflection calculation according to the structure data of the reflection plane defined in the game scene, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space, obtaining texture information that includes the reflection result;
- the texture information is then used for image rendering of the game scene.
- the present application is not limited to a specific angle at a specific place in the game scene, and there is no need to place additional mirrored objects at positions symmetrical about the reflection plane, which saves the rendering overhead of the main camera and thereby the cost of game image rendering. Because the reflection results of multiple planes can be drawn at one time, the efficiency of game image rendering is improved while the correctness of the reflection results is ensured.
- FIG. 1 shows a schematic flowchart of a game image processing method provided by an embodiment of the present application
- FIG. 2 shows a schematic flowchart of another game image processing method provided by an embodiment of the present application
- FIG. 3 shows a schematic diagram of an example of a reflection calculation effect provided by an embodiment of the present application
- FIG. 4 shows a schematic diagram of an example of a Gaussian blur noise reduction effect provided by an embodiment of the present application
- FIG. 5 shows a schematic diagram of an example of a reflection plane material map provided by an embodiment of the present application
- FIG. 6 shows a schematic diagram of an effect comparison example of using a roughness sampling Mipmap provided by an embodiment of the present application
- FIG. 7 shows a schematic diagram of a comparative example of RT effects using normal disturbance reflection provided by an embodiment of the present application
- FIG. 8 shows a schematic structural diagram of a game image processing apparatus provided by an embodiment of the present application.
- Figure 9 schematically shows a block diagram of a computer device/equipment/system for implementing the method according to the present invention.
- Figure 10 schematically shows a block diagram of a computer program product implementing the method according to the invention.
- This embodiment provides a game image processing method, as shown in FIG. 1 , the method includes:
- Step 101 Acquire a reflection plane defined in the game scene.
- the reflection plane to be defined in the game scene can be determined according to the actual situation of the game scene.
- the water surface, smooth road, glass of buildings, mirrors at the position of the sink, etc. appearing in the game scene can all be defined as reflective planes, which can make the game scene more realistic and improve the game player's game experience.
- the execution subject may be an apparatus or device for game image processing, which may be configured on the client side or the server side.
- any number of reflection planes can be defined in the game scene according to actual requirements, such as one, two or more reflection planes. If there are multiple reflection planes defined in the acquired game scene, the following processing procedures are performed for each reflection plane, that is, the procedures shown in steps 102 to 103 .
- Step 102 Acquire structural data of the reflection plane.
- the structure data of the reflection plane may include: the specular reflection transformation matrix of a vertex relative to the reflection plane, the plane normal of the reflection plane, the position of the bounding box of the reflection plane, the reflection threshold of the reflection plane, the noise intensity, and other detailed data.
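As a concrete illustration, the per-plane structure data listed above could be held in a record like the following Python sketch. The field names and types are our own illustration; the patent does not fix a layout.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ReflectionPlaneData:
    # 4x4 specular reflection transformation matrix of a vertex
    # relative to the reflection plane
    mirror_matrix: List[List[float]]
    plane_normal: Tuple[float, float, float]   # plane normal of the reflection plane
    bounding_box: Tuple[float, ...]            # position of the plane's bounding box
    reflection_threshold: float                # reflection threshold of the plane
    noise_intensity: float                     # noise intensity for normal perturbation
```

An array of such records corresponds to the DS1 array that the later embodiment computes on the CPU and feeds to the compute shader.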
- Step 103 Perform reflection calculation according to the structure data of the reflection plane, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space, and obtain map information including the reflection result.
- the pixel information of the current screen may include relevant information of each pixel point of the current screen.
- the depth map information of the current screen space may include the depth information of the current screen. The depth represents the distance of the pixel from the camera in the 3D world. The greater the depth value of the pixel, the farther the pixel is from the camera.
- the reflection result can be calculated in the forward rendering pipeline, that is, a forward ray is used to find the pixel to which the current pixel will be reflected.
- according to the structure data, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space: first, the world coordinates of each pixel are calculated from the depth map information of the current screen space; then, for each reflection plane, the world coordinate point to which the pixel's world coordinate is mirrored is calculated through the plane reflection; finally, that point is projected back into the current screen space. After the depth test, the color and depth of each pixel that passes the depth test are written into the texture information, yielding the texture information containing the reflection result.
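The step sequence above can be sketched in Python. The mirror step is a point reflection about the plane (equivalent to applying the specular reflection transformation matrix), and the sketch uses a toy one-dimensional "screen" with an orthographic camera in place of the real projection data; all names and the camera model are illustrative assumptions, not the patent's implementation.

```python
def dot(a, b):
    # 3D dot product
    return sum(x * y for x, y in zip(a, b))

def reflect_point(p, plane_point, plane_normal):
    # Mirror world point p about the plane through plane_point with unit
    # normal n: p' = p - 2 * dot(p - plane_point, n) * n.
    d = dot([p[i] - plane_point[i] for i in range(3)], plane_normal)
    return [p[i] - 2.0 * d * plane_normal[i] for i in range(3)]

def sspr_pass(colors, depths, plane_point, plane_normal):
    # Toy SSPR pass over a 1D screen under an orthographic camera looking
    # down -Z: pixel x at depth d sits at world position (x, 0, -d).
    # Each pixel is mirrored about the plane, projected back to the screen,
    # and depth-tested (keep the nearer sample) before its color and depth
    # are written into the output "texture".
    out = {}
    for x, (color, depth) in enumerate(zip(colors, depths)):
        world = [float(x), 0.0, -depth]
        mirrored = reflect_point(world, plane_point, plane_normal)
        sx = int(round(mirrored[0]))                  # back to screen space
        new_depth = -mirrored[2]
        if sx not in out or new_depth < out[sx][1]:   # simple depth test
            out[sx] = (color, new_depth)
    return out
```

For example, mirroring about the vertical plane x = 2 sends screen pixel 0 to screen pixel 4.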
- Step 104 using the calculated texture information to perform image rendering of the game scene.
- GPU Graphics Processing Unit
- this embodiment proposes a new reflection processing solution, which performs the reflection calculation based on the structural data of the reflection plane defined in the game scene, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space to obtain texture information including the reflection result, and then uses the texture information to render the image of the game scene.
- this embodiment is not limited to a specific angle at a specific place in the game scene, and there is no need to place additional mirrored objects at positions symmetrical about the reflection plane, which saves the rendering overhead of the main camera and thereby the cost of game image rendering. Because the reflection results of multiple planes can be drawn at one time, the efficiency of game image rendering is improved while the correctness of the reflection results is ensured.
- this embodiment also provides another game image processing method, as shown in FIG. 2 , the method includes:
- Step 201 Acquire a reflection plane defined in the game scene.
- this embodiment may be implemented based on the Unity game engine.
- PPV Post-Process-Volume
- reflection planes such as water surface, smooth road surface, glass, etc.
- the most important N reflection planes can be selected according to the configuration, and the structure data of these selected reflection planes can be calculated. Specifically, the processes shown in steps 202 to 203 can be performed.
- Step 202 Determine the distance between the reflection plane defined in the game scene and the camera.
- Step 203 Acquire structural data of the reflection plane whose distance from the camera meets the preset distance condition.
- the preset distance condition may be set in advance according to actual needs. For example, the reflection planes can be ordered by their distance from the camera, from near to far.
- the reflection planes at the front of this ordering are closest to the camera, and calculating their reflection results makes the reflection effect prominent and obvious. For the reflection planes further back, because they are far from the camera, computing their reflections would not noticeably improve the rendered image but would still increase the cost of the reflection calculation. Therefore, it is preferable to take the top preset number (the top N) of reflection planes as the reflection planes whose distances meet the preset distance condition, and then calculate the structure data of these reflection planes.
- a certain distance threshold can also be preset, and the reflection planes whose distance from the camera is less than or equal to the preset distance threshold are regarded as reflection planes that meet the preset distance condition, and then the structural data of these reflection planes are calculated.
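Under the stated assumptions (each plane represented by a name and a position, Euclidean distance to the camera), both selection criteria described above can be sketched as:

```python
def select_planes(planes, camera_pos, top_n=None, max_distance=None):
    # Each plane is (name, position). Sort by Euclidean distance to the
    # camera, then keep either the nearest top_n planes or those whose
    # distance is within max_distance (the two criteria described above).
    def dist(plane):
        return sum((a - b) ** 2 for a, b in zip(plane[1], camera_pos)) ** 0.5
    ordered = sorted(planes, key=dist)
    if max_distance is not None:
        ordered = [p for p in ordered if dist(p) <= max_distance]
    if top_n is not None:
        ordered = ordered[:top_n]
    return [name for name, _ in ordered]
```

In a real engine the distance would likely be taken to the plane's bounding box rather than to a single point; the point form keeps the sketch short.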
- CPU Central Processing Unit
- the reflection calculation in this embodiment combines the advantages of two reflection processing technologies, Planar Reflection and Screen-Space Reflection (SSR), and is equivalent to a Screen-Space Planar Reflection (SSPR) calculation.
- SSR: Screen-Space-Reflection
- SSPR: Screen-Space-Planar-Reflection
- the deferred rendering pipeline needs to write data to multiple texture maps in each rendering command, generally called GBuffer0, GBuffer1, and so on.
- a technology called Multi-Render-Target (MRT) needs to be used.
- MRT Multi-Render-Target
- the forward rendering pipeline only needs to use one or a few GBuffers, and the bandwidth overhead is relatively low.
- the SSPR reflection calculation in this embodiment is performed in the forward rendering pipeline. The forward rendering pipeline is cheaper and more widely supported than the deferred rendering pipeline, so the SSPR reflection calculation and image rendering of this embodiment are well suited to mobile terminals.
- Step 204 Perform reflection calculation according to the structure data of the reflection plane, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space, and obtain map information including the reflection result.
- the reflection calculation of the SSPR, that is, the compute shader (ComputeShader, CS) calculation, is performed.
- the input data of this module includes the DS1 array calculated on the CPU in the preceding step 203 (the reflection plane structure data), the pixels Color0 of the current screen (the pixel information of the current screen), the current screen-space depth Depth0 (the depth map information of the current screen space), and the current camera's projection data CameraData; the output is a texture containing the reflection result, named Ans0.
- step 204 may specifically include: calculating the world coordinates of each pixel in the pixel information of the current screen from the depth map information of the current screen space and the projection data of the current camera; applying, with reference to the specular reflection transformation matrix of the vertex relative to the reflection plane in the structural data, the mirror transformation to the pixel's world coordinates in world space to obtain the world coordinates of the reflection result; converting the world coordinates of the reflection result into the current screen space through the projection data of the current camera, so that a depth test can be performed according to the depth map information of the current screen space and the reflection result; and generating the texture information from the pixels that pass the depth test.
- the depth test performed according to the depth map information of the current screen space and the reflection result may specifically include: comparing the depth value of the target pixel in the depth map information of the current screen space with the value of the A channel in the target pixel's RGBA in the reflection result. Correspondingly, generating map information from the pixels that pass the test may specifically include: if the comparison indicates that the depth test succeeds, writing the color value and depth value of the target pixel into the texture information containing the reflection result; if the comparison indicates that the depth test fails, discarding the target pixel.
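A minimal sketch of this depth test and write/discard step, assuming the reflected sample stores its depth in the A channel and that "success" means the sample is not farther than the existing scene depth (the patent states the comparison but not its exact criterion):

```python
def depth_test_write(texture, coord, scene_depth, reflected_rgba, eps=1e-3):
    # Compare the scene depth at coord with the depth stored in the A channel
    # of the reflected RGBA sample. Success here means the reflected sample is
    # not farther than what is already on screen (one plausible criterion).
    # On success, write the color and depth into the output texture; on
    # failure, discard the sample.
    r, g, b, a = reflected_rgba
    if a <= scene_depth + eps:
        texture[coord] = ((r, g, b), a)
        return True
    return False
```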
- the calculation process and steps of the reflection calculation of SSPR include:
- step 204 may further include: in the forward rendering pipeline, using GPU multi-core to perform reflection calculation to obtain texture information.
- the multi-core feature of GPU is used to accelerate the calculation of reflections.
- Step 205 using the texture information including the reflection result to perform image rendering of the game scene.
- step 205 may specifically include: first performing noise reduction on the reflection result in the map information through a Gaussian blur algorithm, and then using the noise-reduced texture information for image rendering of the game scene.
- the noise reduction is performed with an open-source Gaussian blur algorithm, and to improve the result, multiple Gaussian blur passes can be applied to obtain a higher-quality texture. Figure 4 shows the rendering after noise reduction with the Gaussian blur algorithm.
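A single separable 3x3 Gaussian blur pass of the kind that could be applied (possibly several times) to the reflection texture might look like the following sketch; the kernel weights and zero border handling are our own assumptions, since the patent only names "a Gaussian blur algorithm".

```python
def gaussian_blur(img):
    # One separable 3x3 Gaussian blur pass over a grayscale grid, using the
    # per-axis kernel (1/4, 1/2, 1/4) and treating out-of-bounds samples as
    # zero. Running it more than once approximates a wider Gaussian, matching
    # the multi-pass noise reduction described above.
    k = (0.25, 0.5, 0.25)
    h, w = len(img), len(img[0])

    def get(rows, y, x):
        return rows[y][x] if 0 <= y < h and 0 <= x < w else 0.0

    # horizontal pass, then vertical pass
    tmp = [[sum(k[i] * get(img, y, x + i - 1) for i in range(3)) for x in range(w)]
           for y in range(h)]
    return [[sum(k[i] * get(tmp, y + i - 1, x) for i in range(3)) for x in range(w)]
            for y in range(h)]
```

Blurring a single bright pixel spreads its energy into a 3x3 neighborhood while preserving the total, which is what removes the speckle noise in the reflection RT.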
- step 205 may further include: obtaining a target map containing the roughness and normal information of the material corresponding to each pixel of the reflection plane. Correspondingly, step 205 may specifically include: first applying Mipmap processing to the texture information to obtain multiple Mipmap textures of different resolutions; then reading the roughness of each pixel's material from the target map; then determining, according to each pixel's material roughness, the Mipmap texture of the corresponding definition from which that pixel is sampled, where materials of different roughness each have a Mipmap texture of a corresponding definition; and finally performing the image rendering of the game scene using the texture information of the Mipmap textures sampled for the respective pixels.
- the material of the reflection plane can be drawn separately in advance through the RenderFeature of the Universal Render Pipeline (Universal-Renderer-Pipeline, URP, a programmable rendering pipeline in Unity) to obtain the reflection plane's RT
- URP: Universal-Renderer-Pipeline
- RT: a render texture that draw instructions render into directly
- the RT needs to have Mipmaps with different definitions, and different Mipmaps are selected according to roughness.
- the target map containing the roughness and normal information of the reflective plane material is drawn separately in advance.
- a texture map containing the roughness and normal of the material at each of its pixels is rendered, as shown in Figure 5.
- the target map can be read later to get the roughness of the corresponding pixel material.
- Mipmap technology automatically generates, for a given map, a series of textures of progressively lower resolution (that is, lower definition).
- the Mipmap definition level from which each pixel is sampled is determined according to roughness. As shown in Figure 6, the left image is rendered without roughness-based Mipmap sampling and the right image with it; the reflection effect of the right image is clearly better.
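One simple way to realize "definition determined by roughness" is a linear mapping from roughness to Mipmap level, sketched below. The linear choice is an assumption; the patent only requires that rougher materials sample blurrier (lower-definition) levels.

```python
def mip_for_roughness(roughness, mip_count):
    # Map a material roughness in [0, 1] to the Mipmap level to sample:
    # smooth materials (roughness 0) read the sharp level 0, rough materials
    # read the blurriest level mip_count - 1. Clamped for safety.
    level = round(roughness * (mip_count - 1))
    return max(0, min(mip_count - 1, level))
```

A mirror-like floor would thus sample level 0 of the reflection RT, while a rough concrete floor would sample a heavily blurred level, giving the diffuse-looking reflection seen on the right of Figure 6.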
- the normal of the material can be used to perturb the reflection result RT to further improve the reflection effect.
- performing the image rendering of the game scene using the texture information of the Mipmap textures sampled for the respective pixels may specifically include: first reading the normal information of each pixel's material from the target map (drawn separately in advance); perturbing, according to the normal information of each pixel's material, the texture information of the Mipmap textures sampled for the respective pixels; and then rendering the image of the game scene using the perturbed texture information.
- the perturbation of the map information of the Mipmap textures sampled for the respective pixels may specifically include: first calculating the texture map coordinates from the pixel's current screen coordinates; then performing a superposition calculation on the texture map coordinates according to the normal direction and noise intensity of the pixel's material; and finally sampling the map information at the texture map coordinates after the superposition calculation.
- a texture map coordinate (UV) is calculated from the pixel's screen coordinates for sampling; from the direction and strength of the normal, a result in the range (-1, 1) can be obtained simply as DetailUV = Normal.RG * 2 - 1. Superimposing DetailUV onto the UV and then sampling the reflection map yields the perturbed result.
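The DetailUV computation above can be sketched directly; scaling by the noise intensity and clamping the final coordinate into the texture are our own assumptions.

```python
def perturb_uv(uv, normal_rg, noise_intensity):
    # Perturb the reflection-map UV with the material normal:
    # DetailUV = Normal.RG * 2 - 1 remaps the stored (0, 1) channels to
    # (-1, 1); it is scaled by the noise intensity and superimposed on the
    # original UV. The result is clamped to stay inside the texture.
    detail = [c * 2.0 - 1.0 for c in normal_rg]
    return [max(0.0, min(1.0, uv[i] + detail[i] * noise_intensity))
            for i in range(2)]
```

A flat normal stored as (0.5, 0.5) decodes to DetailUV = (0, 0) and leaves the UV untouched, so only bumpy areas of the material actually disturb the reflection.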
- the left picture in Figure 7 shows the RT effect without normal perturbation, and the right picture shows the RT effect with normal perturbation; the reflection effect of the right picture is clearly better.
- the R and G channels of the target map can record the normal information of the pixel's material, and the B channel of the target map can record the roughness of the pixel's material.
- correspondingly, reading the roughness of each pixel's material from the target map may specifically include obtaining it by reading the B channel of the target map, and reading the normal information of each pixel's material may specifically include obtaining it by reading the R and G channels of the target map.
- the R and G channels record the normal direction (such as X, Y direction vectors), and the B channel records the roughness.
- the roughness and normal information of the reflective plane material can be accurately obtained, which is convenient for further superposition optimization of the reflection effect.
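The channel layout described above (normal in R and G, roughness in B) can be sketched as an encode/decode pair, with the normal remapped between (-1, 1) and (0, 1) so it fits a color channel, matching the DetailUV = Normal.RG * 2 - 1 remap used when sampling; the exact encoding is our illustration.

```python
def encode_material(normal_xy, roughness):
    # Pack a material into an RGB texel: R and G store the normal's X and Y
    # remapped from (-1, 1) to (0, 1), and B stores the roughness.
    r = normal_xy[0] * 0.5 + 0.5
    g = normal_xy[1] * 0.5 + 0.5
    return (r, g, roughness)

def decode_material(rgb):
    # Inverse of encode_material: recover (normal_xy, roughness) from a texel.
    r, g, b = rgb
    return ((r * 2.0 - 1.0, g * 2.0 - 1.0), b)
```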
- the method of this embodiment may further include: performing frame-by-frame image rendering on the game scene, so that each frame of image performs reflection calculation, or image noise reduction processing, or image Overlay optimization processing.
- one SSPR rendering can be divided into three parts, the first part is the reflection calculation as shown in Figure 3, and the second part is the image noise reduction, which is the Gaussian blur as shown in Figure 4.
- the third part is image overlay optimization, that is, the image overlay optimization process shown in Figure 6 and Figure 7.
- a counter may be set so that only one of the above three parts is executed in each frame, cycling through them; by reducing the amount of calculation per frame, the overhead is optimized.
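The per-frame counter described above amounts to a round-robin over the three parts; a sketch (the stage names are illustrative):

```python
def stage_for_frame(frame_index):
    # Round-robin over the three SSPR parts, one per frame: reflection
    # calculation, Gaussian-blur noise reduction, and overlay optimization
    # (roughness/normal). Spreading the work this way cuts per-frame cost
    # to one third at the price of the reflection updating over 3 frames.
    stages = ("reflection", "noise_reduction", "overlay_optimization")
    return stages[frame_index % len(stages)]
```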
- the forward rendering pipeline is used instead of the deferred rendering pipeline, and reflection results from multiple planes need to be rendered efficiently at the same time; on the premise of solving these problems in the prior art, efficiency should be improved as much as possible.
- This embodiment combines the advantages of two reflection processing technologies, Planar Reflection and SSR, and is equivalent to the reflection calculation of SSPR. Specifically in the forward rendering pipeline, the reflection can be calculated through the screen space.
- SSPR uses a forward ray to find the pixel to which the current pixel will be reflected
- SSR uses a reverse ray to find which pixel is reflected to the current pixel
- the multi-core feature of GPU is used to speed up the calculation of reflection and improve the efficiency of reflection calculation.
- a higher-quality Mipmap is obtained, and through the RenderFeature of URP, the material of the reflection plane is drawn separately in advance to obtain the roughness and material normal of the reflection plane; the reflection result is then perturbed with the normal to make the reflection look better.
- the overhead of the reflection calculation can be optimized by spreading it across frames.
- this embodiment provides a game image processing apparatus.
- the apparatus includes: an acquisition module 31 , a calculation module 32 , and a rendering module 33 .
- an acquisition module 31 used for acquiring the reflection plane defined in the game scene
- the acquisition module 31 is also used to acquire the structural data of the reflection plane
- the calculation module 32 is configured to perform reflection calculation according to the structural data, the projection data of the current camera, the pixel information of the current screen and the depth map information of the current screen space, and obtain the texture information including the reflection result;
- the rendering module 33 is configured to perform image rendering of the game scene by using the texture information.
- the structural data includes: a specular reflection transformation matrix of the vertex relative to the reflection plane; correspondingly, the calculation module 32 is specifically configured to pass the depth map information of the current screen space and The projection data of the current camera is used to calculate the world coordinate of each pixel in the pixel information of the current screen; with reference to the specular reflection transformation matrix, the world coordinate of the pixel is changed in the world space to obtain a reflection result The world coordinates; through the projection data of the current camera, the world coordinates of the reflection result are converted into the current screen space, so as to perform a depth test according to the depth map information of the current screen space and the reflection result; The texture information is generated according to the pixels whose depth test is successful.
- the calculation module 32 is further configured to compare the depth value of the target pixel in the depth map information of the current screen space with the A channel of the target pixel in the reflection result RGBA value to compare;
- the calculation module 32 is further configured to write the color value and depth value of the target pixel into the map information including the reflection result if the depth test is determined to be successful according to the comparison result; if the depth test is determined according to the comparison result If it fails, the target pixel is discarded.
- the rendering module 33 is specifically configured to perform noise reduction on the reflection result in the texture information through a Gaussian blur algorithm, and to perform image rendering of the game scene using the noise-reduced texture information.
- the acquiring module 31 is further configured to acquire roughness and normal information including the material of each pixel corresponding to the reflection plane before the image rendering of the game scene is performed by using the texture information the target map;
- the rendering module 33 is further configured to perform Mipmap processing on the texture information to obtain multiple Mipmap textures of different definitions; read the roughness of each pixel's material from the target map; determine, according to each pixel's material roughness, the Mipmap texture of the corresponding definition sampled for that pixel, wherein materials of different roughness each have a Mipmap texture of a corresponding definition; and perform the image rendering of the game scene using the texture information of the Mipmap textures sampled for the respective pixels.
- the rendering module 33 is further configured to read the normal information of each pixel's material from the target map, perturb the texture information of the Mipmap textures sampled for the respective pixels according to that normal information, and perform the image rendering of the game scene using the perturbed texture information.
- the rendering module 33 is further configured to calculate the texture map coordinates according to the current screen coordinates of the pixel points; and superimpose on the texture map coordinates according to the normal direction and noise intensity of the pixel point material Calculation; sampling map information based on the texture map coordinates after overlay calculation.
- the R and G channels of the target map record the normal information of the pixel material, and the B channel of the target map records the roughness of the pixel material;
- the rendering module 33 is further configured to obtain the roughness of each pixel material by reading the B channel of the target map;
- the rendering module 33 is further configured to acquire the normal information of each pixel point material by reading the R and G channels of the target texture.
- the rendering module 33 is further configured to perform frame-by-frame image rendering on the game scene, so that each frame of image performs reflection calculation, image noise reduction processing, or image overlay optimization processing.
- the calculation module 32 is further configured to obtain the texture information by performing reflection calculation using multiple GPU cores in the forward rendering pipeline.
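The geometric core of the reflection calculation (mirroring a reconstructed world-space position across the reflection plane, per claim 2) can be sketched as below. This is a minimal sketch assuming the plane is given as a unit normal plus a point on the plane, and it deliberately omits the depth-map reconstruction and reprojection matrices the patent describes:

```python
def reflect_point_across_plane(point, plane_normal, plane_point):
    """Mirror a world-space point across a plane given by a unit normal
    n and a point p0 on the plane: p' = p - 2 * dot(p - p0, n) * n."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    return tuple(p - 2.0 * d * n for p, n in zip(point, plane_normal))
```

In the full pipeline each mirrored position would then be projected back into screen space and depth-tested before being written to the reflection map.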
- the acquiring module 31 is further configured to determine the distance between the reflection plane defined in the game scene and the camera, and to acquire the structure data of a reflection plane whose distance from the camera meets a preset distance condition.
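The distance condition above can be sketched as a simple cull. The dict layout, the Euclidean point-to-point distance, and the "at most a threshold" condition are illustrative assumptions; the source only requires that the distance meet a preset condition:

```python
def planes_within_distance(planes, camera_pos, max_distance):
    """Keep only the reflection planes close enough to the camera, so
    distant planes are skipped by the reflection calculation."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [p for p in planes if dist(p["position"], camera_pos) <= max_distance]
```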
- Various component embodiments of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof.
- in practice, a microprocessor or a digital signal processor (DSP) may be used to implement some or all of the functions of some or all components of the game image processing apparatus according to the embodiments of the present invention.
- the present invention can also be implemented as programs/instructions for an apparatus or device (e.g., computer programs/instructions and computer program products) for performing some or all of the methods described herein.
- Such programs/instructions implementing the present invention may be stored on a computer-readable medium, or may exist in the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
- Computer-readable media include persistent and non-persistent, removable and non-removable media, and may implement information storage by any method or technology.
- Information may be computer-readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cartridges, magnetic disk storage, quantum memory, graphene-based storage media, or any other non-transmission media that can be used to store information accessible by computing devices.
- FIG. 9 schematically shows a computer device/apparatus/system that can implement the game image processing method according to the present invention; the computer device/apparatus/system includes a processor 410 and a computer-readable medium in the form of a memory 420.
- Memory 420 is an example of a computer-readable medium having storage space 430 for storing computer programs/instructions 431 .
- when the computer program/instructions 431 are executed by the processor 410, the various steps of the game image processing method described above can be implemented.
- Figure 10 schematically shows a block diagram of a computer program product implementing the method according to the invention.
- the computer program product includes a computer program/instructions 510 which, when executed by a processor such as the processor 410 shown in FIG. 9, can implement the various steps of the game image processing method described above.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Multimedia (AREA)
- Image Generation (AREA)
Abstract
Description
Claims (15)
- A game image processing method, comprising: acquiring a reflection plane defined in a game scene; acquiring structure data of the reflection plane; performing reflection calculation according to the structure data, projection data of a current camera, pixel information of a current screen, and depth map information of a current screen space, to obtain texture information containing a reflection result; and rendering an image of the game scene by using the texture information.
- The method according to claim 1, wherein the structure data comprises a specular reflection transformation matrix of vertices relative to the reflection plane, and performing the reflection calculation according to the structure data, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space to obtain the texture information containing the reflection result specifically comprises: calculating, from the depth map information of the current screen space and the projection data of the current camera, the world coordinates of each pixel in the pixel information of the current screen; transforming the world coordinates of the pixel in world space with reference to the specular reflection transformation matrix to obtain world coordinates of the reflection result; converting the world coordinates of the reflection result into the current screen space by using the projection data of the current camera, so as to perform a depth test according to the depth map information of the current screen space and the reflection result; and generating the texture information from the pixels that pass the depth test.
- The method according to claim 2, wherein performing the depth test according to the depth map information of the current screen space and the reflection result specifically comprises: comparing the depth value of a target pixel in the depth map information of the current screen space with the value of the A channel of the reflection result RGBA for the target pixel; and generating the texture information from the pixels that pass the test specifically comprises: if the depth test is judged successful from the comparison result, writing the color value and depth value of the target pixel into the texture information containing the reflection result; and if the depth test is judged to have failed from the comparison result, discarding the target pixel.
- The method according to claim 1, wherein rendering the image of the game scene by using the texture information specifically comprises: performing noise reduction on the reflection result in the texture information by using a Gaussian blur algorithm; and rendering the image of the game scene by using the noise-reduced texture information.
- The method according to claim 1, wherein before rendering the image of the game scene by using the texture information, the method further comprises: acquiring a target map containing the roughness and normal information of the material of each pixel corresponding to the reflection plane; and rendering the image of the game scene by using the texture information specifically comprises: performing Mipmap processing on the texture information to obtain multiple Mipmap maps of different resolutions; reading the roughness of the material of each pixel in the target map; determining, according to the roughness of the material of each pixel, the Mipmap map of the corresponding resolution to be sampled by each pixel, wherein materials of different roughness each have a Mipmap map of a corresponding resolution; and rendering the image of the game scene by using texture information containing the Mipmap maps sampled by the respective pixels.
- The method according to claim 5, wherein rendering the image of the game scene by using the texture information containing the Mipmap maps sampled by the respective pixels specifically comprises: reading the normal information of the material of each pixel in the target map; perturbing, according to the normal information of the material of each pixel, the texture information containing the Mipmap maps sampled by the respective pixels; and rendering the image of the game scene by using the perturbed texture information.
- The method according to claim 6, wherein perturbing the texture information containing the Mipmap maps sampled by the respective pixels according to the normal information of the material of each pixel specifically comprises: calculating texture map coordinates according to the current screen coordinates of a pixel; performing superposition calculation on the texture map coordinates according to the normal direction and noise intensity of the pixel's material; and sampling the texture information based on the texture map coordinates after the superposition calculation.
- The method according to claim 6, wherein the R and G channels of the target map record the normal information of the pixel's material, and the B channel of the target map records the roughness of the pixel's material; reading the roughness of the material of each pixel in the target map specifically comprises: acquiring the roughness of the material of each pixel by reading the B channel of the target map; and reading the normal information of the material of each pixel in the target map specifically comprises: acquiring the normal information of the material of each pixel by reading the R and G channels of the target map.
- The method according to claim 1, further comprising: splitting the image rendering of the game scene across frames, so that each frame of image performs reflection calculation, image noise reduction processing, or image superposition optimization processing.
- The method according to claim 1, wherein performing the reflection calculation according to the structure data, the projection data of the current camera, the pixel information of the current screen, and the depth map information of the current screen space to obtain the texture information containing the reflection result specifically comprises: performing, in a forward rendering pipeline, the reflection calculation by using multiple GPU cores to obtain the texture information.
- The method according to claim 1, wherein acquiring the structure data of the reflection plane specifically comprises: determining the distance between the reflection plane defined in the game scene and a camera; and acquiring the structure data of a reflection plane whose distance from the camera meets a preset distance condition.
- A game image processing apparatus, comprising: an acquiring module configured to acquire a reflection plane defined in a game scene, and further configured to acquire structure data of the reflection plane; a calculation module configured to perform reflection calculation according to the structure data, projection data of a current camera, pixel information of a current screen, and depth map information of a current screen space, to obtain texture information containing a reflection result; and a rendering module configured to render an image of the game scene by using the texture information.
- A computer device/apparatus/system, comprising a memory, a processor, and a computer program/instructions stored in the memory, wherein the processor, when executing the computer program/instructions, implements the steps of the game image processing method according to any one of claims 1-11.
- A computer-readable medium having a computer program/instructions stored thereon, wherein the computer program/instructions, when executed by a processor, implement the steps of the game image processing method according to any one of claims 1-11.
- A computer program product, comprising a computer program/instructions, wherein the computer program/instructions, when executed by a processor, implement the steps of the game image processing method according to any one of claims 1-11.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011499787.2 | 2020-12-18 | ||
CN202011499787.2A CN112233216B (zh) | 2020-12-18 | 2020-12-18 | 游戏图像处理方法、装置及电子设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022127242A1 true WO2022127242A1 (zh) | 2022-06-23 |
Family
ID=74124910
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/119152 WO2022127242A1 (zh) | 2020-12-18 | 2021-09-17 | 游戏图像处理方法、装置、程序和可读介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112233216B (zh) |
WO (1) | WO2022127242A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115797226A (zh) * | 2023-01-09 | 2023-03-14 | 腾讯科技(深圳)有限公司 | 图像降噪方法、装置、计算机设备和存储介质 |
CN116630486A (zh) * | 2023-07-19 | 2023-08-22 | 山东锋士信息技术有限公司 | 一种基于Unity3D渲染的半自动化动画制作方法 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112233216B (zh) * | 2020-12-18 | 2021-03-02 | 成都完美时空网络技术有限公司 | 游戏图像处理方法、装置及电子设备 |
CN112973121B (zh) * | 2021-04-30 | 2021-07-20 | 成都完美时空网络技术有限公司 | 反射效果生成方法及装置、存储介质、计算机设备 |
CN113283543B (zh) * | 2021-06-24 | 2022-04-15 | 北京优锘科技有限公司 | 一种基于WebGL的图像投影融合方法、装置、存储介质和设备 |
CN113570696B (zh) * | 2021-09-23 | 2022-01-11 | 深圳易帆互动科技有限公司 | 动态模型的镜像图像处理方法、装置及可读存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150146971A1 (en) * | 2013-11-27 | 2015-05-28 | Autodesk, Inc. | Mesh reconstruction from heterogeneous sources of data |
CN106056661A (zh) * | 2016-05-31 | 2016-10-26 | 钱进 | 基于Direct3D 11的三维图形渲染引擎 |
CN112233216A (zh) * | 2020-12-18 | 2021-01-15 | 成都完美时空网络技术有限公司 | 游戏图像处理方法、装置及电子设备 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7768523B2 (en) * | 2006-03-09 | 2010-08-03 | Microsoft Corporation | Shading using texture space lighting and non-linearly optimized MIP-maps |
CN103544731B (zh) * | 2013-09-30 | 2016-08-17 | 北京航空航天大学 | 一种基于多相机的快速反射绘制方法 |
CN104463944B (zh) * | 2014-07-10 | 2017-09-29 | 无锡梵天信息技术股份有限公司 | 一种基于物理的高光计算方法 |
CN104240286A (zh) * | 2014-09-03 | 2014-12-24 | 无锡梵天信息技术股份有限公司 | 基于屏幕空间的实时反射方法 |
CN105261059B (zh) * | 2015-09-18 | 2017-12-12 | 浙江大学 | 一种基于在屏幕空间计算间接反射高光的渲染方法 |
WO2017168038A1 (en) * | 2016-03-31 | 2017-10-05 | Umbra Software Oy | Virtual reality streaming |
CN109064533B (zh) * | 2018-07-05 | 2023-04-07 | 奥比中光科技集团股份有限公司 | 一种3d漫游方法及系统 |
CN111768473B (zh) * | 2020-06-28 | 2024-03-22 | 完美世界(北京)软件科技发展有限公司 | 图像渲染方法、装置及设备 |
-
2020
- 2020-12-18 CN CN202011499787.2A patent/CN112233216B/zh active Active
-
2021
- 2021-09-17 WO PCT/CN2021/119152 patent/WO2022127242A1/zh active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150146971A1 (en) * | 2013-11-27 | 2015-05-28 | Autodesk, Inc. | Mesh reconstruction from heterogeneous sources of data |
CN106056661A (zh) * | 2016-05-31 | 2016-10-26 | 钱进 | 基于Direct3D 11的三维图形渲染引擎 |
CN112233216A (zh) * | 2020-12-18 | 2021-01-15 | 成都完美时空网络技术有限公司 | 游戏图像处理方法、装置及电子设备 |
Non-Patent Citations (2)
Title |
---|
DEMON COUSIN: "Screen Space Planar Reflection (SSPR) trips for Unity URP mobile platforms", 3 August 2020 (2020-08-03), pages 1 - 3, XP055944446, Retrieved from the Internet <URL:https://zhuanlan.zhihu.com/p/150890059> * |
TIANSHU SOUL SEAT_XUEFENG: "The self-study HLSL road of the URP pipeline 35th SSPR screen space plane reflection", 25 August 2020 (2020-08-25), pages 1 - 16, XP055944450, Retrieved from the Internet <URL:https://www.bilibili.com/read/cv7332339/> * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115797226A (zh) * | 2023-01-09 | 2023-03-14 | 腾讯科技(深圳)有限公司 | 图像降噪方法、装置、计算机设备和存储介质 |
CN115797226B (zh) * | 2023-01-09 | 2023-04-25 | 腾讯科技(深圳)有限公司 | 图像降噪方法、装置、计算机设备和存储介质 |
CN116630486A (zh) * | 2023-07-19 | 2023-08-22 | 山东锋士信息技术有限公司 | 一种基于Unity3D渲染的半自动化动画制作方法 |
CN116630486B (zh) * | 2023-07-19 | 2023-11-07 | 山东锋士信息技术有限公司 | 一种基于Unity3D渲染的半自动化动画制作方法 |
Also Published As
Publication number | Publication date |
---|---|
CN112233216B (zh) | 2021-03-02 |
CN112233216A (zh) | 2021-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022127242A1 (zh) | 游戏图像处理方法、装置、程序和可读介质 | |
US9569885B2 (en) | Technique for pre-computing ambient obscurance | |
KR101082215B1 (ko) | 하이브리드 광선 추적 시스템을 위한 프래그먼트 쉐이더 및 동작 방법 | |
US9342919B2 (en) | Image rendering apparatus and method for preventing pipeline stall using a buffer memory unit and a processor | |
CN111127623B (zh) | 模型的渲染方法、装置、存储介质及终端 | |
US7924281B2 (en) | System and method for determining illumination of a pixel by shadow planes | |
US7551178B2 (en) | Apparatuses and methods for processing graphics and computer readable mediums storing the methods | |
US9501860B2 (en) | Sparse rasterization | |
WO2022057598A1 (zh) | 图像渲染的方法和装置 | |
US20110141112A1 (en) | Image processing techniques | |
US20080143715A1 (en) | Image Based Rendering | |
US11954830B2 (en) | High dynamic range support for legacy applications | |
US8854392B2 (en) | Circular scratch shader | |
US20070097118A1 (en) | Apparatus and method for a frustum culling algorithm suitable for hardware implementation | |
US20210012562A1 (en) | Probe-based dynamic global illumination | |
KR20180023856A (ko) | 그래픽 처리 시스템 및 그래픽 프로세서 | |
KR20230073222A (ko) | 깊이 버퍼 프리-패스 | |
KR102477265B1 (ko) | 그래픽스 프로세싱 장치 및 그래픽스 파이프라인의 텍스쳐링을 위한 LOD(level of detail)를 결정하는 방법 | |
TWI417808B (zh) | 可重建幾何陰影圖的方法 | |
WO2022193941A1 (zh) | 图像渲染方法、装置、设备、介质和计算机程序产品 | |
CN113129420B (zh) | 一种基于深度缓冲加速的光线追踪渲染方法 | |
US20240177394A1 (en) | Motion vector optimization for multiple refractive and reflective interfaces | |
EP1227443A2 (en) | System and method for fast and smooth rendering of lit, textured spheres | |
US11875478B2 (en) | Dynamic image smoothing based on network conditions | |
US10212406B2 (en) | Image generation of a three-dimensional scene using multiple focal lengths |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21905174 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21905174 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18.01.2024) |