CN114882164A - Game image processing method and device, storage medium and computer equipment - Google Patents

Game image processing method and device, storage medium and computer equipment

Info

Publication number
CN114882164A
CN114882164A
Authority
CN
China
Prior art keywords
rendering
space
data
interface
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210542191.9A
Other languages
Chinese (zh)
Inventor
Zhang Meng (张猛)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Perfect Time And Space Software Co., Ltd.
Original Assignee
Shanghai Perfect Time And Space Software Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Perfect Time And Space Software Co., Ltd.
Priority to CN202210542191.9A
Publication of CN114882164A
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/10: Geometric effects
    • G06T15/20: Perspective computation
    • G06T15/205: Image-based rendering
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10: File systems; File servers
    • G06F16/11: File system administration, e.g. details of archiving or snapshots
    • G06F16/116: Details of conversion of file system types or formats
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10: File systems; File servers
    • G06F16/16: File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F16/168: Details of user interfaces specifically adapted to file systems, e.g. browsing and visualisation, 2d or 3d GUIs
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G06T1/0007: Image acquisition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/66: Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a game image processing method and apparatus, a storage medium, and computer equipment. The method includes: acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene; writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, and rendering it through the pipeline's three-dimensional rendering channel to obtain three-dimensional image cache data in linear space; converting the color space of the three-dimensional image cache data into gamma space through a first conversion channel, placing the converted cache data together with the interface UI material data into a UI cache of the target rendering pipeline, and performing UI rendering on both through the pipeline's UI rendering channel; and converting the color space of the UI rendering result back into linear space through a second conversion channel.

Description

Game image processing method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a game image processing method and apparatus, a storage medium, and a computer device.
Background
Since the Unity engine added support for the linear color space, the light and shadow gradations of Unity's 3D rendering have become more accurate and refined, but UI rendering under the linear color space has become worse. Most UI assets are created in the sRGB color space (a gamma color space) in software such as Photoshop (PS), and common UI picture formats likewise assume sRGB. As a result, the alpha channel of an sRGB-based UI picture cannot be blended correctly in the engine's linear color space, and the opacity of the UI picture is not presented correctly.
To cope with alpha blending under the linear color space, some teams restrict the use of semi-transparent assets, sacrificing transparency precision in UI materials; others tweak the pictures in Photoshop again and again until they look acceptable, which is cumbersome and inefficient.
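The mismatch described above can be reproduced with a few lines of arithmetic. The following sketch (an illustration, not code from the patent) blends a 50%-opaque white UI pixel over a black background, once directly on gamma (sRGB) values, as the asset author intended, and once in linear space, as a linear-pipeline engine would do:

```python
def srgb_to_linear(c: float) -> float:
    """sRGB decode (per channel, 0..1), per the IEC 61966-2-1 piecewise curve."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """sRGB encode (per channel, 0..1), inverse of srgb_to_linear."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(fg: float, bg: float, alpha: float) -> float:
    """Standard 'over' alpha blending for one channel."""
    return fg * alpha + bg * (1.0 - alpha)

fg, bg, alpha = 1.0, 0.0, 0.5  # 50%-opaque white UI pixel over black

# What the artist saw in Photoshop: blending happens directly on sRGB values.
gamma_result = blend(fg, bg, alpha)

# What a linear-space engine produces: decode, blend, re-encode for display.
linear_result = linear_to_srgb(blend(srgb_to_linear(fg), srgb_to_linear(bg), alpha))

print(gamma_result, linear_result)  # 0.5 vs roughly 0.735
```

The artist expects mid-gray (0.5 in sRGB), but linear-space blending yields about 0.735, so the semi-transparent UI element looks noticeably brighter and more opaque than authored, which is the color deviation this application addresses.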
Disclosure of Invention
In view of the above, the present application provides a game image processing method and apparatus, a storage medium, and a computer device, which solve the problem of color deviation after a UI picture enters the game development engine for rendering, without changing the existing UI picture production workflow, and improve the game's visual quality.
According to an aspect of the present application, there is provided a game image processing method including:
acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, and rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline to obtain three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel in a general rendering pipeline;
converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, putting the three-dimensional image cache data and the interface UI material data in the gamma space into a UI cache of the target rendering pipeline, and carrying out UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through the UI rendering channel of the target rendering pipeline; and converting the color space of the UI rendering result into a linear space through the second conversion channel.
Optionally, the obtaining of the three-dimensional scene resource data and the interface UI material data corresponding to the target game scene specifically includes:
traversing a three-dimensional scene object in the target game scene, and acquiring the three-dimensional scene resource data corresponding to the three-dimensional scene object;
and traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Optionally, before traversing the three-dimensional scene object in the target game scene, the method further includes:
identifying whether the target game scene contains a three-dimensional scene object;
if yes, executing the traversal of the three-dimensional scene object in the target game scene;
and if not, traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Optionally, the placing the three-dimensional image cache data in the gamma space and the interface UI material data into the UI cache of the target rendering pipeline specifically includes:
converting the color space of the interface UI material data into a linear space, carrying out reverse correction on the interface UI material data in the linear space, and sampling the interface UI material data subjected to the reverse correction so as to convert the sampled interface UI material data back to a gamma space;
and putting the three-dimensional image cache data in the gamma space and the sampled interface UI material data into a UI cache.
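The round trip in the claim above can be sketched numerically. Assuming that the "reverse correction" is the sRGB encode (the inverse of the engine's import-time conversion to linear space) and that the reverse-corrected texture is then sampled without any hardware sRGB decoding, the sampled value equals the originally authored gamma-space texel:

```python
def srgb_to_linear(c: float) -> float:
    """sRGB decode: what the engine does when importing a gamma-space texture."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """sRGB encode: modeled here as the 'reverse correction'."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

ui_texel_gamma = 0.42                          # UI texel as authored in gamma space
in_linear = srgb_to_linear(ui_texel_gamma)     # engine-side conversion to linear space
reverse_corrected = linear_to_srgb(in_linear)  # reverse correction before sampling
sampled = reverse_corrected                    # a plain sampler reads the value as-is
```

Because the encode is the exact inverse of the decode, the two conversions cancel and the UI material reaches the UI cache with the same numeric values the designer authored, ready to be blended in gamma space.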
Optionally, before the obtaining of the three-dimensional scene resource data and the interface UI material data corresponding to the target game scene, the method further includes:
determining a generic rendering pipeline in a game engine;
adding a first conversion channel for converting the color space from linear space to gamma space after the three-dimensional rendering channel of the generic rendering pipeline, and adding a second conversion channel for converting the color space from gamma space to linear space after the UI rendering channel of the generic rendering pipeline, to obtain the target rendering pipeline.
Optionally, the method further comprises:
when a new camera instruction for the target rendering pipeline is received, identifying whether a new camera indicated by the new camera instruction is a UI camera;
if the newly-built camera is a UI camera, adding the newly-built camera at the tail of a camera stack of the target rendering pipeline;
and if the newly-built camera is not the UI camera, inserting the newly-built camera in front of the UI camera in the camera stack of the target rendering pipeline.
Optionally, after the converting the color space of the UI rendering result into the linear space through the second conversion channel, the method further includes:
and displaying a game image containing the three-dimensional scene resources and the interface UI materials based on the UI rendering result in the linear space.
Optionally, the three-dimensional image cache data in the linear space obtained through the three-dimensional rendering channel is in a first data format; the three-dimensional image cache data in the gamma space obtained through the first conversion channel is in a second data format; and the UI rendering result in the linear space obtained through the second conversion channel is in a third data format.
According to another aspect of the present application, there is provided a game image processing apparatus including:
the data acquisition module is used for acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
the first rendering module is used for writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline, and obtaining the three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel into a general rendering pipeline;
the second rendering module is used for converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, putting the three-dimensional image cache data and the interface UI material data in the gamma space into a UI cache of the target rendering pipeline, and performing UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through the UI rendering channel of the target rendering pipeline; and converting the color space of the UI rendering result into a linear space through the second conversion channel.
Optionally, the data obtaining module is specifically configured to:
traversing a three-dimensional scene object in the target game scene, and acquiring the three-dimensional scene resource data corresponding to the three-dimensional scene object;
and traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Optionally, the data obtaining module is further configured to:
before traversing the three-dimensional scene object in the target game scene, identifying whether the target game scene contains the three-dimensional scene object;
if yes, executing the traversal of the three-dimensional scene object in the target game scene;
and if not, traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Optionally, the first rendering module is further configured to:
converting the color space of the interface UI material data into a linear space, carrying out reverse correction on the interface UI material data in the linear space, and sampling the interface UI material data subjected to the reverse correction so as to convert the sampled interface UI material data back to a gamma space;
and putting the three-dimensional image cache data in the gamma space and the sampled interface UI material data into a UI cache.
Optionally, the apparatus further comprises: a pipeline retrofit module for:
before acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, determining a universal rendering pipeline in a game engine;
adding a first conversion channel for converting a color space from a linear space to a gamma space after a three-dimensional rendering channel of the generic rendering pipeline, and adding a second conversion channel for converting a color space from a gamma space to a linear space after a UI rendering channel of the generic rendering pipeline to obtain the target rendering pipeline.
Optionally, the pipeline retrofit module is further configured to:
when a new camera instruction for the target rendering pipeline is received, identifying whether a new camera indicated by the new camera instruction is a UI camera;
if the newly-built camera is a UI camera, adding the newly-built camera at the tail of a camera stack of the target rendering pipeline;
and if the newly-built camera is not the UI camera, inserting the newly-built camera in front of the UI camera in the camera stack of the target rendering pipeline.
Optionally, the apparatus further comprises:
and the display module is used for displaying the game image containing the three-dimensional scene resource and the interface UI material based on the UI rendering result in the linear space after the color space of the UI rendering result is converted into the linear space through the second conversion channel.
Optionally, the three-dimensional image cache data in the linear space obtained through the three-dimensional rendering channel is in a first data format; the three-dimensional image cache data in the gamma space obtained through the first conversion channel is in a second data format; and the UI rendering result in the linear space obtained through the second conversion channel is in a third data format.
According to still another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the game image processing method described above.
According to yet another aspect of the present application, there is provided a computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the game image processing method described above when executing the program.
By means of the above technical scheme, the game image processing method and apparatus, the storage medium, and the computer device obtain the three-dimensional scene resource data and the interface UI material data and invoke the pre-configured target rendering pipeline. The three-dimensional scene resource data is 3D-rendered through the default three-dimensional rendering channel of the target rendering pipeline; the color space of the 3D rendering result is converted into gamma space through the first conversion channel; the 3D rendering result and the UI materials are then UI-rendered through the default UI rendering channel in gamma space, so that alpha blending of the three-dimensional scene resources and the interface UI materials is performed correctly; and finally the color space of the UI rendering result is converted back into linear space through the second conversion channel for output. The method requires no change to the existing UI picture production workflow and no manual adjustment of the color space of UI materials, improves image production efficiency while avoiding loss of color precision, solves the problem of color deviation after a UI picture enters the game development engine for rendering, and improves the game's visual quality.
The foregoing description is only an overview of the technical solutions of the present application, and the present application can be implemented according to the content of the description in order to make the technical means of the present application more clearly understood, and the following detailed description of the present application is given in order to make the above and other objects, features, and advantages of the present application more clearly understandable.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flowchart illustrating a game image processing method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for generating a target rendering pipeline according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a target rendering pipeline provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of creating a new camera according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating another game image processing method according to an embodiment of the present application;
FIG. 6 is a flowchart illustrating another method for generating a target rendering pipeline according to an embodiment of the present application;
FIG. 7 is a diagram illustrating a comparison of rendering effects provided by an embodiment of the present application;
FIG. 8 is a schematic structural diagram illustrating a game image processing apparatus according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In the present embodiment, there is provided a game image processing method, as shown in fig. 1, the method including:
Step 101, acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
Step 102, writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, and rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline to obtain three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel into a general rendering pipeline;
Step 103, converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, putting the three-dimensional image cache data and the interface UI material data in the gamma space into a UI cache of the target rendering pipeline, and performing UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through the UI rendering channel of the target rendering pipeline;
Step 104, converting the color space of the UI rendering result into a linear space through the second conversion channel.
According to this embodiment of the application, without changing the UI material production workflow, the color deviation that arises when a three-dimensional scene image rendered in linear space is blended with an interface UI image authored in gamma space is eliminated, the precision of the UI image is preserved, and the image quality is improved.
In the above embodiment, first, the three-dimensional scene resource data of any target game scene in the game and the corresponding pre-made interface UI material data are obtained. The three-dimensional scene resource data may be produced by game developers in linear space through the game engine's scene authoring, while the interface UI material data may be produced in gamma space through image-processing software such as Photoshop. Next, the data is fed into a pre-constructed target rendering pipeline, which is obtained by adding two color-space conversion channels to a general rendering pipeline. The three-dimensional scene resource data is rendered in the pipeline's three-dimensional image cache through the default three-dimensional rendering channel (such as a Draw Object Pass); the rendered three-dimensional image cache data is then captured, its color space is converted from linear space to gamma space through the first conversion channel added to the pipeline in advance, and it is transferred to a UI cache in gamma space, into which the interface UI material data (already in gamma space) is also placed for subsequent UI rendering.
Finally, the three-dimensional image cache data and the interface UI material data placed in the UI cache are UI-rendered through the default UI rendering channel of the target rendering pipeline. Because this UI rendering alpha-blends the three-dimensional scene cache data and the UI materials in gamma space, correct transparency is guaranteed. The final rendering result is then converted back to linear space through the second conversion channel added to the pipeline in advance before image output, ensuring that the RGB colors of the final output image are the correct linear-space colors. Since the alpha blending of the UI pictures is completed in gamma space, no incorrect alpha blending results occur, and 3D rendering in linear space is made compatible with UI rendering in gamma space.
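The four stages just described can be modeled per pixel as pure functions, under the assumption that the two conversion channels apply the standard sRGB encode and decode (this is a schematic of the pipeline's data flow, not the patent's shader code):

```python
def srgb_to_linear(c: float) -> float:
    """sRGB decode (per channel, 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """sRGB encode (per channel, 0..1)."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(fg: float, bg: float, alpha: float) -> float:
    """'Over' alpha blending for one channel."""
    return fg * alpha + bg * (1.0 - alpha)

def target_pipeline(scene_linear: float, ui_srgb: float, ui_alpha: float) -> float:
    """Per-pixel model: 3D result (linear) -> first conversion -> UI blend in
    gamma space -> second conversion back to linear for output."""
    gamma_backbuffer = linear_to_srgb(scene_linear)        # first conversion channel
    composed = blend(ui_srgb, gamma_backbuffer, ui_alpha)  # UI pass blends in gamma
    return srgb_to_linear(composed)                        # second conversion channel

# Where no UI covers the pixel (alpha 0), the two conversions cancel exactly,
# so the 3D rendering result passes through unchanged.
untouched = target_pipeline(0.18, 0.0, 0.0)
# A fully opaque UI pixel (alpha 1) comes out as the linear form of its sRGB value.
opaque_ui = target_pipeline(0.18, 0.42, 1.0)
```

The round trip shows why the added channels are safe for the 3D image: the extra encode/decode pair is the identity wherever the UI does not contribute, while UI alpha blending happens in the space the assets were authored for.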
By applying the technical scheme of this embodiment, the three-dimensional scene resource data and the interface UI material data are obtained and the pre-configured target rendering pipeline is invoked. The three-dimensional scene resource data is 3D-rendered through the pipeline's default three-dimensional rendering channel; the color space of the 3D rendering result is converted into gamma space through the first conversion channel; the 3D rendering result and the UI materials are UI-rendered through the default UI rendering channel in gamma space, achieving correct alpha blending of the three-dimensional scene resources and the interface UI materials; and finally the color space of the UI rendering result is converted into linear space through the second conversion channel for output. This embodiment requires no change to the existing UI picture production workflow and no manual adjustment of the color space of UI materials, improves image production efficiency while avoiding color-precision loss, solves the color deviation that appears after a UI picture enters the game development engine for rendering, and improves the game's visual quality.
In this embodiment of the application, a customizable general rendering pipeline is modified; the modified target rendering pipeline solves the UI rendering problem while preserving the normal sRGB asset production process of UI designers. An example is the Universal Render Pipeline (URP) provided by the Unity game engine, which is a programmable render pipeline that supports pipeline customization.
In this embodiment of the application, the target rendering pipeline may be obtained by modifying a general rendering pipeline in the game engine, as shown in fig. 2, optionally, the target rendering pipeline may be generated through the following steps:
s1, determining a universal rendering pipeline in the game engine;
s2, adding a first conversion channel for converting the color space from the linear space to the gamma space after the three-dimensional rendering channel of the general rendering pipeline, and adding a second conversion channel for converting the color space from the gamma space to the linear space after the UI rendering channel of the general rendering pipeline to obtain the target rendering pipeline.
The target rendering pipeline of this application may be obtained by modifying the Universal Render Pipeline (URP) of the Unity game development engine. The URP contains several general rendering stages, such as a three-dimensional rendering channel (for 3D rendering), a UI rendering channel (for UI rendering), post-processing, and so on. As shown in fig. 3, after the general rendering pipeline is determined, color-space conversion channels are added after the three-dimensional rendering channel and the UI rendering channel. A first conversion channel is added after the three-dimensional rendering channel (3D pass) to convert from linear space to gamma space, so that the color space of the 3D rendering result is converted into gamma space after 3D rendering, ready for subsequent alpha blending with the UI in gamma space to obtain correct transparency. In addition, a second conversion channel is added after the UI pass to convert from gamma space to linear space, so that the color space of the UI rendering result is converted back into linear space after UI rendering, producing the final image in linear space and ensuring that the RGB colors are correct.
In this embodiment of the application, the rendering process in the rendering pipeline is implemented by a rendering camera, and optionally, the constructing process of the target rendering pipeline further includes:
when a new camera instruction for the target rendering pipeline is received, identifying whether a new camera indicated by the new camera instruction is a UI camera; if the newly-built camera is a UI camera, adding the newly-built camera at the tail of a camera stack of the target rendering pipeline; and if the newly-built camera is not the UI camera, inserting the newly-built camera in front of the UI camera in the camera stack of the target rendering pipeline.
In this embodiment, because the native color space of the rendering pipeline is linear, the color space is converted to gamma before the UI is rendered so that alpha blending yields correctly blended transparency. However, since the colors (RGB channels) have been converted into gamma space, a color deviation exists, so after UI rendering completes the result must be converted back to linear space to obtain both correct color and correct transparency. If any camera were to render after the UI camera, its output would again be blended in linear space and would once more produce wrong transparency; therefore the UI camera's rendering should be the last rendering stage of the pipeline. As shown in fig. 4, taking the camera stack of the Universal Render Pipeline (URP) of the Unity game development engine as an example: when a camera is to be newly added to the camera stack, the identity of the new camera is confirmed first. If it is a UI camera for UI rendering, it is appended at the end of the camera stack list; other cameras (for example, special-effect cameras) are inserted in front of the UI camera in the camera stack list. This ensures that the rendering order of the target rendering pipeline is 3D camera -> other cameras -> UI camera.
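The ordering rule can be sketched as a small list operation. The camera names and the helper below are illustrative only (Unity's actual camera stack is a list on the base camera, managed by the engine); the point is the invariant that UI cameras always stay at the tail:

```python
def insert_camera(stack: list, name: str, is_ui: bool, ui_cameras: set) -> None:
    """Maintain a URP-style camera stack so that UI cameras always render last."""
    if is_ui:
        ui_cameras.add(name)
        stack.append(name)                   # UI camera: append at the tail
        return
    for i, existing in enumerate(stack):     # non-UI: insert before the first UI camera
        if existing in ui_cameras:
            stack.insert(i, name)
            return
    stack.append(name)                       # no UI camera in the stack yet

ui_cameras: set = set()
stack: list = []
insert_camera(stack, "main_3d", is_ui=False, ui_cameras=ui_cameras)
insert_camera(stack, "ui", is_ui=True, ui_cameras=ui_cameras)
insert_camera(stack, "fx", is_ui=False, ui_cameras=ui_cameras)  # lands before "ui"
```

After the three insertions the stack reads `["main_3d", "fx", "ui"]`, matching the required 3D camera -> other cameras -> UI camera order even though the effect camera was created last.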
Further, as a refinement and an extension of the specific implementation of the above embodiment, in order to fully explain the specific implementation process of the embodiment, another game image processing method is provided, as shown in fig. 5, and the method includes:
step 201, identifying whether the target game scene contains a three-dimensional scene object.
Step 202, if yes, traversing a three-dimensional scene object in the target game scene, and acquiring the three-dimensional scene resource data corresponding to the three-dimensional scene object; and traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Step 203, if not, traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Step 204, writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, and rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline to obtain three-dimensional image cache data in a linear space; and converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel.
Optionally, the three-dimensional image cache data in the linear space obtained through the three-dimensional rendering channel is in a first data format; and the three-dimensional image cache data in the gamma space obtained through the first conversion channel is in a second data format.
In this embodiment, as shown in fig. 6, it is first determined whether the target game scene contains a three-dimensional scene object, i.e. a 3D scene object. If it does, the three-dimensional scene objects in the scene are traversed to acquire the three-dimensional scene resource data, 3D rendering is performed through a 3D pass, and the three-dimensional image cache data (3D cache) in the linear space is converted into the gamma space; the cache format of the three-dimensional image cache data in the linear space may be RGB111110Float (a first data format), and after conversion into the gamma space the cache format may be RGBA32UNorm (a second data format). The pre-made UI canvas objects (namely the UI canvas objects in fig. 4) are then traversed to acquire the interface UI material data. If the target game scene does not contain a three-dimensional scene object, the UI canvases are traversed directly to acquire the UI material data.
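The branch just described can be sketched as a small data-gathering routine. All names here (`gather_render_data`, the scene dictionary keys) are illustrative assumptions, not the patent's actual API; the point is only the control flow: traverse 3D objects when present, then always traverse the UI canvases.

```python
# Illustrative sketch of the scene-traversal branch; names are assumptions.
def gather_render_data(scene):
    """Collect 3D scene resource data and interface UI material data."""
    data = {"3d": [], "ui": []}
    # Traverse 3D scene objects only if the scene contains any.
    for obj in scene.get("objects_3d", []):
        data["3d"].append(obj)          # three-dimensional scene resource data
    # The UI canvases are traversed in every case.
    for canvas in scene.get("ui_canvases", []):
        data["ui"].append(canvas)       # interface UI material data
    return data

scene = {"objects_3d": ["terrain", "monster"], "ui_canvases": ["hud", "menu"]}
data = gather_render_data(scene)
```

A scene with no 3D objects simply yields an empty `"3d"` list and proceeds straight to the UI materials, mirroring steps 201-203.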
Step 205, converting the color space of the interface UI material data into a linear space, performing reverse correction on the interface UI material data in the linear space, and sampling the reverse-corrected interface UI material data so as to convert the sampled interface UI material data back into the gamma space.
In this embodiment, the color space of the UI material data is converted into linear space before the UI is rendered. This ensures that the UI image can be previewed with correct colors in the game development engine's editor; moreover, part of the UI may be rendered onto the linear 3D cache like a 3D object (e.g. a monster health bar), and converting the UI from gamma space into linear space ensures it displays accurately in the editor's linear space. In addition, based on the existing flow in the pipeline, the interface UI material converted into linear space can be reverse-corrected in the shader, and the reverse-corrected UI is then sampled and converted back into gamma space for the subsequent UI rendering step.
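The "reverse correction" step can be illustrated numerically. In this minimal sketch (assuming a power-law gamma of 2.2; function names are hypothetical), the authored gamma-space texel is decoded to linear for editor preview, and the reverse correction re-encodes it before sampling, so the value handed to the UI pass matches the original authored texel:

```python
# Sketch of reverse correction, under the assumption of a gamma-2.2 curve.
GAMMA = 2.2

def gamma_to_linear(c):
    # Decode used so the editor can preview the UI in linear space.
    return c ** GAMMA

def reverse_correct(c_linear):
    # Inverse of the decode, applied in the shader before sampling.
    return c_linear ** (1.0 / GAMMA)

authored = 0.6                        # gamma-space texel as authored
preview = gamma_to_linear(authored)   # value used for the linear editor preview
sampled = reverse_correct(preview)    # value sampled for the gamma-space UI pass
```

The round trip returns the authored value, which is the document's point: the linear conversion serves only preview and 3D-attached UI, and is undone before the gamma-space UI blend.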
Step 206, putting the three-dimensional image cache data in the gamma space and the sampled interface UI material data into a UI cache.
Step 207, performing UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through a UI rendering channel of the target rendering pipeline; and converting the color space of the UI rendering result into a linear space through the second conversion channel.
Step 208, displaying a game image containing the three-dimensional scene resources and the interface UI materials based on the UI rendering result in the linear space.
Optionally, the UI rendering result in the linear space obtained through the second conversion channel is in a third data format.
In this embodiment, the three-dimensional image cache data converted into gamma space is put into a UI cache as a UI material and rendered there together with the sampled interface UI material data. After the UI rendering is completed, the UI rendering result in the UI cache is converted into linear space to obtain a UI rendering result in a third data format (e.g. RGBA32sRGB), which is then drawn, and a game image containing the three-dimensional scene resources and the interface UI materials is displayed on the screen, ensuring that both the color and the transparency of the displayed game picture are correct. Fig. 7 shows a comparison of rendering effects provided by the embodiment of the application; as shown in fig. 7, a picture rendered with this scheme has a better transparency effect than one rendered by a current 3D engine.
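The final conversion back to linear space, stored as RGBA32 sRGB, amounts to an sRGB encode/decode. As an assumption (the patent does not fix the exact transfer function), the standard piecewise sRGB curve would look like this, and it round-trips exactly:

```python
# Standard piecewise sRGB transfer functions (IEC 61966-2-1 form);
# shown as an illustrative assumption, not the patent's mandated curve.

def srgb_encode(c):
    """Linear [0,1] -> sRGB-encoded value."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1 / 2.4)) - 0.055

def srgb_decode(c):
    """sRGB-encoded value -> linear [0,1]."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

x = 0.5
roundtrip = srgb_decode(srgb_encode(x))
```

Because the two functions are exact inverses, storing the UI rendering result in an sRGB-typed buffer loses no precision beyond quantization, consistent with the document's claim of correct final color.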
With the technical scheme of this embodiment, UI pictures are still produced in the gamma color space, and the color-space conversion is carried out inside the 3D engine's rendering pipeline, so no precision is lost in the UI picture source data, and the extra workload caused by developers' blind manual adjustment is avoided. Since the UI asset production stage keeps using the gamma color space, the workflow does not change and no additional workload is imposed on UI designers. This helps shorten the UI production cycle under a linear color space and enhances the universality of UI assets.
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides a game image processing apparatus, as shown in fig. 8, the apparatus includes:
the data acquisition module is used for acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
the first rendering module is used for writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline, and obtaining the three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel into a general rendering pipeline;
a second rendering module, configured to convert the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, put the three-dimensional image cache data in the gamma space and the interface UI material data into a UI cache of the target rendering pipeline, and perform UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through a UI rendering channel of the target rendering pipeline; and converting the color space of the UI rendering result into a linear space through the second conversion channel.
Optionally, the data obtaining module is specifically configured to:
traversing a three-dimensional scene object in the target game scene, and acquiring the three-dimensional scene resource data corresponding to the three-dimensional scene object;
and traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
Optionally, the data obtaining module is further configured to:
before traversing the three-dimensional scene object in the target game scene, identifying whether the target game scene contains the three-dimensional scene object;
if yes, executing the traversal of the three-dimensional scene object in the target game scene;
and if not, executing the traversal of the UI canvas object corresponding to the target game scene.
Optionally, the first rendering module is further configured to:
converting the color space of the interface UI material data into a linear space, carrying out reverse correction on the interface UI material data in the linear space, and sampling the interface UI material data subjected to the reverse correction so as to convert the sampled interface UI material data back to a gamma space;
and putting the three-dimensional image cache data in the gamma space and the sampled interface UI material data into a UI cache.
Optionally, the apparatus further comprises: a pipeline retrofit module for:
before acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, determining a universal rendering pipeline in a game engine;
adding a first conversion channel for converting the color space from linear space to gamma space after the three-dimensional rendering channel of the generic rendering pipeline, and adding a second conversion channel for converting the color space from gamma space to linear space after the UI rendering channel of the generic rendering pipeline, to obtain the target rendering pipeline.
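The pipeline modification described by this module can be sketched as inserting the two conversion passes at fixed positions. Pass names below are illustrative assumptions, not Unity/URP identifiers; the sketch only captures the ordering constraint (first conversion after the 3D pass, second conversion after the UI pass):

```python
# Minimal sketch of building the target rendering pipeline from the
# generic pipeline's pass list; pass names are assumptions.
base_pipeline = ["3d_pass", "ui_pass"]

def build_target_pipeline(passes):
    out = []
    for p in passes:
        out.append(p)
        if p == "3d_pass":
            out.append("linear_to_gamma")   # first conversion channel
        elif p == "ui_pass":
            out.append("gamma_to_linear")   # second conversion channel
    return out

target = build_target_pipeline(base_pipeline)
```

The resulting order, 3D pass -> linear-to-gamma -> UI pass -> gamma-to-linear, is exactly the four-stage flow the method claims describe.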
Optionally, the pipeline retrofit module is further configured to:
when a camera-creation instruction for the target rendering pipeline is received, identifying whether the camera indicated by the instruction is a UI camera;
if the new camera is a UI camera, appending the new camera at the end of a camera stack of the target rendering pipeline;
and if the new camera is not a UI camera, inserting the new camera in front of the UI camera in the camera stack of the target rendering pipeline.
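The camera-stack insertion rule above can be sketched as follows. The stack representation and function name are assumptions for illustration, not the engine's actual camera-stack API; the invariant being enforced is that the UI camera is always last:

```python
# Sketch of the camera-stack insertion rule; names are assumptions.
def insert_camera(stack, camera, is_ui):
    """Insert a camera into `stack` (a list of (name, is_ui) tuples)."""
    if is_ui:
        stack.append((camera, True))        # UI camera goes at the very end
        return
    for i, (_, ui) in enumerate(stack):
        if ui:
            stack.insert(i, (camera, False))  # non-UI camera goes before the UI camera
            return
    stack.append((camera, False))           # no UI camera yet: append normally

stack = []
insert_camera(stack, "3d_main", False)
insert_camera(stack, "ui", True)
insert_camera(stack, "vfx", False)   # a special-effect camera created later
order = [name for name, _ in stack]
```

Even though the special-effect camera is created after the UI camera, it lands in front of it, preserving the 3D camera -> other cameras -> UI camera rendering order.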
Optionally, the apparatus further comprises:
and the display module is used for displaying the game image containing the three-dimensional scene resource and the interface UI material based on the UI rendering result in the linear space after the color space of the UI rendering result is converted into the linear space through the second conversion channel.
Optionally, the three-dimensional image cache data in the linear space obtained through the three-dimensional rendering channel is in a first data format; the three-dimensional image cache data in the gamma space obtained through the first conversion channel is in a second data format; and the UI rendering result in the linear space obtained through the second conversion channel is in a third data format.
It should be noted that, other corresponding descriptions of the functional units related to the game image processing apparatus provided in the embodiment of the present application may refer to the corresponding descriptions in the methods in fig. 1 to fig. 6, and are not repeated herein.
Based on the methods shown in fig. 1 to 6, correspondingly, the present application further provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the game image processing method shown in fig. 1 to 6.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the method according to the implementation scenarios of the present application.
Based on the above methods shown in fig. 1 to fig. 6 and the virtual device embodiment shown in fig. 8, in order to achieve the above object, an embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the above-described game image processing method as shown in fig. 1 to 6.
Optionally, the computer device may also include a user interface, a network interface, a camera, Radio Frequency (RF) circuitry, sensors, audio circuitry, a WI-FI module, and so forth. The user interface may include a Display screen (Display), an input unit such as a keypad (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., a bluetooth interface, WI-FI interface), etc.
It will be appreciated by those skilled in the art that the computer device structure provided in this embodiment does not limit the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may further include an operating system and a network communication module. An operating system is a program that manages and maintains the hardware and software resources of a computer device, supporting the operation of information handling programs, as well as other software and/or programs. The network communication module is used for realizing communication among components in the storage medium and other hardware and software in the entity device.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application can be implemented by software plus a necessary general hardware platform, or by hardware. The scheme acquires three-dimensional scene resource data and interface UI material data, calls a pre-configured target rendering pipeline, performs 3D rendering on the three-dimensional scene resource data through the pipeline's default three-dimensional rendering channel, converts the color space of the 3D rendering result into gamma space through the first conversion channel, performs UI rendering on the 3D rendering result and the UI materials in gamma space through the default UI rendering channel so as to blend the three-dimensional scene resources with the interface UI materials, and finally converts the color space of the UI rendering result into linear space through the second conversion channel so that the final image can be drawn with correct color and transparency. The embodiment of the application does not require changing the existing UI picture production workflow or manually adjusting the drawing space of the UI materials, improves image production efficiency while avoiding loss of color precision, solves the color-difference problem that arises when current UI pictures are rendered in a game development engine, and improves the expressiveness of the game.
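The end-to-end flow summarized above can be condensed into a toy frame function. This is a sketch under stated assumptions (single-channel colors, a power-law gamma of 2.2, illustrative names), not the patent's actual implementation:

```python
# Toy end-to-end sketch of the target pipeline's color-space flow.
G = 2.2
lin = lambda c: c ** G          # gamma -> linear (second conversion channel)
gam = lambda c: c ** (1.0 / G)  # linear -> gamma (first conversion channel)

def render_frame(scene_color_linear, ui_color_gamma, ui_alpha):
    cache_3d = scene_color_linear            # 3D pass output, linear space
    cache_3d_gamma = gam(cache_3d)           # first conversion channel
    # UI pass: alpha blend in gamma space for correct transparency.
    ui_result = ui_color_gamma * ui_alpha + cache_3d_gamma * (1.0 - ui_alpha)
    return lin(ui_result)                    # second conversion channel

out = render_frame(0.25, 0.9, 0.5)
```

With full UI opacity the output reduces to the decoded UI color, and with zero opacity it reduces to the untouched 3D scene color, confirming that the two conversion channels cancel where they should.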
Those skilled in the art will appreciate that the figures are merely schematic diagrams of a preferred implementation scenario, and that the modules or flows in the figures are not necessarily required to practice the present application. Those skilled in the art will also appreciate that the modules in the devices of the implementation scenario may be distributed among the devices as described, or may be located, with corresponding changes, in one or more devices different from the present implementation scenario. The modules of the implementation scenario may be combined into one module, or further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (10)

1. A game image processing method, comprising:
acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, and rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline to obtain three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel in a general rendering pipeline;
converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, putting the three-dimensional image cache data and the interface UI material data in the gamma space into a UI cache of the target rendering pipeline, and carrying out UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through the UI rendering channel of the target rendering pipeline;
and converting the color space of the UI rendering result into a linear space through the second conversion channel.
2. The method according to claim 1, wherein the placing the three-dimensional image cache data in the gamma space and the interface UI material data into the UI cache of the target rendering pipeline specifically comprises:
converting the color space of the interface UI material data into a linear space, carrying out reverse correction on the interface UI material data in the linear space, and sampling the interface UI material data subjected to the reverse correction so as to convert the sampled interface UI material data back to a gamma space;
and putting the three-dimensional image cache data in the gamma space and the sampled interface UI material data into a UI cache.
3. The method according to claim 1, wherein before the acquiring the three-dimensional scene resource data and the interface UI material data corresponding to the target game scene, the method further comprises:
determining a generic rendering pipeline in a game engine;
adding a first conversion channel for converting a color space from a linear space to a gamma space after a three-dimensional rendering channel of the generic rendering pipeline, and adding a second conversion channel for converting a color space from a gamma space to a linear space after a UI rendering channel of the generic rendering pipeline to obtain the target rendering pipeline.
4. The method of claim 3, further comprising:
when a new camera instruction for the target rendering pipeline is received, identifying whether a new camera indicated by the new camera instruction is a UI camera;
if the newly-built camera is a UI camera, adding the newly-built camera at the tail of a camera stack of the target rendering pipeline;
and if the newly-built camera is not the UI camera, inserting the newly-built camera in front of the UI camera in the camera stack of the target rendering pipeline.
5. The method according to any one of claims 1 to 4, wherein after the converting the color space of the UI rendering result into the linear space through the second conversion channel, the method further comprises:
and displaying a game image containing the three-dimensional scene resources and the interface UI materials based on the UI rendering result in the linear space.
6. The method according to any one of claims 1 to 4,
three-dimensional image cache data in a linear space obtained through the three-dimensional rendering channel is in a first data format; the three-dimensional image cache data in the gamma space obtained through the first conversion channel is in a second data format; and the UI rendering result in the linear space obtained through the second conversion channel is in a third data format.
7. The method according to any one of claims 1 to 4, wherein the acquiring of the three-dimensional scene resource data and the interface UI material data corresponding to the target game scene specifically includes:
traversing a three-dimensional scene object in the target game scene, and acquiring the three-dimensional scene resource data corresponding to the three-dimensional scene object;
and traversing the UI canvas object corresponding to the target game scene to acquire the interface UI material data corresponding to the UI canvas object.
8. A game image processing apparatus, comprising:
the data acquisition module is used for acquiring three-dimensional scene resource data and interface UI material data corresponding to a target game scene, wherein the color space of the three-dimensional scene resource data is a linear space, and the color space of the interface UI material data is a gamma space;
the first rendering module is used for writing the three-dimensional scene resource data into a three-dimensional image cache of a target rendering pipeline, rendering the three-dimensional scene resource data in the three-dimensional image cache through a three-dimensional rendering channel of the target rendering pipeline, and obtaining the three-dimensional image cache data in a linear space, wherein the target rendering pipeline is obtained by adding a first conversion channel and a second conversion channel into a general rendering pipeline;
the second rendering module is used for converting the color space of the three-dimensional image cache data into a gamma space through the first conversion channel, putting the three-dimensional image cache data and the interface UI material data in the gamma space into a UI cache of the target rendering pipeline, and performing UI rendering on the three-dimensional image cache data and the interface UI material data in the UI cache through the UI rendering channel of the target rendering pipeline; and converting the color space of the UI rendering result into a linear space through the second conversion channel.
9. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any of claims 1 to 7.
10. A computer device comprising a storage medium, a processor and a computer program stored on the storage medium and executable on the processor, characterized in that the processor implements the method of any one of claims 1 to 7 when executing the computer program.
CN202210542191.9A 2022-05-18 2022-05-18 Game image processing method and device, storage medium and computer equipment Pending CN114882164A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210542191.9A CN114882164A (en) 2022-05-18 2022-05-18 Game image processing method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210542191.9A CN114882164A (en) 2022-05-18 2022-05-18 Game image processing method and device, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
CN114882164A true CN114882164A (en) 2022-08-09

Family

ID=82674991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210542191.9A Pending CN114882164A (en) 2022-05-18 2022-05-18 Game image processing method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN114882164A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110975284A (en) * 2019-12-06 2020-04-10 珠海金山网络游戏科技有限公司 Unity-based NGUI resource rendering processing method and device
CN111068314A (en) * 2019-12-06 2020-04-28 珠海金山网络游戏科技有限公司 Unity-based NGUI resource rendering processing method and device
CN114307144A (en) * 2021-12-31 2022-04-12 上海完美时空软件有限公司 Image processing method and device, storage medium and computer equipment
CN114307143A (en) * 2021-12-31 2022-04-12 上海完美时空软件有限公司 Image processing method and device, storage medium and computer equipment


Similar Documents

Publication Publication Date Title
CN108876887B (en) Rendering method and device
CN112102437B (en) Canvas-based radar map generation method and device, storage medium and terminal
CN109874048B (en) Video window assembly semitransparent display method and device and computer equipment
CN111068314B (en) NGUI resource rendering processing method and device based on Unity
US10664980B2 (en) Vector graphics handling processes for user applications
CN106651999A (en) Method and device for accelerating frame animation loading
EP2595117A1 (en) Method for enabling animation during screen switching and mobile terminal
JP6028527B2 (en) Display processing apparatus, display processing method, and program
CN113313802A (en) Image rendering method, device and equipment and storage medium
JP2021006982A (en) Method and device for determining character color
CN109829963B (en) Image drawing method and device, computing equipment and storage medium
CN110782387A (en) Image processing method and device, image processor and electronic equipment
CN108364335A (en) A kind of animation method for drafting and device
CN109615583B (en) Game map generation method and device
CN114307143A (en) Image processing method and device, storage medium and computer equipment
CN111930461B (en) Mobile terminal APP full page graying method and device based on Android
CN114307144A (en) Image processing method and device, storage medium and computer equipment
CN114882164A (en) Game image processing method and device, storage medium and computer equipment
CN110975284A (en) Unity-based NGUI resource rendering processing method and device
CN112486476A (en) Map generation method, map generation device, storage medium and computer equipment
CN115471592A (en) Dynamic image processing method and system
CN111179390A (en) Method and device for efficiently previewing CG assets
CN113867857B (en) Progress bar display method, device and equipment based on Android system
JP2010282200A (en) Structure of animation font file and text-displaying method for mobile terminal
CN112367399B (en) Filter effect generation method and device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination