CN114307143A - Image processing method and device, storage medium and computer equipment - Google Patents

Image processing method and device, storage medium and computer equipment

Info

Publication number
CN114307143A
Authority
CN
China
Prior art keywords
scene
rendering
texture
interface
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111675781.0A
Other languages
Chinese (zh)
Inventor
朱瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Perfect Time And Space Software Co ltd
Original Assignee
Shanghai Perfect Time And Space Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Perfect Time And Space Software Co ltd filed Critical Shanghai Perfect Time And Space Software Co ltd
Priority to CN202111675781.0A
Publication of CN114307143A
Pending legal-status Critical Current

Landscapes

  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method and device, a storage medium, and computer equipment. The method comprises: acquiring an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space and the first scene rendering texture is applied in a linear space; converting the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture, and performing alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture; and converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture. The scheme eliminates the color difference that appears when maps produced in image processing software are imported into a game development engine, and improves the expressive power of the game.

Description

Image processing method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a storage medium, and a computer device.
Background
Besides the in-game scene, a game display includes an interactive interface (UI). The in-game scene is generally rendered in linear space by the game engine, while the interactive interface is usually produced in gamma space by image processing software such as Photoshop. When a picture exported from the image processing software is blended with the in-game scene picture in linear space by the game engine, the difference between their color spaces introduces a color difference into the game picture, and the display effect is poor.
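The mismatch can be illustrated numerically: blending the same two pixel values with the same alpha gives different results depending on whether the blend happens in gamma space or in linear space. The sketch below uses a plain power-2.2 curve as a stand-in for the exact sRGB transfer function, and the pixel values are illustrative only.

```python
# Blending the same colors in gamma space vs. linear space gives different
# results; this mismatch is the source of the in-game color difference.
# A plain power-2.2 curve approximates the sRGB transfer function here.

def gamma_to_linear(c):
    return c ** 2.2

def linear_to_gamma(c):
    return c ** (1 / 2.2)

ui = 0.5      # UI pixel value, authored in gamma space (e.g. in Photoshop)
scene = 0.25  # scene pixel value, gamma-encoded here for comparison
alpha = 0.5   # blend factor

# What the artist saw in Photoshop: a straight blend in gamma space.
blend_in_gamma = alpha * ui + (1 - alpha) * scene

# What a linear-space engine produces: decode, blend, re-encode.
blend_in_linear = linear_to_gamma(
    alpha * gamma_to_linear(ui) + (1 - alpha) * gamma_to_linear(scene)
)

print(blend_in_gamma, blend_in_linear)  # the two blends do not match
```

With these inputs the gamma-space blend is exactly 0.375 while the linear-space blend comes out noticeably brighter, which is why artwork previewed in Photoshop looks different once composited by the engine.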
Disclosure of Invention
In view of the above, the present application provides an image processing method and apparatus, a storage medium, and a computer device.
According to an aspect of the present application, there is provided an image processing method including:
acquiring an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
converting the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture, and performing alpha mixing on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture;
and converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
Optionally, before obtaining the interface rendering map and the first scene rendering texture of the target game scene, the method further includes:
cutting the initial interface rendering graph to obtain a plurality of interface rendering pixels;
storing information related to a plurality of the interface rendering pixels in a cutting table.
Optionally, the obtaining of the interface rendering map of the target game scene specifically includes:
and importing the plurality of interface rendering pixels and the cutting table into a game engine, and splicing the interface rendering pixels into the interface rendering map through the game engine based on the relevant information in the cutting table.
Optionally, before the alpha blending is performed on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture, the method further includes:
and adding a UI control in the interface rendering graph based on the related information of the interface rendering pixels.
Optionally, the method further comprises:
adding a camera stack in the game engine, wherein the camera stack sequentially comprises a first camera, a UI camera and a second camera;
setting attributes of the first camera and the second camera as overlapping cameras, respectively.
Optionally, the converting the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture, and performing alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture specifically includes:
converting, by the first camera, the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture; and performing alpha mixing on the second scene rendering texture and the interface rendering graph through the UI camera to obtain a third scene rendering texture.
Optionally, the converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture specifically includes:
converting, by the second camera, the third scene rendering texture from the gamma space to the linear space, resulting in a target scene rendering texture.
Optionally, the method further comprises:
obtaining font information of the target game scene, wherein the font information comprises a character transparency parameter;
and correcting the character transparency parameter according to a preset transparency correction equation, wherein the preset transparency correction equation is obtained by fitting sample initial transparency parameters to sample corrected transparency parameters.
Optionally, after the converting the third scene rendering texture from the gamma space to the linear space to obtain the target scene rendering texture, the method further includes:
and rendering according to the target scene rendering texture and the source color correction parameter corresponding to the target scene rendering texture to obtain a scene graph of the target game scene.
According to another aspect of the present application, there is provided an image processing apparatus including:
the image acquisition module is used for acquiring an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
the image correction module is used for converting the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture, and performing alpha mixing on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture;
and the texture generating module is used for converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
Optionally, the apparatus further comprises:
the map cutting module is used for cutting the initial interface rendering map to obtain a plurality of interface rendering pixels before the interface rendering map and the first scene rendering texture of the target game scene are obtained; storing information related to a plurality of the interface rendering pixels in a cutting table.
Optionally, the image obtaining module is specifically configured to: import the plurality of interface rendering pixels and the cutting table into a game engine, and splice the interface rendering pixels into the interface rendering map through the game engine based on the relevant information in the cutting table.
Optionally, the apparatus further comprises:
and the control adding module is used for adding a UI control in the interface rendering graph based on the relevant information of the interface rendering pixel.
Optionally, the apparatus further comprises:
the game engine comprises an engine configuration module, a game processing module and a game processing module, wherein the engine configuration module is used for adding a camera stack in the game engine, and the camera stack sequentially comprises a first camera, a UI camera and a second camera; setting attributes of the first camera and the second camera as overlapping cameras, respectively.
Optionally, the image correction module is specifically configured to: converting, by the first camera, the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture; and performing alpha mixing on the second scene rendering texture and the interface rendering graph through the UI camera to obtain a third scene rendering texture.
Optionally, the texture generating module is specifically configured to: converting, by the second camera, the third scene rendering texture from the gamma space to the linear space, resulting in a target scene rendering texture.
Optionally, the apparatus further comprises:
the font correction module is used for acquiring font information of the target game scene, wherein the font information comprises a character transparency parameter; and correcting the character transparency parameter according to a preset transparency correction equation, wherein the preset transparency correction equation is obtained by fitting sample initial transparency parameters to sample corrected transparency parameters.
Optionally, the apparatus further comprises:
and the rendering module is used for, after the third scene rendering texture is converted from the gamma space to the linear space to obtain the target scene rendering texture, rendering according to the target scene rendering texture and the source color correction parameter corresponding to the target scene rendering texture to obtain a scene graph of the target game scene.
According to yet another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described image processing method.
According to yet another aspect of the present application, there is provided a computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the image processing method when executing the program.
By means of the above technical solution, the image processing method and device, the storage medium, and the computer equipment provided by the application perform spatial conversion on the rendering texture of a game scene in linear space, converting it from the linear space to the gamma space, alpha-blend it with the interface rendering map in the gamma space, and then convert the blended result back to the linear space in a unified way. Game developers no longer need to manually adjust the color space of art assets or correct images by hand, which improves production efficiency, solves the color-difference problem that arises after maps from image processing software enter the game development engine, and improves the expressive power of the game.
The foregoing is merely an overview of the technical solutions of the present application. To make the technical means of the present application clearer and implementable according to the description, and to make the above and other objects, features, and advantages more readily understandable, detailed embodiments of the present application are set forth below.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 2 shows a schematic flowchart of another image processing method according to an embodiment of the present application;
fig. 3 shows a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In the present embodiment, there is provided an image processing method, as shown in fig. 1, including:
step 101, obtaining an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
102, converting the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture, and performing alpha mixing on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture;
step 103, converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
The embodiment of the application addresses the color difference that arises when a scene rendering map produced in linear space is blended with an interface rendering map generated in gamma space. Spatial conversion is performed on the rendering texture of the scene rendering map: it is converted from the linear space to the gamma space, alpha-blended there with the interface rendering map, and then converted back to the linear space in a unified way. This solves the problem that pictures exported from image processing software such as Photoshop change color when blended in the linear space of a game engine such as Unity, making the artistic effect inconsistent.
In the above embodiment, an interface rendering map of any target game scene in the game and the corresponding first scene rendering texture are first obtained. The scene rendering map may be a picture produced in linear space by the game developer through a game engine or other software for game display, the first scene rendering texture is the rendering texture required to render that map, and the interface rendering map may be produced in gamma space through image processing software such as Photoshop. Next, spatial conversion is performed on the first scene rendering texture: it is converted from the linear space into the gamma space to obtain a second scene rendering texture, which is then alpha-blended with the gamma-space interface rendering map. The result of the blending, a texture showing the scene as seen through the interface, is the third scene rendering texture. Finally, the third scene rendering texture is converted from the gamma space back to the linear space, so that the game picture finally rendered from the resulting target scene rendering texture has no color difference, improving the effect of the game picture.
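Assuming a simple power-2.2 transfer curve as a stand-in for the exact conversion (which the application does not specify), the three steps above can be sketched per pixel as follows; all function and variable names are illustrative.

```python
# Per-pixel sketch of steps 101-103: convert the linear-space scene texture
# to gamma space, alpha-blend it with the gamma-space UI map, convert back.

def gamma_to_linear(c):
    return c ** 2.2

def linear_to_gamma(c):
    return c ** (1 / 2.2)

def blend_ui_over_scene(scene_rgb_linear, ui_rgba_gamma):
    """scene_rgb_linear: first scene rendering texture pixel (linear space).
    ui_rgba_gamma: interface rendering map pixel, RGB plus alpha (gamma space).
    Returns the target scene rendering texture pixel (linear space)."""
    ui_rgb, a = ui_rgba_gamma[:3], ui_rgba_gamma[3]
    # Step 102: linear -> gamma gives the second scene rendering texture ...
    second = [linear_to_gamma(c) for c in scene_rgb_linear]
    # ... which is alpha-blended with the UI map into the third texture.
    third = [a * u + (1 - a) * s for u, s in zip(ui_rgb, second)]
    # Step 103: gamma -> linear yields the target scene rendering texture.
    return [gamma_to_linear(c) for c in third]

result = blend_ui_over_scene((0.2, 0.4, 0.6), (0.9, 0.1, 0.5, 0.5))
```

A quick sanity check of the design: with alpha 0 the UI contributes nothing and the round trip linear-to-gamma-to-linear returns the scene pixel unchanged, so fully transparent UI regions leave the scene picture intact.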
By applying the technical scheme of this embodiment, the rendering texture of a game scene in linear space is spatially converted from the linear space to the gamma space, alpha-blended with the interface rendering map in the gamma space, and then converted back to the linear space in a unified way. Game developers no longer need to manually adjust the color space of art assets or correct images by hand, which improves production efficiency, solves the color-difference problem that arises after maps from image processing software enter the game development engine, and improves the expressive power of the game.
Further, as a refinement and an extension of the specific implementation of the above embodiment, in order to fully illustrate the specific implementation process of the embodiment, another image processing method is provided, as shown in fig. 2, and the method includes:
step 201, cutting an initial interface rendering graph to obtain a plurality of interface rendering pixels; storing information related to a plurality of the interface rendering pixels in a cutting table.
Step 202, importing the plurality of interface rendering pixels and the cutting table into a game engine, splicing the interface rendering pixels into the interface rendering map through the game engine based on the relevant information in the cutting table, and acquiring a first scene rendering texture of the target game scene.
In the embodiment of the application, an initial interface rendering map produced in image processing software such as Photoshop may be cut automatically by an editing script into a plurality of interface rendering pixels. Specifically, the script cuts the map according to a naming specification preset in Photoshop, using related information such as the position of each pixel, and stores that information in a cutting table that follows the naming specification, for example an xml file. The xml cutting table can then be imported into a game engine, which parses the position and other related information of each pixel from the xml and instantiates each pixel in the scene, so that the pixels are spliced in the engine into a complete interface rendering map.
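A minimal sketch of reading such a cutting table follows. The xml schema (element and attribute names) is an assumption made for illustration; the application states only that the related information is stored in an xml file.

```python
# Hypothetical cutting-table reader: the engine-side step that parses each
# interface rendering pixel's position and size out of the imported xml.
import xml.etree.ElementTree as ET

CUT_TABLE_XML = """<slices>
  <slice name="btn_ok" x="0" y="0"  w="64"  h="32"/>
  <slice name="hp_bar" x="0" y="32" w="128" h="16"/>
</slices>"""

def load_cut_table(xml_text):
    """Map each slice name to its (x, y, w, h) rectangle."""
    root = ET.fromstring(xml_text)
    return {
        s.get("name"): tuple(int(s.get(k)) for k in ("x", "y", "w", "h"))
        for s in root.iter("slice")
    }

slices = load_cut_table(CUT_TABLE_XML)  # used to splice the pixels back together
```

The rectangles recovered here are what the engine would use to place each cut pixel at its original position when reassembling the complete interface rendering map.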
Step 203, adding a UI control in the interface rendering graph based on the relevant information of the interface rendering pixel.
Step 204, converting the first scene rendering texture from the linear space to the gamma space through a first camera, and generating a second scene rendering texture; and carrying out alpha mixing on the second scene rendering texture and the interface rendering graph through a UI camera to obtain a third scene rendering texture.
Step 205, converting the third scene rendering texture from the gamma space to the linear space through a second camera to obtain a target scene rendering texture.
In this embodiment, the game engine may further determine, from the related information of the interface rendering pixels in the cutting table combined with the naming rule, the UI control for the pixel at each position, and add the corresponding UI control to the interface rendering map. Further, the embodiment of the application can complete the processing flow of two spatial conversions plus alpha blending by adding a camera stack to the game engine in advance. Optionally, a camera stack is added to the game engine, wherein the camera stack sequentially comprises a first camera, a UI camera, and a second camera, and the attributes of the first camera and the second camera are each set to overlay cameras.
In this embodiment, a Linear-to-Gamma camera first converts the rendering texture of the main camera (MainCamera) into the gamma space, the rendering texture and the UI map are alpha-blended in the gamma space, and a Gamma-to-Linear camera then converts the result back into the linear space in a unified way. The flow can be implemented by adding a camera stack, specifically: Linear to Gamma Camera > UICamera > Gamma to Linear Camera. It should be noted that, in an actual application scenario, the two cameras, namely the first camera (Linear to Gamma) and the second camera (Gamma to Linear), may be created and each set to the Overlay camera mode. To save engine performance and improve processing efficiency, the following may also be configured for each: the camera Tag is set to Linear to Gamma or Gamma to Linear respectively (the tags can be used for progress judgment in the post-processing flow), post-processing is disabled, Clear Depth is disabled, Render Shadows is disabled, and the Culling Mask is set to Nothing.
Thus, through two post-processing passes, the current rendering texture of the three-dimensional scene is first converted from the linear space to the gamma space, alpha-blended with the interface image in the gamma space, and then converted back to the linear space, realizing color correction of the rendering texture and eliminating the color difference in semi-transparent effects.
And step 206, rendering according to the target scene rendering texture and the source color correction parameter corresponding to the target scene rendering texture to obtain a scene graph of the target game scene.
Because the shader produces linear data, in the "linear color space" the render target is an sRGB frame buffer: the linear data is converted to sRGB, and the previewed image looks redder than the intended effect. To address this, in the embodiment of the application, a LUT color correction may be applied once while rendering the image. Specifically, in the process of rendering through the target scene rendering texture, the color data of each pixel in the scene to be rendered is computed from that texture and written to the frame buffer, and the image is then rendered by reading the frame buffer. To prevent a color difference caused by rendering the raw data directly, the color data of each pixel in the frame buffer is corrected with the source color correction parameter, and the corrected color data is written back to the frame buffer so that subsequent rendering uses the corrected values. The correction may be performed by adding corresponding preset offsets to the RGB channel values of each pixel's color data, for example R channel +0.008, G channel +0.008, and B channel +0.02.
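The per-channel correction described above can be sketched as follows, using the offset values quoted in the text (R +0.008, G +0.008, B +0.02); the clamp to the [0, 1] range is an added safeguard, not something stated in the application.

```python
# Source color correction: add fixed per-channel offsets to each pixel in the
# frame buffer before the final read-back, clamping to the valid [0, 1] range.
OFFSETS = (0.008, 0.008, 0.02)  # R, G, B offsets quoted in the description

def correct_pixel(rgb):
    return tuple(min(1.0, max(0.0, c + o)) for c, o in zip(rgb, OFFSETS))

def correct_frame_buffer(pixels):
    """Apply the correction to every RGB pixel in a frame buffer."""
    return [correct_pixel(p) for p in pixels]

corrected = correct_frame_buffer([(0.5, 0.5, 0.5), (1.0, 1.0, 0.99)])
```

The larger blue offset pushes the whole image slightly away from red, which matches the stated goal of compensating a preview that looks redder than the intended effect.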
In the embodiment of the present application, optionally, the method further includes: obtaining font information of the target game scene, wherein the font information comprises a character transparency parameter; and correcting the character transparency parameter according to a preset transparency correction equation, wherein the preset correction equation is obtained by fitting the sample transparency initial parameter and the sample transparency correction parameter.
Because the embodiment of the application performs image color correction in a custom way rather than with the game engine's built-in function, the sRGB option must be unchecked for interface rendering maps in the engine. The sRGB option cannot be unchecked for fonts, however, because fonts are filled with solid color, and as a result the fonts would appear blackened and darker. If the game scene contains font information, the font can be corrected by applying ordinary gamma correction to its three RGB values (i.e., pow(RGB, 0.454545)). However, directly gamma-correcting the semi-transparent or gradient-transparent pixels produced in Photoshop in the engine causes the picture's transparency gradient to display incorrectly, because the game engine Unity applies its own curve change to semi-transparency correction. The embodiment of the application therefore obtains a change curve by color-correcting sample transparency data, and corrects the character transparency with the fitted curve. After the character transparency parameter of the font is corrected, when the scene graph is finally rendered, the color data of each pixel in the scene can be calculated from the corrected character transparency parameter and stored in the frame buffer for the subsequent rendering of the graph.
Wherein, an optional form of the fitting equation is:
Y = a + b·X + c·X^2 + d·X^3 + e·X^4 + f·X^5 + g·X^6 + h·X^7 + i·X^8 + j·X^9 + k·X^10
where a, b, c, d, e, f, g, h, i, j and k are constants obtained by fitting, X is the character transparency parameter before correction, and Y is the character transparency parameter after correction.
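Applying the fitted equation reduces to evaluating a degree-10 polynomial, which can be done compactly with Horner's method, as sketched below. The coefficients here are placeholders; in practice a through k come from fitting the sample transparency pairs.

```python
# Evaluate Y = a + b*X + c*X^2 + ... + k*X^10 via Horner's method.
def correct_transparency(x, coeffs):
    """coeffs = (a, b, ..., k), lowest power first; x is the uncorrected
    character transparency parameter, the return value the corrected one."""
    y = 0.0
    for c in reversed(coeffs):
        y = y * x + c
    return y

# Placeholder coefficients: identity plus a small quadratic term.
COEFFS = (0.0, 1.0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0)
alpha_fixed = correct_transparency(0.5, COEFFS)
```

Horner's method needs only ten multiplications and ten additions per evaluation, which keeps the per-character cost of the correction negligible at render time.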
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides an image processing apparatus, as shown in fig. 3, the apparatus includes:
the image acquisition module is used for acquiring an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
the image correction module is used for converting the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture, and performing alpha mixing on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture;
and the texture generating module is used for converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
Optionally, the apparatus further comprises:
the map cutting module is used for cutting the initial interface rendering map to obtain a plurality of interface rendering pixels before the interface rendering map and the first scene rendering texture of the target game scene are obtained; storing information related to a plurality of the interface rendering pixels in a cutting table.
Optionally, the image obtaining module is specifically configured to: import the plurality of interface rendering pixels and the cutting table into a game engine, and splice the interface rendering pixels into the interface rendering map through the game engine based on the relevant information in the cutting table.
Optionally, the apparatus further comprises:
and the control adding module is used for adding a UI control in the interface rendering graph based on the relevant information of the interface rendering pixel.
Optionally, the apparatus further comprises:
the game engine comprises an engine configuration module, a game processing module and a game processing module, wherein the engine configuration module is used for adding a camera stack in the game engine, and the camera stack sequentially comprises a first camera, a UI camera and a second camera; setting attributes of the first camera and the second camera as overlapping cameras, respectively.
Optionally, the image correction module is specifically configured to: converting, by the first camera, the first scene rendering texture from the linear space to the gamma space, generating a second scene rendering texture; and performing alpha mixing on the second scene rendering texture and the interface rendering graph through the UI camera to obtain a third scene rendering texture.
Optionally, the texture generating module is specifically configured to: converting, by the second camera, the third scene rendering texture from the gamma space to the linear space, resulting in a target scene rendering texture.
Optionally, the apparatus further comprises:
the font correction module is used for acquiring font information of the target game scene, wherein the font information comprises a character transparency parameter; and correcting the character transparency parameter according to a preset transparency correction equation, wherein the preset transparency correction equation is obtained by fitting sample initial transparency parameters to sample corrected transparency parameters.
Optionally, the apparatus further comprises:
and the rendering module is used for, after the third scene rendering texture is converted from the gamma space to the linear space to obtain the target scene rendering texture, rendering according to the target scene rendering texture and the source color correction parameter corresponding to the target scene rendering texture to obtain a scene graph of the target game scene.
It should be noted that other corresponding descriptions of the functional units related to the image processing apparatus provided in the embodiment of the present application may refer to the corresponding descriptions in the methods in fig. 1 to fig. 2, and are not repeated herein.
Based on the method shown in fig. 1 to 2, correspondingly, the present application further provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the image processing method shown in fig. 1 to 2.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods of the implementation scenarios of the present application.
Based on the above methods shown in fig. 1 to fig. 2 and the virtual device embodiment shown in fig. 3, in order to achieve the above object, an embodiment of the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the image processing method as described above and illustrated in fig. 1 to 2.
Optionally, the computer device may further include a user interface, a network interface, a camera, radio frequency (RF) circuitry, sensors, audio circuitry, a Wi-Fi module, and so on. The user interface may include a display screen and an input unit such as a keyboard, and optionally may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., a Bluetooth or Wi-Fi interface), etc.
It will be appreciated by those skilled in the art that the computer device structure provided in this embodiment does not limit the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may further include an operating system and a network communication module. The operating system is a program that manages and maintains the hardware and software resources of the computer device and supports the operation of information processing programs and other software and/or programs. The network communication module is used to implement communication between the components within the storage medium, as well as with other hardware and software in the physical device.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present application can be implemented by software plus a necessary general-purpose hardware platform, or by hardware: the rendering texture of the game scene is transformed from linear space to gamma space, alpha-blended with the interface rendering map in gamma space, and then uniformly transformed back to linear space. Game developers no longer need to manually adjust the chart space or manually perform image correction, which improves charting efficiency, solves the problem of color difference after maps from image processing software are imported into the game development engine, and improves the expressive power of the game.
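The flow summarized above can be sketched end to end in a few lines. The following NumPy sketch is illustrative only, again assuming a simple power-law gamma of 2.2; the function and variable names are the sketch's own, not the application's:

```python
import numpy as np

GAMMA = 2.2  # assumed power-law approximation of the gamma curve

def linear_to_gamma(tex):
    return np.clip(tex, 0.0, 1.0) ** (1.0 / GAMMA)

def gamma_to_linear(tex):
    return np.clip(tex, 0.0, 1.0) ** GAMMA

def compose_scene(scene_linear, ui_rgb, ui_alpha):
    """Blend a gamma-space UI map over a linear-space scene texture.

    scene_linear: first scene rendering texture, in linear space
    ui_rgb, ui_alpha: interface rendering map, authored in gamma space
    Returns the target scene rendering texture, back in linear space.
    """
    second = linear_to_gamma(scene_linear)                 # linear -> gamma
    third = ui_rgb * ui_alpha + second * (1.0 - ui_alpha)  # alpha blend in gamma space
    return gamma_to_linear(third)                          # gamma -> linear

scene = np.array([0.25, 0.50, 0.75])   # linear-space scene pixel
ui = np.array([0.80, 0.80, 0.80])      # gamma-space UI pixel
out = compose_scene(scene, ui, ui_alpha=0.5)
```

With ui_alpha set to 0 the round trip returns the scene pixel unchanged, and with ui_alpha set to 1 the output is exactly the UI color decoded into linear space; blending in gamma space thus matches what the art tool produced there.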
Those skilled in the art will appreciate that the figures are merely schematic diagrams of one preferred implementation scenario, and that the blocks or flows in the figures are not necessarily required to practice the present application. Those skilled in the art will also appreciate that the modules in the devices of the implementation scenario may be distributed among the devices of that scenario as described, or may, with corresponding changes, be located in one or more devices different from those of the present implementation scenario. The modules of the implementation scenario may be combined into one module, or further split into a plurality of sub-modules.
The above serial numbers are for description only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure describes only a few specific implementation scenarios of the present application; however, the present application is not limited thereto, and any variation that can be conceived by those skilled in the art shall fall within the protection scope of the present application.

Claims (12)

1. An image processing method, comprising:
acquiring an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
converting the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture, and performing alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture; and
converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
2. The method of claim 1, wherein prior to obtaining the interface rendering map and the first scene rendering texture for the target game scene, the method further comprises:
cutting an initial interface rendering map to obtain a plurality of interface rendering pixels; and
storing related information of the plurality of interface rendering pixels in a cutting table.
3. The method according to claim 2, wherein the obtaining of the interface rendering map of the target game scene specifically comprises:
importing the interface rendering pixels and the cutting table into a game engine, and splicing, by the game engine, the interface rendering pixels into the interface rendering map based on the related information in the cutting table.
4. The method of claim 3, wherein before performing alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture, the method further comprises:
adding a UI control to the interface rendering map based on the related information of the interface rendering pixels.
5. The method of claim 3, further comprising:
adding a camera stack in the game engine, wherein the camera stack sequentially comprises a first camera, a UI camera and a second camera;
setting the attribute of each of the first camera and the second camera to be an overlay camera.
6. The method according to claim 5, wherein the converting the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture, and performing alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture, comprises:
converting, by the first camera, the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture; and performing, by the UI camera, alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture.
7. The method according to claim 6, wherein the converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture comprises:
converting, by the second camera, the third scene rendering texture from the gamma space to the linear space, resulting in a target scene rendering texture.
8. The method according to any one of claims 1 to 7, further comprising:
obtaining font information of the target game scene, wherein the font information comprises a character transparency parameter; and
correcting the character transparency parameter according to a preset transparency correction equation, wherein the preset transparency correction equation is obtained by fitting sample transparency initial parameters and sample transparency correction parameters.
9. The method of any one of claims 1 to 7, wherein after the converting the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture, the method further comprises:
performing rendering according to the target scene rendering texture and the source color correction parameters corresponding to the target scene rendering texture, to obtain a scene graph of the target game scene.
10. An image processing apparatus characterized by comprising:
an image acquisition module, configured to acquire an interface rendering map and a first scene rendering texture of a target game scene, wherein the interface rendering map is generated in a gamma space, and the first scene rendering texture is applied in a linear space;
an image correction module, configured to convert the first scene rendering texture from the linear space to the gamma space to generate a second scene rendering texture, and to perform alpha blending on the second scene rendering texture and the interface rendering map to obtain a third scene rendering texture; and
a texture generation module, configured to convert the third scene rendering texture from the gamma space to the linear space to obtain a target scene rendering texture.
11. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any of claims 1 to 9.
12. A computer device comprising a storage medium, a processor and a computer program stored on the storage medium and executable on the processor, characterized in that the processor implements the method of any one of claims 1 to 9 when executing the computer program.
CN202111675781.0A 2021-12-31 2021-12-31 Image processing method and device, storage medium and computer equipment Pending CN114307143A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111675781.0A CN114307143A (en) 2021-12-31 2021-12-31 Image processing method and device, storage medium and computer equipment


Publications (1)

Publication Number Publication Date
CN114307143A true CN114307143A (en) 2022-04-12

Family

ID=81022228




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination