WO2023165198A1 - Image rendering method, apparatus, electronic device, computer-readable storage medium and computer program product - Google Patents
Image rendering method, apparatus, electronic device, computer-readable storage medium and computer program product
- Publication number
- WO2023165198A1 (PCT/CN2022/136193)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- rendering
- texture data
- image
- virtual object
- texture
- Prior art date
Classifications
- G06T15/205—Image-based rendering
- G06T15/04—Texture mapping
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T7/41—Analysis of texture based on statistical description of texture
- G06T7/90—Determination of colour characteristics
- G06T9/00—Image coding
- G06V10/54—Extraction of image or video features relating to texture
- G06V10/56—Extraction of image or video features relating to colour
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/761—Proximity, similarity or dissimilarity measures
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
Definitions
- the present application relates to computer graphics and image technology, and in particular to an image rendering method, device, electronic equipment, computer-readable storage medium and computer program product.
- the display technology based on graphics processing hardware expands the channels for perceiving the environment and obtaining information. In particular, the display technology for virtual scenes can realize diversified interactions between virtual objects controlled by users or by artificial intelligence according to actual application requirements, and has various typical application scenarios; for example, in virtual scenes such as games, it can simulate a real battle process between virtual objects.
- high dynamic range texture resources are usually compressed to reduce the bandwidth consumption caused by texture resources, but it is difficult to achieve a high-quality rendering effect when rendering based on the compressed texture resources.
- Embodiments of the present application provide an image rendering method, device, electronic equipment, computer-readable storage medium, and computer program product, which can utilize the loss between different rendered images to achieve a higher rendering effect based on texture data that occupies less bandwidth, thereby improving the utilization of rendering resources.
- An embodiment of the present application provides an image rendering method, including:
- acquiring first texture data of a virtual object and conversion parameters corresponding to second texture data of the virtual object, wherein the data volume of the first texture data is smaller than the data volume of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data; performing fitting rendering processing based on the conversion parameters and the first texture data to obtain a fitting rendering image including the virtual object; determining a rendering loss between the fitting rendering image and a standard rendering image, and updating the conversion parameters and the first texture data based on the rendering loss, wherein the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data; and performing a real-time rendering process based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- An embodiment of the present application provides an image rendering device, including:
- An acquisition module configured to acquire first texture data of a virtual object and conversion parameters corresponding to second texture data of the virtual object, wherein the data volume of the first texture data is smaller than the data volume of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data;
- a fitting module configured to perform fitting rendering processing based on the conversion parameters and the first texture data, to obtain a fitting rendering image including the virtual object;
- a loss module configured to determine a rendering loss between the fitting rendering image and a standard rendering image, and update the conversion parameters and the first texture data based on the rendering loss, wherein the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data;
- the rendering module is configured to perform real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- An embodiment of the present application provides an electronic device, including:
- memory for storing computer-executable instructions
- the processor is configured to implement the image rendering method provided in the embodiment of the present application when executing the computer-executable instructions stored in the memory.
- An embodiment of the present application provides a computer-readable storage medium storing computer-executable instructions for implementing the image rendering method provided in the embodiment of the present application when executed by a processor.
- An embodiment of the present application provides a computer program product, including a computer program or a computer-executable instruction.
- When the computer program or computer-executable instruction is executed by a processor, the image rendering method provided in the embodiment of the present application is implemented.
- In the embodiments of the present application, the first texture data and the second texture data are rendered separately, and the first texture data and the conversion parameters involved in rendering based on the first texture data are updated based on the loss between the rendering results. Because the image information range of the second texture data is better than that of the first texture data, and the data volume of the first texture data is smaller than that of the second texture data, when real-time image rendering is performed based on the updated first texture data and conversion parameters, the rendering effect of the second texture data can be achieved while consuming only less storage space and fewer computing resources, thereby effectively improving the utilization of rendering resources.
- FIGS. 1A-1B are schematic diagrams of the architecture of the image rendering system provided by the embodiment of the present application;
- FIG. 2 is a schematic structural diagram of an electronic device for image rendering provided by an embodiment of the present application;
- FIGS. 3A-3C are schematic flowcharts of the image rendering method provided by the embodiment of the present application;
- FIG. 4 is a rendering schematic diagram of an image rendering method provided by an embodiment of the present application;
- FIG. 5 is a schematic diagram of a fitting rendering process of an image rendering method provided in an embodiment of the present application;
- FIGS. 6A-6E are schematic diagrams of dynamic range conversion of the image rendering method provided by the embodiment of the present application;
- FIGS. 7A-7B are rendering schematic diagrams of the image rendering method provided by the embodiment of the present application;
- FIG. 8 is a schematic diagram of a rasterized rendering process of an image rendering method provided in an embodiment of the present application;
- FIG. 9 is a schematic diagram of a machine learning fitting scene of an image rendering method provided in an embodiment of the present application;
- FIG. 10 is a schematic diagram of fitting texture of the image rendering method provided by the embodiment of the present application;
- FIG. 11 is a schematic flowchart of machine learning fitting of the image rendering method provided by the embodiment of the present application;
- FIG. 12 is a schematic diagram of the initialization value of the first texture data of the image rendering method provided by the embodiment of the present application;
- FIG. 13 is a screen space rendering diagram of a virtual object in the image rendering method provided by the embodiment of the present application;
- FIG. 14 is a screen space rendering diagram of a virtual object in the image rendering method provided by the embodiment of the present application;
- FIG. 15 is a screen space rendering diagram of fitting of a virtual object in the image rendering method provided by the embodiment of the present application;
- FIG. 16 is a schematic diagram of the pixel difference value in screen space between the standard rendering result and the fitting result of the image rendering method provided by the embodiment of the present application;
- FIG. 17 is a schematic diagram of the loss effect of the image rendering method provided by the embodiment of the present application;
- FIG. 18 is a schematic diagram of fitting a dynamic range space conversion function of the image rendering method provided by the embodiment of the present application.
- "first/second" is only used to distinguish similar objects and does not represent a specific order of objects. Understandably, where permitted, "first/second" may be interchanged in a specific order or sequence, so that the embodiments of the present application described herein can be implemented in an order other than that illustrated or described herein.
- Client: an application running on a terminal for providing various services, such as a video playback client, a game client, and the like.
- Virtual scene: a virtual game scene displayed (or provided) when the game application runs on the terminal.
- the virtual scene can be a simulation environment of the real world, a semi-simulation and semi-fictional virtual environment, or a purely fictional virtual environment.
- the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the embodiment of the present application does not limit the dimensions of the virtual scene.
- the virtual scene may include sky, land, ocean, etc.
- the land may include environmental elements such as deserts and cities, and the user may control virtual objects to move in the virtual scene.
- Virtual objects: images of various people and objects that can interact in the virtual scene, or movable objects in the virtual scene.
- the movable object may be a virtual character, a virtual animal, an animation character, etc., for example, a character, an animal, etc. displayed in a virtual scene.
- the virtual object may be a virtual avatar representing the user in the virtual scene.
- the virtual scene may include multiple virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
- Image rendering: the process of converting a three-dimensional radiosity (light energy transfer) process into a two-dimensional image. Scenes and objects are expressed in three-dimensional form, which is closer to the real world and is easy to manipulate and transform, while most image display devices are two-dimensional raster displays and dot-matrix printers. Converting from the representation of a three-dimensional scene to a raster and dot-matrix representation is image rendering, that is, rasterization.
- the raster display can be regarded as a pixel matrix, and any image displayed on the raster display is actually a collection of pixels with one or more colors and grayscales.
- Tone mapping: its function is to map high dynamic range (HDR) colors to low dynamic range (LDR) colors so that a display can present them normally; correspondingly, the function of the dynamic range space conversion function (inverse tone mapping) is to map low dynamic range (LDR) colors to high dynamic range (HDR) colors, restoring the colors of the original brightness range.
- High dynamic range (HDR, High-Dynamic Range): compared with ordinary images, high-dynamic-range images can provide a wider dynamic range (for example, the width of the dynamic range is greater than a width threshold) and a higher degree of detail (for example, the degree of detail exceeds a detail threshold). According to low dynamic range (LDR, Low-Dynamic Range) images with different exposure times, the LDR image corresponding to the best detail at each exposure time is used to synthesize the final HDR image, which can better reflect the visual effect perceived by people in a real environment.
- Low dynamic range (LDR, Low-Dynamic Range): when the width of the dynamic range is not greater than the width threshold and the degree of detail does not exceed the detail threshold, it is called low dynamic range, which causes loss of detail in highlights or shadows; dynamic range is measured as the difference in exposure values.
- High dynamic range rendering (HDRR, High Dynamic Range Rendering).
- High dynamic range texture (HDRT, High Dynamic Range Texture); correspondingly, a low dynamic range texture (LDRT) is a texture with a low dynamic range. When the lighting brightness range in a game is relatively large, or the brightness difference between different areas is obvious, the difference in rendering effect between the two is more obvious.
- When implementing the embodiments of the present application, the applicant found that although the combination of HDRR and HDRT can provide a more realistic rendering effect, the rendering cost required by this combination is relatively high. For example, in uncompressed HDRT each channel requires 32 bits, so each pixel (including 3 channels) requires 96 bits, whereas in uncompressed LDRT each channel requires only 8 bits, and the texture can be made even smaller after compression.
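- As an illustrative calculation (not taken from the application): for a hypothetical 1024 × 1024 texture, an uncompressed HDRT at 96 bits per pixel occupies 1024 × 1024 × 96 / 8 bytes ≈ 12 MB, while an uncompressed LDRT at 24 bits per pixel occupies about 3 MB, a 4× reduction before any further texture compression is applied.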
- HDRR and HDRT not only burden a game with a larger package size and memory occupation, but also bring more bandwidth overhead and computation burden during rendering; this cost is felt most acutely in mobile games, where resources are scarcer.
- Therefore, HDRT is compressed and the HDRR calculation is optimized, with the aim of minimizing the package size and bandwidth occupation brought by HDRT, reducing the calculation consumption of HDRR, and approaching as closely as possible the rendering effect provided by the original HDRT and HDRR.
- In the related art, a method of compressing HDRT into LDRT is provided: the difference between LDRT and HDRT in texture space is calculated, and the Levenberg-Marquardt algorithm is then used to obtain the conversion parameters of the tone mapping function that minimize this difference; using these conversion parameters, HDRT can be converted into LDRT for packaging.
- However, the applicant finds that only the texture difference in texture space is considered in the related art; for rendering, the difference between rendering results should be considered, so that the obtained result is more accurate.
- Embodiments of the present application provide an image rendering method, device, electronic equipment, computer-readable storage medium, and computer program product, which can utilize the loss between different rendered images to achieve a higher rendering effect based on texture data that occupies less bandwidth, thereby improving the utilization of rendering resources.
- an exemplary implementation scenario of the image rendering method provided by the embodiment of the present application is first described.
- the virtual objects in the image rendering method provided by the embodiments of the present application can be output entirely by the terminal, or output collaboratively by the terminal and the server.
- the virtual scene can be an environment for game characters to interact; for example, it can be used for game characters to fight in the virtual scene. By controlling the actions of the game characters, the two sides can interact in the virtual scene, so that users can relieve the stress of life during the game.
- FIG. 1A is a schematic diagram of the architecture of an image rendering system provided by an embodiment of the present application, which is suitable for application modes in which the calculations related to the virtual scene 100 depend entirely on the computing power of the graphics processing hardware of the terminal 400, such as stand-alone/offline games, and in which the output of the virtual scene is completed through various types of terminals 400 such as smartphones, tablet computers, and virtual reality/augmented reality devices.
- the type of graphics processing hardware includes a central processing unit (CPU, Central Processing Unit) and a graphics processing unit (GPU, Graphics Processing Unit).
- the terminal 400 calculates the data required for display through the graphics computing hardware, completes the loading, parsing and rendering of the display data, and outputs, on the graphics output hardware, video frames capable of forming visual perception of the virtual scene; for example, a two-dimensional video frame is presented on the display screen of a smartphone, or a three-dimensional display effect is projected on the lenses of augmented reality/virtual reality glasses. In addition, in order to enrich the perception effect, the terminal 400 can also use different hardware to form one or more of auditory perception, tactile perception, motion perception, and taste perception.
- a client 410 (such as a stand-alone game application) is running on the terminal 400, and a virtual scene including role-playing is output during the running of the client 410.
- the virtual scene can be an environment for game characters to interact, for example, plains, streets, valleys, etc. where game characters fight; taking displaying the virtual scene 100 from a first-person perspective as an example, a virtual object 401 is displayed in the virtual scene 100, and the virtual object 401 can be a game character controlled by the user (or player), which will operate in the virtual scene in response to the real user's operation of the buttons (including a joystick button, an attack button, a defense button, etc.).
- the virtual object 401 can also be an artificial intelligence (AI) set in the virtual scene through training; the virtual object 401 can also be a non-player character (NPC) set in the virtual scene interaction; the virtual object 401 can also be an immovable object or a movable object in the virtual scene 100.
- the terminal 400 is a terminal used by game developers. During the development and packaging stage of the game, the terminal 400 acquires the first texture data of the virtual object 401 and the conversion parameters corresponding to the second texture data of the virtual object, wherein the data volume of the first texture data is smaller than the data volume of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data; based on the conversion parameters and the first texture data, a fitting rendering process is performed to obtain a fitting rendering image including the virtual object; a rendering loss between the fitting rendering image and a standard rendering image is determined, and the conversion parameters and the first texture data are updated based on the rendering loss, wherein the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data. The processing up to this point is performed before the game starts.
- Afterwards, the terminal 400 performs real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain the target rendering image including the virtual object, and performs human-computer interaction of the virtual scene based on the target rendering image, such as game confrontation.
- FIG. 1B is a schematic diagram of the architecture of the image rendering system provided by the embodiment of the present application, which is applied to the terminal 400 and the server 200, and is suitable for an application mode in which the calculation of the virtual scene is completed by relying on the computing power of the server 200 and the virtual scene is output at the terminal 400.
- the server 200 calculates the display data related to the virtual scene (such as scene data) and sends it to the terminal 400 through the network 300.
- the terminal 400 relies on graphics computing hardware to complete the loading and parsing of the display data, and relies on graphics output hardware to output the virtual scene to form visual perception; for example, two-dimensional video frames can be presented on the display screen of a smartphone, or video frames achieving a three-dimensional display effect can be projected on the lenses of augmented reality/virtual reality glasses. For perceiving other forms of the virtual scene, it can be understood that the corresponding hardware output of the terminal 400 can be used, such as using a microphone to form auditory perception, using a vibrator to form tactile perception, and so on.
- the terminal 400 runs a client 410 (such as a game application in the online version), and interacts with other users by connecting to the server 200 (such as a game server), and the terminal 400 outputs the virtual scene 100 of the client 410.
- Taking the perspective display of the virtual scene 100 as an example, a virtual object 401 is displayed in the virtual scene 100.
- the virtual object 401 can be a game character controlled by the user (or player), which will operate in the virtual scene in response to the real user's operation of buttons (including a joystick button, an attack button, a defense button, etc.); for example, when the real user moves the joystick to the left, the virtual object will move to the left in the virtual scene, and it can also stay still, jump, and use various functions (such as skills and props). The virtual object 401 can also be an artificial intelligence (AI) set in the virtual scene battle through training; the virtual object 401 can also be a non-player character (NPC) set in the virtual scene interaction; the virtual object 401 can also be an immovable object or a movable object in the virtual scene 100.
- the server 200 acquires the first texture data of the virtual object 401 and the conversion parameters corresponding to the second texture data of the virtual object, wherein the data volume of the first texture data is smaller than the data volume of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data; based on the conversion parameters and the first texture data, a fitting rendering process is performed to obtain a fitting rendering image including the virtual object; the rendering loss between the fitting rendering image and the standard rendering image is determined, and the conversion parameters and the first texture data are updated based on the rendering loss.
- the standard rendering image is a rendering image including virtual objects obtained by performing standard rendering processing based on the second texture data.
- The terminal 400 receives the updated first texture data and the updated conversion parameters sent by the server 200 (which may be encapsulated in the installation package). During game installation and running, the terminal 400 performs real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain the target rendering image including the virtual object, and performs human-computer interaction of the virtual scene based on the target rendering image, such as game confrontation.
- the terminal 400 can implement the image rendering method provided by the embodiment of the present application by running a computer program
- the computer program can be a native program or a software module in the operating system; it can be a native application program (APP, APPlication), that is, a program that needs to be installed in the operating system to run, such as a game APP (that is, the above-mentioned client 410); it can also be a mini program, that is, a program that can be run simply by downloading it into a browser environment; it can also be a game mini program that can be embedded in any APP.
- the above-mentioned computer program can be any form of application program, module or plug-in.
- the terminal 400 installs and runs an application program supporting a virtual scene.
- the application program may be any one of a first-person shooter game (FPS, First-Person Shooting game), a third-person shooter game, a virtual reality application program, a three-dimensional map program, or a multiplayer gun battle survival game.
- the user uses the terminal 400 to operate the virtual objects located in the virtual scene to carry out activities, and such activities include but are not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, throwing, and building virtual buildings. Schematically, the virtual object may be a virtual character, such as a simulated character or an anime character.
- Cloud technology refers to a hosting technology that unifies a series of resources such as hardware, software, and network in a wide area network or a local area network to realize the calculation, storage, processing, and sharing of data.
- Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, and application technology based on cloud computing business models. It can form a resource pool to be used on demand, which is flexible and convenient. Cloud computing technology will become an important support, because the background services of technical network systems require a large amount of computing and storage resources.
- the server 200 in FIG. 1B can be an independent physical server, a server cluster or a distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms.
- the terminal 400 may be a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc., but is not limited thereto.
- the terminal and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in the embodiments of the present application.
- FIG. 2 is a schematic structural diagram of an electronic device for image rendering provided by an embodiment of the present application.
- the terminal 400 shown in FIG. 2 includes: at least one processor 410, a memory 450, at least one network interface 420 and a user interface 430.
- Various components in the terminal 400 are coupled together through a bus system 440 .
- the bus system 440 is used to realize connection and communication among these components.
- the bus system 440 also includes a power bus, a control bus and a status signal bus.
- the various buses are labeled as bus system 440 in FIG. 2 .
- The processor 410 may be an integrated circuit chip with signal processing capability, such as a general-purpose processor, a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
- User interface 430 includes one or more output devices 431 that enable presentation of media content, including one or more speakers and/or one or more visual displays.
- the user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
- Memory 450 may be removable, non-removable or a combination thereof.
- Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like.
- Memory 450 optionally includes one or more storage devices located physically remote from processor 410 .
- Memory 450 includes volatile memory or nonvolatile memory, and may include both volatile and nonvolatile memory.
- the non-volatile memory can be a read-only memory (ROM, Read Only Memory), and the volatile memory can be a random access memory (RAM, Random Access Memory).
- the memory 450 described in the embodiment of the present application is intended to include any suitable type of memory.
- memory 450 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
- Operating system 451 including system programs for processing various basic system services and performing hardware-related tasks, such as framework layer, core library layer, driver layer, etc., for implementing various basic services and processing hardware-based tasks;
- the network communication module 452 is used to reach other electronic devices via one or more (wired or wireless) network interfaces 420.
- Exemplary network interfaces 420 include: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB, Universal Serial Bus), and the like;
- Presentation module 453, for enabling presentation of information via one or more output devices 431 (e.g., display screen, speakers, etc.) associated with the user interface 430 (e.g., a user interface for operating peripherals and displaying content and information);
- the input processing module 454 is configured to detect one or more user inputs or interactions from one or more of the input devices 432 and translate the detected inputs or interactions.
- the image rendering device provided by the embodiment of the present application can be realized by software.
- FIG. 2 shows an image rendering device 455 stored in the memory 450, which can be software in the form of programs and plug-ins, including the following software modules: an acquisition module 4551, a fitting module 4552, a loss module 4553, and a rendering module 4554. These modules are logical, so they can be combined or further divided arbitrarily according to the functions realized. The function of each module will be explained below.
- the image rendering method provided by the embodiment of the present application will be specifically described below with reference to the accompanying drawings.
- the image rendering method provided in the embodiment of the present application may be executed solely by the terminal 400 in FIG. 1A , or may be executed cooperatively by the terminal 400 and the server 200 in FIG. 1B .
- FIG. 3A is a schematic flowchart of an image rendering method provided by an embodiment of the present application, which will be described in conjunction with steps 101 to 104 shown in FIG. 3A .
- The method shown in FIG. 3A can be executed by various forms of computer programs running on the terminal 400, and is not limited to the above-mentioned client 410; it can also be executed by the above-mentioned operating system 451, software modules, and scripts. Therefore, the client should not be regarded as limiting the embodiments of this application.
- In step 101, first texture data of a virtual object and conversion parameters corresponding to second texture data of the virtual object are acquired.
- the conversion parameters refer to the parameters involved in the rendering process using the second texture data.
- the virtual object is in a to-be-rendered state, and the virtual object in the to-be-rendered state is a base object model, where the base object model includes the torso of the virtual object but excludes the display information used to decorate the virtual object; for example, the display information includes makeup (such as lip shape, eye shadow, pupil, iris, blush, etc.) and clothing for the limbs (such as costumes, combat uniforms, etc.).
- the first texture data may be an LDRT, and the second texture data may be an HDRT; the LDRT includes a first color value of each texel, and the HDRT includes a second color value of each texel.
- the data volume of the first texture data is smaller than that of the second texture data; for example, in an uncompressed HDRT each channel requires 32 bits, so each pixel (including 3 channels) requires 96 bits, whereas in an uncompressed LDRT each channel requires 8 bits, so each pixel (including 3 channels) requires 24 bits.
- the image information range of the first texture data is smaller than the image information range of the second texture data; the image information range includes a dynamic range, the dynamic range is the ratio of the highest brightness to the lowest brightness, and the dynamic range of HDRT is higher than that of LDRT.
- the first texture data can be the pixel parameters of each pixel in an initialized texture image, including material parameters and texture parameters; the first texture data can also be obtained through the (n-1)-th iteration in the iterative process, so that the first texture data obtained in the (n-1)-th iteration is used as the input of the n-th iteration, where n is an integer greater than 1, and subsequent steps 102 and 103 need to be performed in each iteration.
- in the first iteration, the first texture data is the pixel parameters of each pixel in the initialized texture image.
- In step 102, based on the conversion parameters and the first texture data, a fitting rendering process is performed to obtain a fitting rendering image including the virtual object.
- That is, the fitting rendering process is performed on the virtual object based on the conversion parameters and the first texture data to obtain the fitting rendering image including the virtual object.
- the obtained first texture data is used as a resource, and the first texture data is fitted and rendered in an unlit material (Unlit) rendering manner to obtain a fitted rendering image including the virtual object.
- FIG. 3B is a schematic flow chart of the image rendering method provided by the embodiment of the present application.
- In step 102, the process of performing fitting rendering based on the conversion parameters and the first texture data to obtain a fitting rendering image including the virtual object may be implemented through steps 1021 to 1022 in FIG. 3B.
- In step 1021, based on the conversion parameters, space conversion processing is performed on the first texture data towards the second texture data to obtain third texture data in the same dynamic range space as the second texture data.
- In some embodiments, the first texture data includes a first color value of each texel, the second texture data includes a second color value of each texel, and the conversion parameters include a first first-order parameter and a second first-order parameter. In this case, performing space conversion processing on the first texture data of the virtual object towards the second texture data to obtain the third texture data in the same dynamic range space as the second texture data can be realized through the following technical solution: perform the following processing for each texel in the first texture data: determine the first product result between the first color value of the texel and the first first-order parameter; determine the first summation result of the first product result and the second first-order parameter; determine the ratio between the first color value of the texel and the first summation result as the third color value of the texel in the same dynamic range space as the second color value; the third color values of the multiple texels constitute the third texture data.
- In some embodiments, the space conversion processing is realized by a first-order high dynamic range space conversion function, and the first-order high dynamic range space conversion function is shown in formula (1):
- c_H = c_L / (p · c_L + q)    (1)
- where p and q are the parameters to be fitted of the first-order tone mapping function, p is the first first-order parameter, q is the second first-order parameter, c_H is the third color value, and c_L is the first color value; different p and q correspond to different curves. The different curves corresponding to different p and q are shown in FIG. 6D, where the abscissa of each curve is the first color value and the ordinate is the third color value.
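- To make the per-texel conversion concrete, the following is a minimal sketch of the first-order conversion described above, written with PyTorch tensors so that gradients can later flow back to both the LDR texture and the parameters p and q. The tensor shapes, the example parameter values, and the function name first_order_inverse_tone_map are illustrative assumptions, not taken from the application.

```python
import torch

def first_order_inverse_tone_map(c_l: torch.Tensor, p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    """Map first (LDR) color values c_l into the HDR dynamic range space.

    Implements c_H = c_L / (p * c_L + q) from formula (1); all inputs are torch
    tensors, so the operation remains differentiable for later backpropagation.
    """
    return c_l / (p * c_l + q)

# Illustrative usage: a 256x256 RGB LDR texture and two scalar parameters to be fitted.
c_l = torch.rand(256, 256, 3, requires_grad=True)    # first texture data (LDRT)
p = torch.tensor(-0.9, requires_grad=True)           # first first-order parameter
q = torch.tensor(1.0, requires_grad=True)            # second first-order parameter
c_h = first_order_inverse_tone_map(c_l, p, q)        # third texture data (HDR space)
```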
- In some embodiments, the first texture data includes a first color value of each texel, the second texture data includes a second color value of each texel, and the conversion parameters include a first second-order parameter, a second second-order parameter, a third second-order parameter, a fourth second-order parameter, and a fifth second-order parameter.
- In this case, step 1021, in which space conversion processing is performed on the first texture data of the virtual object towards the second texture data based on the conversion parameters to obtain the third texture data, can be implemented as follows.
- the space conversion process is realized by a second-order high dynamic range space conversion function
- the second-order high dynamic range space conversion function is shown in formula (2), where p, q, r, s, and t are the parameters to be fitted of the second-order high dynamic range space conversion function: p is the first second-order parameter, q is the second second-order parameter, r is the third second-order parameter, s is the fourth second-order parameter, and t is the fifth second-order parameter; different combinations of p, q, r, s, and t correspond to different curves; c_H is the third color value and c_L is the first color value.
- FIG. 6E shows the different curves corresponding to different combinations of p, q, r, s, and t, where the abscissa of each curve is the first color value and the ordinate is the third color value.
- In step 1022, based on the third texture data, a fitting rendering process is performed to obtain a fitting rendering image including the virtual object.
- In some embodiments, the third texture data includes the third color value of each texel. In step 1022, performing the fitting rendering process based on the third texture data to obtain a fitting rendering image including the virtual object can be achieved through the following technical solution: obtain the two-dimensional texture coordinates of the virtual object; obtain the differentiable rendering framework corresponding to the fitting rendering process; forward-propagate the two-dimensional texture coordinates and the third color value of each texel in the differentiable rendering framework to obtain a fitting rendering image including the virtual object.
- the differentiable rendering framework is obtained by software-encapsulating the hardware-based rendering process.
- the hardware-based rendering process can be a rendering method using unlit materials; owing to the software-based encapsulation, the rendering process is differentiable, so backpropagation can subsequently be performed based on gradients.
- the two-dimensional texture coordinates of the virtual object are acquired, the two-dimensional texture coordinates are predetermined or automatically generated, the two-dimensional texture coordinates may be UV2 coordinates, and all image files are two-dimensional planes.
- the horizontal direction is U, and the vertical direction is V. Any pixel on the image can be located through the two-dimensional plane, which defines the position information of each point on the image.
- the function of the UV coordinates is to accurately map each point on the image to the surface of the model object. The differentiable rendering framework corresponding to the fitting rendering process is obtained, and the two-dimensional texture coordinates and the third color value of each texel are forward-propagated in the differentiable rendering framework to obtain a fitting rendering image including the virtual object.
- the data participating in the forward propagation also includes rendering resources such as materials and lighting.
- FIG. 8 is a schematic diagram of the rasterized rendering process provided by the embodiment of this application. The entire transformed rasterized rendering process is implemented through the Compute Unified Device Architecture (CUDA) and runs as a module of a machine learning library (such as PyTorch), where the CUDA implementation is the above-mentioned differentiable rendering framework; the entire rendering process can backpropagate gradients, so the entire process is differentiable.
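- As an illustration of how the fitting rendering can be driven from a machine learning library, the sketch below wraps the differentiable rasterization step as a black box. The helper rasterize_uv is a hypothetical stand-in for the CUDA-based differentiable rendering framework mentioned above (it is assumed to return per-pixel UV2 coordinates and a coverage mask for the virtual object in screen space); it is not the application's actual interface.

```python
import torch

def fit_render(uv2, c_l, p, q, rasterize_uv):
    """Forward pass of the fitting rendering (unlit material), kept differentiable.

    uv2:          per-vertex two-dimensional texture coordinates of the virtual object
    c_l:          first texture data (LDRT), shape (H_tex, W_tex, 3)
    p, q:         first-order conversion parameters
    rasterize_uv: hypothetical differentiable rasterizer returning per-pixel UV
                  coordinates of shape (H, W, 2) and a mask of shape (H, W, 1)
    """
    pixel_uv, mask = rasterize_uv(uv2)            # rasterize the mesh into screen space
    c_h = c_l / (p * c_l + q)                     # third texture data, formula (1)
    grid = (pixel_uv * 2.0 - 1.0).unsqueeze(0)    # grid_sample expects coordinates in [-1, 1]
    tex = c_h.permute(2, 0, 1).unsqueeze(0)       # (1, 3, H_tex, W_tex)
    sampled = torch.nn.functional.grid_sample(tex, grid, align_corners=False)
    return sampled.squeeze(0).permute(1, 2, 0) * mask   # fitting rendering image (H, W, 3)
```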
- In step 103, the rendering loss between the fitting rendering image and the standard rendering image is determined, and the conversion parameters and the first texture data are updated based on the rendering loss.
- the orientation of the virtual object in the standard rendering image is the same as the orientation set during the fitting rendering in the iterative process
- the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data.
- the standard rendering processing may be physically based rendering processing. Specifically, based on the second texture data of the virtual object, the physically based rendering processing corresponding to the virtual object is performed to obtain a standard rendering image.
- the second texture data of the virtual object is loaded through the rendering technology based on physical laws (PBR, Physically Based Rendering), and the second texture data is rendered based on the physical laws to obtain a standard rendering image conforming to the physical laws.
- FIG. 3C is a schematic flow chart of the image rendering method provided by the embodiment of the present application.
- In step 103, the determination of the rendering loss between the fitting rendering image and the standard rendering image can be implemented through steps 1031 to 1032.
- In step 1031, the overall pixel value difference between the standard rendering image and the fitting rendering image in screen space is determined.
- In some embodiments, the determination of the overall pixel value difference between the standard rendering image and the fitting rendering image in screen space in step 1031 can be achieved through the following technical solution: the following processing is performed for each identical pixel of the fitting rendering image and the standard rendering image in screen space: determine the first pixel value of the corresponding pixel in the fitting rendering image, and determine the second pixel value of the corresponding pixel in the standard rendering image; the absolute value of the difference between the first pixel value and the second pixel value is used as the pixel value difference of the corresponding pixel; the pixel value differences corresponding to the multiple pixels are summed to obtain the overall pixel value difference. Taking each pixel as the minimum unit for measuring the difference can effectively increase the value of the rendering loss, so that when updating based on the rendering loss, conversion parameters and first texture data with a better rendering effect can be obtained.
- In step 1032, a rendering loss is determined based on the overall pixel value difference, the length of the fitting rendering image, and the width of the fitting rendering image.
- the rendering loss is obtained based on formula (3):
- Loss = (1 / (H × W)) × Σ_(i, j) |Img1(i, j) − Img2(i, j)|    (3)
- where Img1 and Img2 represent the standard rendering image and the fitting rendering image respectively, H and W represent the length and width of Img1 (or Img2) respectively, and (i, j) indicates any pixel of the screen space in the standard rendering image.
- the rendering loss in the embodiment of the present application is not limited to formula (3), and other variant formulas may also be used.
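- The per-pixel loss of formula (3) translates directly into a few tensor operations. The following sketch assumes both renderings are float tensors of shape (H, W, C) covering the same screen space; it is illustrative only.

```python
import torch

def rendering_loss(img_standard: torch.Tensor, img_fit: torch.Tensor) -> torch.Tensor:
    """Sum of absolute per-pixel differences divided by H * W, as in formula (3)."""
    h, w = img_standard.shape[0], img_standard.shape[1]
    return (img_standard - img_fit).abs().sum() / (h * w)
```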
- In some embodiments, updating the conversion parameters and the first texture data based on the rendering loss may be implemented through the following technical solution: perform partial derivative processing on the first texture data based on the rendering loss to obtain the gradient corresponding to the first texture data; perform partial derivative processing on the conversion parameters based on the rendering loss to obtain the gradient corresponding to the conversion parameters; update the first texture data based on the gradient corresponding to the first texture data, and update the conversion parameters based on the gradient corresponding to the conversion parameters.
- the aforementioned updating of the first texture data based on the gradient corresponding to the first texture data can be achieved through the following technical solution: multiply the set learning rate by the gradient corresponding to the first texture data to obtain the data change value of the first texture data; add the data change value of the first texture data to the first texture data to obtain the updated first texture data. The aforementioned updating of the conversion parameters based on the gradient corresponding to the conversion parameters can be realized through the following technical solution: multiply the set learning rate by the gradient corresponding to the conversion parameters to obtain the data change value of the conversion parameters; add the data change value of the conversion parameters to the conversion parameters to obtain the updated conversion parameters.
- the process of updating the first texture data based on the rendering loss is similar to the backpropagation process of machine learning.
- the first texture data is input to the differentiable rasterization renderer, and the forward rendering process of the differentiable rasterization renderer outputs a fitting rendering image. Since the output result (the fitting rendering image) of the differentiable rasterization renderer has an error (the rendering loss) with respect to the standard rendering image, the error between the output result and the standard rendering image is calculated, and backpropagation is performed based on the error.
- the first texture data and the conversion parameters are adjusted according to the error. For example, the partial derivative of the rendering loss with respect to the first texture data is obtained to generate the gradient of the rendering loss with respect to the first texture data; since the direction of the gradient indicates the direction in which the error grows, the gradient descent algorithm is used to update the first texture data. Likewise, the partial derivative of the rendering loss with respect to the conversion parameters is obtained to generate the gradient of the rendering loss with respect to the conversion parameters; since the direction of the gradient indicates the direction in which the error grows, the gradient descent algorithm is used to update the conversion parameters. The above process is iterated continuously until the iteration end condition is met.
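- A minimal sketch of the update loop described above, using plain gradient descent on the first texture data and the conversion parameters. The learning rate, iteration budget, loss threshold, and the helpers fit_render and rendering_loss (from the sketches earlier in this description) are illustrative assumptions rather than values prescribed by the application.

```python
import torch

def fit(c_l, p, q, uv2, rasterize_uv, standard_image,
        lr=0.01, max_iters=2000, loss_threshold=1e-3):
    """Iteratively update the first texture data and conversion parameters by backpropagation."""
    params = [c_l, p, q]                                        # tensors created with requires_grad=True
    for _ in range(max_iters):
        fit_image = fit_render(uv2, c_l, p, q, rasterize_uv)    # forward fitting rendering
        loss = rendering_loss(standard_image, fit_image)        # rendering loss, formula (3)
        if loss.item() < loss_threshold:                        # iteration end condition
            break
        loss.backward()                                         # backpropagate gradients
        with torch.no_grad():
            for t in params:
                t -= lr * t.grad                                # gradient descent step
                t.grad.zero_()
    return c_l, p, q
```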
- In step 104, real-time rendering is performed based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- In some embodiments, in step 104, the real-time rendering process performed based on the updated conversion parameters and the updated first texture data may be implemented through the following technical solutions: when the rendering loss is less than a loss threshold, performing real-time rendering processing based on the updated conversion parameters and the updated first texture data; or, when the number of updates reaches an update count threshold, performing real-time rendering processing based on the updated conversion parameters and the updated first texture data.
- the iteration end condition includes at least one of the following: the value of the rendering loss function is less than a loss threshold; the number of iterations reaches a set number.
- the updated conversion parameters and the updated first texture data are used as the final saved rendering basic resources for rendering processing.
- Setting the loss threshold defines the degree of similarity between the target rendering image and the standard rendering image: the smaller the loss threshold, the more similar the target rendering image is to the standard rendering image. When the loss threshold is not used, setting the number of iterations defines the degree of similarity between the target rendering image and the standard rendering image: the larger the set number of iterations, the more similar the target rendering image is to the standard rendering image. Judging whether real-time rendering can be performed based on the currently updated first texture data and conversion parameters by means of the loss threshold or the update count can improve fitting efficiency and prevent waste of fitting resources.
- In step 104, performing real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain the target rendering image including the virtual object can be achieved through the following technical solution: based on the updated conversion parameters, perform space conversion processing on the updated first texture data toward the second texture data to obtain fourth texture data in the same dynamic range space as the second texture data; determine at least one two-dimensional texture coordinate of the virtual object; for each two-dimensional texture coordinate, perform the following processing: sample the texture image corresponding to the two-dimensional texture coordinate from the fourth texture data, and perform pasting processing on the sampled texture image; and based on the pasting results of the two-dimensional texture coordinates, generate a target rendering image including the virtual object.
- The two-dimensional texture coordinates are predetermined or automatically generated; they can be UV2 coordinates, and the UV2 coordinates define the position information of each point on the texture image. These points are correlated with the model to determine the position of the surface texture map; that is, the UV coordinates accurately map each point on the image to the surface of the model object.
- The fourth texture data is sampled through UV2, and the texture map of the virtual object formed by the sampled data is pasted onto the base model of the virtual object, so as to realize the real-time rendering processing.
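- A minimal sketch of this real-time stage is given below; it assumes an inverse-Reinhard-style conversion as a stand-in for the fitted high dynamic range space conversion function and uses bilinear texture sampling driven by the UV2 coordinates (function and variable names are illustrative, not taken from the patent).

```python
import torch
import torch.nn.functional as F

def to_hdr(ldr_texture, conv_params):
    # Stand-in conversion toward the second texture data's dynamic range space
    # ("fourth texture data"); the fitted conversion function may take another form.
    p, q = conv_params
    return q * ldr_texture / (p - ldr_texture).clamp(min=1e-6)

def render_with_uv2(ldr_texture, conv_params, uv2):
    # ldr_texture: (H, W, 3) updated first texture data in [0, 1]
    # uv2:         (N, 2) two-dimensional texture coordinates of the virtual object in [0, 1]
    hdr_texture = to_hdr(ldr_texture, conv_params)          # "fourth texture data"
    tex = hdr_texture.permute(2, 0, 1).unsqueeze(0)         # (1, 3, H, W)
    grid = (uv2 * 2.0 - 1.0).view(1, 1, -1, 2)              # grid_sample expects coordinates in [-1, 1]
    sampled = F.grid_sample(tex, grid, mode='bilinear', align_corners=False)
    return sampled.reshape(3, -1).t()                       # (N, 3) colours pasted onto the base model

colors = render_with_uv2(torch.rand(256, 256, 3), torch.tensor([1.1, 1.0]), torch.rand(1024, 2))
```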
- The first texture data and the second texture data are respectively rendered, and the first texture data and the conversion parameters involved in rendering based on the first texture data are updated based on the loss between the rendering results. Because the image information range of the second texture data is wider than that of the first texture data, while the data volume of the first texture data is smaller than that of the second texture data, performing real-time image rendering based on the updated first texture data and conversion parameters can achieve the rendering effect of the second texture data while consuming less storage space and computing resources, thereby effectively improving the utilization rate of rendering resources.
- The server obtains the first texture data of the virtual object and the conversion parameters corresponding to the second texture data of the virtual object, wherein the data amount of the first texture data is smaller than the data amount of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data; based on the conversion parameters and the first texture data, it performs fitting rendering processing to obtain a fitting rendering image including the virtual object; it determines the rendering loss between the fitting rendering image and the standard rendering image, and updates the conversion parameters and the first texture data based on the rendering loss.
- The standard rendering image is the rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data. The processing up to this point is executed before the game starts, for example, in the development and packaging stage of the game. The terminal receives the updated first texture data and the updated conversion parameters sent by the server (which can be encapsulated in the installation package); during game installation and running, the terminal uses the updated conversion parameters and the updated first texture data to perform real-time rendering processing on the virtual object to obtain a target rendering image including the virtual object, and performs human-computer interaction in the virtual scene based on the target rendering image, such as game confrontation. It should be noted that the embodiment of this application is applicable to most mobile game projects, especially mobile game projects that require high rendering performance and rendering effects; this solution can greatly improve rendering efficiency while keeping the difference from the original effect very small.
- Fig. 4 is a rendering schematic diagram provided by the embodiment of the present application.
- The in-game rendering effect of the embodiment of the present application only needs one texture and the corresponding UV2 vertex data, which can be regarded as pasting the fitted texture 401 onto the model 402 (that is, the virtual object to be rendered) using the second texture coordinates (UV2).
- Fig. 5 is a schematic flow chart of the fitting rendering provided by the embodiment of the present application. As shown in Fig. 5, the rendering process is a very complicated function f: its input x is a parameter set (the parameters to be fitted), including data such as model vertex positions, material parameters, and texture parameters, and its output y = f(x) is the rendering result.
- the embodiment of this application will not optimize data such as vertex positions. Only the first texture data and conversion parameters in x need to be optimized.
- The goal is, given the HDRR rendering result, to calculate x such that f(x) is as close as possible to the HDRR rendering result; the degree of approximation is measured using a loss function (Loss). The partial derivative of the fitting rendering image with respect to the parameters to be fitted is ∂y/∂x, the partial derivative of the loss function with respect to the fitting rendering image is ∂Loss/∂y, and by the chain rule the partial derivative of the loss function with respect to the parameters to be fitted is ∂Loss/∂x = (∂Loss/∂y)·(∂y/∂x).
- the optimization algorithm uses the gradient descent algorithm.
- f is a very complicated function, and the normal rendering process is not differentiable. Therefore, in order to use the gradient descent algorithm to find the optimal x, f needs to be differentiable.
- The embodiment of this application provides a differentiable rasterization rendering framework. It should be noted that f is a complete rasterization rendering process; in order to make it differentiable, the embodiment of the present application improves the rasterization rendering: the rendering pipeline is implemented with CUDA (Compute Unified Device Architecture) and runs as a module of a machine learning library (such as PyTorch), so that the gradient can backpropagate through the entire rendering process and the entire process is differentiable.
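- The skeleton below shows the general pattern of exposing such a rasterization step to PyTorch as a custom autograd operation; the CUDA bindings `rasterize_cuda_forward` and `rasterize_cuda_backward` are hypothetical placeholders for the kernels that a real differentiable rasterization renderer would provide.

```python
import torch

class DifferentiableRasterize(torch.autograd.Function):
    """Sketch only: wraps hypothetical CUDA kernels so that the rendered image
    participates in the PyTorch autograd graph and gradients can flow back
    into the texture data and the conversion parameters."""

    @staticmethod
    def forward(ctx, texture, uv, params):
        ctx.save_for_backward(texture, uv, params)
        return rasterize_cuda_forward(texture, uv, params)      # hypothetical binding

    @staticmethod
    def backward(ctx, grad_image):
        texture, uv, params = ctx.saved_tensors
        grad_texture, grad_params = rasterize_cuda_backward(    # hypothetical binding
            grad_image, texture, uv, params)
        # One gradient per forward input; the UV coordinates are not optimised here.
        return grad_texture, None, grad_params
```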
- The finally stored texture data is LDRT, whose storage range is data within [0, 1]. During rendering it is converted to the high dynamic range space, and rendering is then performed based on the converted data; the data range of the high dynamic range space is wider and contains more data that the human visual system can perceive. The function used in this conversion process is the high dynamic range space conversion function, which can be regarded as the inverse process of tone mapping.
- The tone mapping function Reinhard can be referred to in formula (4) (the Reinhard operator maps an HDR color value to an LDR color value as c_L = c_H / (1 + c_H)); its function curve is shown as the dotted curve in Fig. 6A, where c_H is the color value of the high dynamic range space and c_L is the color value of the low dynamic range space. Its corresponding high dynamic range space conversion function (Inverse Reinhard) can be referred to in formula (5) (c_H = c_L / (1 - c_L)); its function curve is shown as the dashed curve in Fig. 6B.
- The tone mapping function Aces can likewise be referred to in formula (6), and its function curve is shown as the solid curve in Fig. 6A, where c_H is the original HDR color value and c_L is the LDR color value; its corresponding high dynamic range space conversion function (Inverse Aces) can be referred to in formula (7), and its function curve is shown as the solid curve in Fig. 6B.
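- For reference, the widely used Reinhard pair takes the closed forms sketched below, consistent with the curves described above; the Aces pair of formulas (6) and (7) follows the same tone-map/inverse pattern but is not reproduced here.

```python
import torch

def reinhard(c_h):
    # Tone mapping: HDR colour value c_H -> LDR colour value c_L in [0, 1).
    return c_h / (1.0 + c_h)

def inverse_reinhard(c_l, eps=1e-6):
    # High dynamic range space conversion (inverse tone mapping): c_L -> c_H.
    return c_l / (1.0 - c_l).clamp(min=eps)

hdr = torch.tensor([0.25, 1.0, 4.0, 16.0])
ldr = reinhard(hdr)                                           # tensor([0.2000, 0.5000, 0.8000, 0.9412])
print(torch.allclose(inverse_reinhard(ldr), hdr, atol=1e-3))  # True: the round trip recovers the HDR values
```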
- the LDRT needs to be fitted for different rendering effects of different HDRRs, so that the rendering effect of the HDRR is approached after rendering the unlit material based on the LDRT.
- their brightness ranges are actually different.
- The parameters of a single high dynamic range space conversion function are not the optimal solution: whether it is the conversion function corresponding to Reinhard or the one corresponding to Aces, its parameters are fixed, and fixed parameters cannot adapt well to the various rendering brightness ranges.
- Figures 7A-7B are rendering diagrams of the image rendering method provided by the embodiment of the present application. The color brightness range of the virtual object in Figure 7A is concentrated in the dark part, while the color brightness range of the virtual object in Figure 7B is concentrated in the bright part. Ideally, the curves of their high dynamic range space conversion functions should therefore be as shown in Figure 6C. The virtual object in Figure 7A should use the dashed curve in Figure 6C, that is, allocate more of the texture data range to the darker color space (the dashed curve assigns the texture data range [0-0.9] to the brightness interval [0-0.2]) and less of the texture data range to the brighter color space (the dashed curve assigns the texture data range [0.9-1.0] to the brightness interval [0.2-1.6]), controlling the maximum brightness to be around 1.6.
- The virtual object in Figure 7B should use the solid curve in Figure 6C, that is, allocate more of the texture data range to the brighter color space (the solid curve assigns the texture data range [0.44-1.0] to the brightness interval [0.2-4.34]) and less of the texture data range to the darker color space (the solid curve assigns the texture data range [0-0.44] to the brightness interval [0-0.2]), controlling the maximum brightness to be around 4.34.
- This distribution of data allows for a higher accuracy range and best fit in the brighter areas of the rendered result.
- The first-order tone mapping function is shown in equation (8), where a, b, c, and d are the parameters of the first-order tone mapping function, c_H is the original HDR color value, and c_L is the LDR color value.
- The first-order high dynamic range space conversion function corresponding to the first-order tone mapping function is shown in formula (9), where e, f, g, and h are the parameters of the first-order high dynamic range space conversion function, c_H is the original HDR color value, and c_L is the LDR color value.
- p and q are the parameters that need to be fitted in the first-order high dynamic range space conversion function; different p and q correspond to different curves, c_H is the original HDR color value, and c_L is the LDR color value. Figure 6D shows the curves corresponding to different p and q; the abscissa is the original HDR color value, and the ordinate is the LDR color value.
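- A sketch of such a fitted conversion is given below. The rational form used here is an illustrative stand-in (with p = q = 1 it reduces to the inverse Reinhard curve) rather than the patent's exact first-order conversion function, but it shows how p and q can be exposed as fitted parameters.

```python
import torch
from torch import nn

class FirstOrderHDRConversion(nn.Module):
    """Illustrative first-order HDR-space conversion with two fitted parameters."""

    def __init__(self, p=1.1, q=1.0):
        super().__init__()
        self.p = nn.Parameter(torch.tensor(float(p)))
        self.q = nn.Parameter(torch.tensor(float(q)))

    def forward(self, c_l):
        # c_l: LDR colour values in [0, 1]; returns HDR colour values.
        # Different p, q reshape the curve and cap the maximum recovered brightness.
        return self.q * c_l / (self.p - c_l).clamp(min=1e-6)

convert = FirstOrderHDRConversion()
hdr = convert(torch.rand(64, 64, 3))   # p and q receive gradients when the rendering loss is backpropagated
```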
- The second-order tone mapping function is shown in equation (11), where a, b, c, d, e, and f are the parameters of the second-order tone mapping function, c_H is the original HDR color value, and c_L is the LDR color value.
- p, q, r, s, t, u, and v are the parameters of the second-order high dynamic range space conversion function, c_H is the original HDR color value, and c_L is the LDR color value.
- p, q, r, s, and t are the parameters that need to be fitted in the second-order high dynamic range space conversion function; different p, q, r, s, and t correspond to different curves, c_H is the original HDR color value, and c_L is the LDR color value. Figure 6E shows the curves corresponding to different parameter values; the abscissa is the original HDR color value, and the ordinate is the LDR color value.
- The second-order high dynamic range space conversion function can produce more curve shapes than the first-order one and supports more ways of compressing the brightness range, while the calculation amount of the first-order high dynamic range space conversion function is smaller than that of the second-order one.
- The input part includes: 1) the original virtual object rendered by PBR, including a model 901, a material, and lighting 902 (which may include directional lights, point lights, area light sources, etc.); during the fitting process, the camera remains still.
- The output part includes: 1) the automatically generated model UV2, which is used to map the texture space to the model rendering result; 2) a texture 1001 unwrapped based on UV2, as shown in Figure 10; when the fitting is completed, the data stored in this texture is the baked LDRT data, and the pixel values of this texture are among the parameters to be fitted in the following description; 3) the parameters of the optimal high dynamic range space conversion function (inverse tone mapping) obtained by fitting.
- In step 801, the scene and various parameters are initialized.
- In step 802, the orientation of the character is set; specifically, the orientation of the virtual object (that is, the model) is set during the machine-learning fitting process.
- In actual game running, the virtual object will not always keep the same orientation, so it is necessary to fit the rendering effect of each orientation of the character, so as to ensure that the fitting effect of the virtual object in each state is relatively accurate.
- As shown in FIG. 13, there are three different orientations of the virtual object 1301, and each iteration randomly sets the orientation of the virtual object.
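- A minimal sketch of this per-iteration orientation change is given below; the axis choice and angle range are illustrative assumptions, not taken from the patent.

```python
import math
import random
import torch

def random_orientation():
    # Pick a random yaw each iteration so that every facing of the virtual object
    # contributes to the fit.
    yaw = random.uniform(0.0, 2.0 * math.pi)
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotation about the vertical axis, applied to the model's vertex positions.
    return torch.tensor([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])

vertices = torch.rand(1000, 3)                       # placeholder vertex positions of the model
rotated_vertices = vertices @ random_orientation().T
```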
- In step 803, standard rendering processing is performed to obtain the HDRR rendering result.
- The HDRR rendering result is the standard rendering effect that needs to be fitted. It is necessary to first perform PBR rendering with HDRT to obtain a rendering image of a certain orientation of the virtual object 1401 (i.e., a standard rendering image), as shown in FIG. 14.
- In step 804, fitting rendering processing is performed to obtain a fitting rendering image including the virtual object.
- In step 805, the rendering loss Loss is calculated.
- In the rendering loss, Img1 and Img2 represent the standard rendering image and the fitting rendering image respectively, H and W represent the length and width of Img1 (or Img2) respectively, and (i, j) indicates any pixel of the screen space in the standard rendering image.
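- Consistent with that description, one natural form of the loss is the sum of absolute per-pixel differences normalised by the image size H x W; the sketch below assumes this form (the exact normalisation in the patent's omitted formula is not reproduced here).

```python
import torch

def rendering_loss(img_std, img_fit):
    # img_std: standard rendering image Img1; img_fit: fitting rendering image Img2; both (H, W, C).
    h, w = img_std.shape[:2]
    # Sum of per-pixel absolute differences, normalised by the image size H x W.
    return (img_std - img_fit).abs().sum() / (h * w)

loss = rendering_loss(torch.rand(540, 960, 3), torch.rand(540, 960, 3))
```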
- In step 806, it is judged whether the rendering loss is smaller than the threshold; if it is not smaller, go to step 807 and step 808; if it is smaller, go to step 809.
- In step 807, the gradient of the first texture data and the gradient of the conversion parameters are calculated.
- The gradient of the rendering loss Loss with respect to the first texture data and its gradient with respect to the conversion parameters can be calculated through the PyTorch framework and the differentiable rasterization renderer.
- In step 808, the first texture data and the conversion parameters are updated with their corresponding gradients, and steps 802 to 805 continue to be executed.
- After calculating the gradient of the first texture data, the PyTorch optimizer is used to update the first texture data; after calculating the gradient of the conversion parameters, the PyTorch optimizer is used to update the conversion parameters. The process then turns to step 802 to enter the next iteration; by repeating the iterative process of steps 802 to 807, the first texture data and the conversion parameters gradually converge to their optimal values.
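- The loop below sketches how steps 802 to 808 map onto a PyTorch optimiser; the optimiser choice, threshold, iteration budget, and stand-in loss expression are illustrative, and in the real pipeline the loss is produced by the differentiable rasterization renderer described above.

```python
import torch

ldr_texture = torch.rand(512, 512, 3, requires_grad=True)     # first texture data
conv_params = torch.tensor([1.1, 1.0], requires_grad=True)    # conversion parameters
optimizer = torch.optim.Adam([ldr_texture, conv_params], lr=1e-2)

for iteration in range(10000):
    # Steps 802-805: set a random orientation, fit-render, compute the rendering loss.
    loss = (ldr_texture.mean() - 0.5) ** 2 + (conv_params - 1.0).pow(2).sum()  # stand-in loss
    if loss.item() < 1e-3:            # step 806: below threshold -> step 809 (convert formats and output)
        break
    optimizer.zero_grad()
    loss.backward()                   # step 807: gradients of the first texture data and conversion parameters
    optimizer.step()                  # step 808: update both, then start the next iteration at step 802
```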
- In step 809, the formats of the first texture data and the conversion parameters are converted and output.
- A rendering loss smaller than the threshold indicates that the fitting rendering image is very close to the standard rendering image, so the entire iterative process can be exited, the first texture data and the conversion parameters can be saved, and UV2 can be used in the game to sample the texture map and obtain rendering results close to HDRR.
- Step 804 is described in detail below; it includes two sub-steps and addresses how to use the first texture data for unlit rendering.
- In the first sub-step, formula (10) or formula (13) is used to convert the first texture data in LDR space to HDR space.
- In the first iteration, the parameters of the formula are the default initial values; in subsequent iterations, the parameters of the formula (the conversion parameters) are updated based on the gradient from the previous iteration and gradually converge to the optimal parameters.
- In the second sub-step, the UV2 texture coordinates are used to render the sampled HDR-space texture data into world space to obtain the final screen-space rendering result.
- Figure 17 is a schematic diagram of the loss effect of the image rendering method provided by the embodiment of the present application: 1701 is the standard rendering image; 1702 is the target rendering image obtained by rendering with the first texture data output in step 808 and the high dynamic range space conversion function obtained by fitting (the fitted high dynamic range space conversion function is shown in Figure 18, and its conversion parameters are output in step 808); 1703 shows the pixel difference between the target rendering image and the standard rendering image, from which it can be seen that the pixel difference between the two is very small.
- The embodiment of the present application provides an image rendering method that can automatically fit the rendering effect of HDRR: LDRT can be fitted to replace HDRT as the rendering basis, which greatly reduces the amount of texture data and achieves a rendering effect close to HDRR at a much lower overhead, thereby greatly improving the frame rate of the game and reducing power consumption.
- The software modules stored in the image rendering apparatus 455 of the memory 450 may include: an acquisition module 4551, configured to acquire the first texture data of the virtual object and the conversion parameters corresponding to the second texture data of the virtual object, wherein the data volume of the first texture data is smaller than the data volume of the second texture data, and the image information range of the first texture data is smaller than the image information range of the second texture data; a fitting module 4552, configured to perform fitting rendering processing based on the conversion parameters and the first texture data to obtain a fitting rendering image including the virtual object; a loss module 4553, configured to determine the rendering loss between the fitting rendering image and the standard rendering image, and to update the conversion parameters and the first texture data based on the rendering loss, where the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data; and a rendering module 4554, configured to perform real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- the fitting module 4552 is further configured to: based on the conversion parameters, perform space transformation processing on the first texture data of the virtual object toward the second texture data to obtain the third texture data, wherein the third texture data It is in the same dynamic range space as the second texture data; based on the third texture data, a fitting rendering process corresponding to the virtual object is performed to obtain a fitting rendering image including the virtual object.
- the first texture data includes a first color value of each texel
- the second texture data includes a second color value of each texel
- the conversion parameters include a first first-order parameter and a second first-order parameter
- The fitting module 4552 is also configured to: perform the following processing for each texel in the first texture data: determine the first product result between the first color value of the texel and the first first-order parameter; determine the first summation result of the first product result and the second first-order parameter; determine the ratio between the first color value of the texel and the first summation result as the third color value of the texel in the same dynamic range space as the second color value; and compose the third texture data from the third color values of the multiple texels.
- the first texture data includes a first color value of each texel
- the second texture data includes a second color value of each texel
- the conversion parameters include a first second-order parameter, a second second-order parameter , the third second-order parameter, the fourth second-order parameter, and the fifth second-order parameter
- The fitting module 4552 is also configured to: perform the following processing for each texel in the first texture data: determine the second product result between the square of the first color value of the texel and the first second-order parameter, and the third product result between the first color value of the texel and the second second-order parameter; sum the second product result, the third product result, and the third second-order parameter to obtain the second summation result; determine the fourth product result between the first color value of the texel and the fourth second-order parameter; sum the square root of the second summation result, the fourth product result, and the square root of the third second-order parameter to obtain the third summation result; determine the fourth summation result of the first color value of the texel and the fifth second-order parameter; determine the ratio between the third summation result and the fourth summation result as the third color value of the texel in the same dynamic range space as the second color value; and compose the third texture data from the third color values of the multiple texels.
- The third texture data includes the third color value of each texel; the fitting module 4552 is further configured to: obtain the two-dimensional texture coordinates of the virtual object; obtain a differentiable rendering framework corresponding to the fitting rendering processing; and perform forward propagation of the two-dimensional texture coordinates and the third color value of each texel in the differentiable rendering framework to obtain a fitting rendering image including the virtual object.
- the loss module 4553 is further configured to: determine the overall pixel value difference between the standard rendering image and the fitting rendering image in screen space; based on the overall pixel value difference, the length of the fitting rendering image, and the Width, which determines the rendering loss.
- the loss module 4553 is further configured to: perform the following processing on any same pixel in the screen space of the fitting rendering image and the standard rendering image: determine the first pixel value of the corresponding pixel in the fitting rendering image, and Determining the second pixel value of the corresponding pixel in the standard rendering image; using the absolute value of the difference between the first pixel value and the second pixel value as the pixel value difference of the pixel; performing the pixel value difference of multiple pixels in the screen space The summation process is performed to obtain the overall pixel value difference.
- the loss module 4553 is further configured to: perform partial derivative processing on the first texture data based on the rendering loss to obtain the gradient corresponding to the first texture data; perform partial derivative processing on the transformation parameters based on the rendering loss to obtain the corresponding transformation Gradient of the parameter; updating the first texture data based on the gradient corresponding to the first texture data, and updating the transformation parameter based on the gradient corresponding to the transformation parameter.
- The loss module 4553 is further configured to: multiply the set learning rate by the gradient corresponding to the first texture data to obtain the data change value of the first texture data; add the data change value of the first texture data to the first texture data to obtain the updated first texture data; multiply the set learning rate by the gradient corresponding to the conversion parameters to obtain the data change value of the conversion parameters; and add the data change value of the conversion parameters to the conversion parameters to obtain the updated conversion parameters.
- the rendering module 4554 is further configured to: when the rendering loss is less than the loss threshold, perform real-time rendering processing based on the updated conversion parameters and the updated first texture data; or when the number of updates reaches the threshold of update times, Real-time rendering processing is performed based on the updated conversion parameters and the updated first texture data.
- the rendering module 4554 is further configured to: based on the updated conversion parameters, perform space conversion processing on the updated first texture data toward the second texture data to obtain fourth texture data, wherein the fourth The texture data and the second texture data are in the same dynamic space range; determine at least one two-dimensional texture coordinate of the virtual object; perform the following processing for each two-dimensional texture coordinate: sample the fourth texture data and the two-dimensional texture coordinate Corresponding texture images, and fitting processing on the sampled texture images; Based on the fitting results of each two-dimensional texture coordinates, generate a target rendering image including the virtual object.
- An embodiment of the present application provides a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device executes the above-mentioned image rendering method in the embodiment of the present application.
- An embodiment of the present application provides a computer-readable storage medium storing executable instructions; when the executable instructions are executed by a processor, the processor is caused to execute the image rendering method provided in the embodiment of the present application, for example, the image rendering method shown in FIGS. 3A-3C.
- The computer-readable storage medium can be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM, or various devices including one or any combination of the above memories.
- Executable instructions may take the form of programs, software, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Executable instructions may, but do not necessarily, correspond to files in a file system; they may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a Hyper Text Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple cooperating files (for example, files that store one or more modules, subroutines, or sections of code).
- Executable instructions may be deployed to be executed on one electronic device, on multiple electronic devices located at one location, or on multiple electronic devices distributed across multiple locations and interconnected by a communication network.
- In summary, the first texture data and the second texture data are respectively rendered, and the first texture data and the conversion parameters involved in rendering based on the first texture data are updated based on the loss between the rendering results. Since the image information range of the second texture data is wider than that of the first texture data, while the data volume of the first texture data is smaller than that of the second texture data, real-time image rendering based on the updated first texture data and conversion parameters can achieve the rendering effect of the second texture data while consuming less storage space and computing resources, thereby effectively improving the utilization rate of rendering resources.
Claims (15)
- An image rendering method, the method being executed by an electronic device, the method comprising: acquiring first texture data of a virtual object and conversion parameters corresponding to second texture data of the virtual object, wherein a data amount of the first texture data is smaller than a data amount of the second texture data, and an image information range of the first texture data is smaller than an image information range of the second texture data; performing fitting rendering processing based on the conversion parameters and the first texture data to obtain a fitting rendering image including the virtual object; determining a rendering loss between the fitting rendering image and a standard rendering image, and updating the conversion parameters and the first texture data based on the rendering loss, wherein the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data; and performing real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- The method according to claim 1, wherein the performing fitting rendering processing based on the conversion parameters and the first texture data to obtain the fitting rendering image including the virtual object comprises: performing, based on the conversion parameters, space conversion processing on the first texture data of the virtual object toward the second texture data to obtain third texture data, wherein the third texture data and the second texture data are in the same dynamic range space; and performing fitting rendering processing based on the third texture data to obtain the fitting rendering image including the virtual object.
- The method according to claim 2, wherein the first texture data comprises a first color value of each texel, the second texture data comprises a second color value of each texel, and the conversion parameters comprise a first first-order parameter and a second first-order parameter; the performing, based on the conversion parameters, space conversion processing on the first texture data of the virtual object toward the second texture data to obtain the third texture data comprises: performing the following processing for each texel in the first texture data: determining a first product result between the first color value of the texel and the first first-order parameter; determining a first summation result of the first product result and the second first-order parameter; and determining a ratio between the first color value of the texel and the first summation result as a third color value of the texel that is in the same dynamic range space as the second color value; and composing the third texture data from the third color values of a plurality of the texels.
- The method according to claim 2, wherein the first texture data comprises a first color value of each texel, the second texture data comprises a second color value of each texel, and the conversion parameters comprise a first second-order parameter, a second second-order parameter, a third second-order parameter, a fourth second-order parameter, and a fifth second-order parameter; the performing, based on the conversion parameters, space conversion processing on the first texture data of the virtual object toward the second texture data to obtain the third texture data comprises: performing the following processing for each texel in the first texture data: determining a second product result between the square of the first color value of the texel and the first second-order parameter, and a third product result between the first color value of the texel and the second second-order parameter; summing the second product result, the third product result, and the third second-order parameter to obtain a second summation result; determining a fourth product result between the first color value of the texel and the fourth second-order parameter; summing the square root of the second summation result, the fourth product result, and the square root of the third second-order parameter to obtain a third summation result; determining a fourth summation result of the first color value of the texel and the fifth second-order parameter; and determining a ratio between the third summation result and the fourth summation result as a third color value of the texel that is in the same dynamic range space as the second color value; and composing the third texture data from the third color values of a plurality of the texels.
- The method according to claim 2, wherein the third texture data comprises a third color value of each texel; the performing fitting rendering processing corresponding to the virtual object based on the third texture data to obtain the fitting rendering image including the virtual object comprises: acquiring two-dimensional texture coordinates of the virtual object; acquiring a differentiable rendering framework corresponding to the fitting rendering processing; and forward-propagating the two-dimensional texture coordinates and the third color value of each texel in the differentiable rendering framework to obtain the fitting rendering image including the virtual object.
- The method according to claim 1, wherein the determining the rendering loss between the fitting rendering image and the standard rendering image comprises: determining an overall pixel value difference between the standard rendering image and the fitting rendering image in screen space; and determining the rendering loss based on the overall pixel value difference, a length of the fitting rendering image, and a width of the fitting rendering image.
- The method according to claim 6, wherein the determining the overall pixel value difference between the standard rendering image and the fitting rendering image in screen space comprises: performing the following processing for any same pixel of the fitting rendering image and the standard rendering image in the screen space: determining a first pixel value corresponding to the pixel in the fitting rendering image, and determining a second pixel value corresponding to the pixel in the standard rendering image; and taking an absolute value of a difference between the first pixel value and the second pixel value as a pixel value difference of the pixel; and summing the pixel value differences of a plurality of the pixels in the screen space to obtain the overall pixel value difference.
- The method according to claim 1, wherein the updating the conversion parameters and the first texture data based on the rendering loss comprises: performing partial derivative processing on the first texture data based on the rendering loss to obtain a gradient corresponding to the first texture data; performing partial derivative processing on the conversion parameters based on the rendering loss to obtain a gradient corresponding to the conversion parameters; and updating the first texture data based on the gradient corresponding to the first texture data, and updating the conversion parameters based on the gradient corresponding to the conversion parameters.
- The method according to claim 8, wherein the updating the first texture data based on the gradient corresponding to the first texture data comprises: multiplying a set learning rate by the gradient corresponding to the first texture data to obtain a data change value of the first texture data; and adding the data change value of the first texture data to the first texture data to obtain updated first texture data; and the updating the conversion parameters based on the gradient corresponding to the conversion parameters comprises: multiplying the set learning rate by the gradient corresponding to the conversion parameters to obtain a data change value of the conversion parameters; and adding the data change value of the conversion parameters to the conversion parameters to obtain updated conversion parameters.
- The method according to claim 1, wherein the performing real-time rendering processing on the virtual object based on the updated conversion parameters and the updated first texture data comprises: when the rendering loss is less than a loss threshold, performing real-time rendering processing on the virtual object based on the updated conversion parameters and the updated first texture data; or when the number of updates reaches an update count threshold, performing real-time rendering processing on the virtual object based on the updated conversion parameters and the updated first texture data.
- The method according to claim 1, wherein the performing real-time rendering processing on the virtual object based on the updated conversion parameters and the updated first texture data to obtain the target rendering image including the virtual object comprises: performing, based on the updated conversion parameters, space conversion processing on the updated first texture data toward the second texture data to obtain fourth texture data, wherein the fourth texture data and the second texture data are in the same dynamic range space; determining at least one two-dimensional texture coordinate of the virtual object; performing the following processing for each of the two-dimensional texture coordinates: sampling a texture image corresponding to the two-dimensional texture coordinate from the fourth texture data, and performing pasting processing on the sampled texture image; and generating the target rendering image including the virtual object based on the pasting results of the two-dimensional texture coordinates.
- An image rendering apparatus, comprising: an acquisition module configured to acquire first texture data of a virtual object and conversion parameters corresponding to second texture data of the virtual object, wherein a data amount of the first texture data is smaller than a data amount of the second texture data, and an image information range of the first texture data is smaller than an image information range of the second texture data; a fitting module configured to perform fitting rendering processing based on the conversion parameters and the first texture data to obtain a fitting rendering image including the virtual object; a loss module configured to determine a rendering loss between the fitting rendering image and a standard rendering image, and to update the conversion parameters and the first texture data based on the rendering loss, wherein the standard rendering image is a rendering image including the virtual object obtained by performing standard rendering processing based on the second texture data; and a rendering module configured to perform real-time rendering processing based on the updated conversion parameters and the updated first texture data to obtain a target rendering image including the virtual object.
- An electronic device, comprising: a memory configured to store computer-executable instructions; and a processor configured to implement the image rendering method according to any one of claims 1 to 11 when executing the computer-executable instructions stored in the memory.
- A computer-readable storage medium storing computer-executable instructions, wherein the computer-executable instructions, when executed by a processor, implement the image rendering method according to any one of claims 1 to 11.
- A computer program product, comprising a computer program or computer-executable instructions, wherein the computer program or computer-executable instructions, when executed by a processor, implement the image rendering method according to any one of claims 1 to 11.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22929625.6A EP4394715A1 (en) | 2022-03-03 | 2022-12-02 | Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product |
US18/369,721 US20240005588A1 (en) | 2022-03-03 | 2023-09-18 | Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210202833.0 | 2022-03-03 | ||
CN202210202833.0A CN116740256A (zh) | 2022-03-03 | 2022-03-03 | 图像渲染方法、装置、电子设备、存储介质及程序产品 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/369,721 Continuation US20240005588A1 (en) | 2022-03-03 | 2023-09-18 | Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023165198A1 true WO2023165198A1 (zh) | 2023-09-07 |
Family
ID=87882958
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/136193 WO2023165198A1 (zh) | 2022-03-03 | 2022-12-02 | 图像渲染方法、装置、电子设备、计算机可读存储介质及计算机程序产品 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240005588A1 (zh) |
EP (1) | EP4394715A1 (zh) |
CN (1) | CN116740256A (zh) |
WO (1) | WO2023165198A1 (zh) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7932914B1 (en) * | 2005-10-20 | 2011-04-26 | Nvidia Corporation | Storing high dynamic range data in a low dynamic range format |
CN102696220A (zh) * | 2009-10-08 | 2012-09-26 | 国际商业机器公司 | 将数字图像从低动态范围图像转换为高动态范围图像的方法与系统 |
CN106033617A (zh) * | 2015-03-16 | 2016-10-19 | 广州四三九九信息科技有限公司 | 一种结合可视化工具进行游戏图片智能压缩的方法 |
US20200151509A1 (en) * | 2018-11-12 | 2020-05-14 | Adobe Inc. | Learning to estimate high-dynamic range outdoor lighting parameters |
CN113963110A (zh) * | 2021-10-11 | 2022-01-21 | 北京百度网讯科技有限公司 | 纹理图生成方法、装置、电子设备及存储介质 |
CN114067042A (zh) * | 2021-11-08 | 2022-02-18 | 腾讯科技(深圳)有限公司 | 一种图像渲染方法、装置、设备、存储介质及程序产品 |
Also Published As
Publication number | Publication date |
---|---|
CN116740256A (zh) | 2023-09-12 |
US20240005588A1 (en) | 2024-01-04 |
EP4394715A1 (en) | 2024-07-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22929625; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2022929625; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2022929625; Country of ref document: EP; Effective date: 20240329 |
| WWE | Wipo information: entry into national phase | Ref document number: 11202402798P; Country of ref document: SG |
| NENP | Non-entry into the national phase | Ref country code: DE |