CN116993896A - Illumination information processing device, electronic device, and storage medium

Info

Publication number: CN116993896A
Application number: CN202311056697.XA
Authority: CN (China)
Prior art keywords: light source, point light, scene object, current frame, distance
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 王友良, 吴嘉健
Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd, with priority to CN202311056697.XA


Classifications

    • G06T15/506 Illumination models (G06T15/50 Lighting effects, G06T15/00 3D [Three Dimensional] image rendering)
    • G06T15/06 Ray-tracing (G06T15/00 3D [Three Dimensional] image rendering)
    • G06T2210/21 Collision detection, intersection (G06T2210/00 Indexing scheme for image generation or computer graphics)
    • A63F13/525 Changing parameters of virtual cameras (A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene)
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera

Abstract

The application provides an illumination information processing method and apparatus, an electronic device, and a storage medium, relating to the technical field of image rendering. The method generates, in real time, the information of the simulated point light sources corresponding to the scene object of the current frame, so that the illumination effect of the simulated point light sources on the scene object simulates the light reflected onto the scene object by the game environment. In addition, the intensity, position, and color of each simulated point light source are calculated from the reference parameter information and from a series of theoretical principles, such as the intensity being inversely proportional to distance and the light source position being tied to the lens, so that the intensity, position, and color information obtained for the simulated point light sources in each frame is more accurate; the indirect illumination simulated from these simulated point light sources therefore produces a better indirect illumination rendering effect on the scene object.

Description

Illumination information processing device, electronic device, and storage medium
Technical Field
The present application relates to the field of image rendering technologies, and in particular, to an illumination information processing apparatus, an electronic device, and a storage medium.
Background
In order to enhance the realism of the lighting rendering effect of objects in a game, indirect lighting rendering effects are often simulated in addition to direct lighting rendering.
Currently, a common way to simulate indirect lighting is the Lumen global illumination system introduced in the game engine. It adopts a ray tracing algorithm, which simulates the propagation and reflection of light more accurately, and it supports dynamic global illumination, automatically updating illumination information as a character moves, thereby achieving a dynamic indirect lighting effect.
However, this method is computationally expensive and is therefore not suitable for rendering indirect game lighting on all terminal devices.
Disclosure of Invention
The application aims to provide an illumination information processing method and apparatus, an electronic device, and a storage medium that reduce the computation and memory occupation of indirect illumination rendering, providing a general and simple indirect illumination rendering method.
In order to achieve the above purpose, the technical scheme adopted by the embodiment of the application is as follows:
in a first aspect, an embodiment of the present application provides a method for processing illumination information, including:
obtaining reference parameter information, wherein the reference parameter information at least comprises: intensity of the reference point light source, reference distance and reference included angle;
determining the distance between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the intensity of the reference point light source and the reference distance;
determining the position of each simulated point light source corresponding to the scene object of the current frame according to the position information of the scene object of the current frame, the position information of the virtual camera, the reference distance and the reference included angle;
determining the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source;
and performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
In a second aspect, an embodiment of the present application further provides an illumination information processing apparatus, including: the device comprises an acquisition module, a determination module and a rendering module;
the acquisition module is configured to acquire reference parameter information, where the reference parameter information at least includes: intensity of the reference point light source, reference distance and reference included angle;
The determining module is used for determining the distance between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the intensity of the reference point light source and the reference distance;
the determining module is used for determining the positions of the simulated point light sources corresponding to the scene objects of the current frame according to the position information of the scene objects of the current frame, the position information of the virtual camera, the reference distance and the reference included angle;
the determining module is used for determining the colors of the simulated point light sources corresponding to the scene objects of the current frame according to the positions of the simulated point light sources;
and the rendering module is used for performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the illumination information processing method as provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the illumination information processing method as provided in the first aspect.
The beneficial effects of the application are as follows:
The application provides an illumination information processing method and apparatus, an electronic device, and a storage medium. The method generates, in real time, the information of the simulated point light sources corresponding to the scene object of the current frame, so that the illumination effect of the simulated point light sources on the scene object simulates the light reflected onto the scene object by the game environment. In addition, based on the reference parameter information determined in the test stage, the intensity of each simulated point light source is updated on the theoretical basis that the light source intensity is inversely proportional to the distance between the scene object and the virtual camera. The horizontal position of each light source is tied to the player's lens so that its illumination effect always faces the lens at the initial angle, and the position of each simulated point light source is updated so that the included angle between the line from the scene object to the light source and the line from the scene object to the virtual camera stays at a fixed value; in this way, however the scene object moves, the player can observe the environment light reflected by the scene object from any angle. The color of each simulated point light source is updated based on its position, so that the color information of the simulated point light source is more reasonable. The per-frame intensity, position, and color information obtained in this way is more accurate, so the indirect illumination simulated from the obtained simulated point light sources produces a better indirect illumination rendering effect on the scene object; moreover, the method is simple to execute, consumes little computation, and is suitable for a variety of terminal devices.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting its scope; a person of ordinary skill in the art may obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a method for processing illumination information according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a positional relationship according to an embodiment of the present application;
fig. 3 is a flowchart of another illumination information processing method according to an embodiment of the present application;
FIG. 4 is a flowchart of another method for processing illumination information according to an embodiment of the present application;
FIG. 5 is a schematic diagram of another positional relationship according to an embodiment of the present application;
fig. 6 is a schematic flow chart of a method for processing illumination information according to an embodiment of the present application;
fig. 7 is a flowchart of another illumination information processing method according to an embodiment of the present application;
FIG. 8 is a flowchart of another method for processing illumination information according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an illumination information processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions, and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments will be described clearly and completely with reference to the accompanying drawings. It should be understood that the drawings in the present application are for illustration and description only and are not intended to limit the scope of the present application, and that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of a flowchart may be implemented out of order, and that steps without logical dependency may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from a flowchart by those skilled in the art under the direction of the present disclosure.
In addition, the described embodiments are only some, but not all, embodiments of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that the term "comprising" will be used in embodiments of the application to indicate the presence of the features stated hereafter, but not to exclude the addition of other features.
First, some terms that may be referred to in the following embodiments will be described:
1. Unreal Engine: a game engine developed by Epic Games, widely used to develop video games, virtual reality and augmented reality applications, film and television visual effects, and the like.
2. GI: short for Global Illumination, the process by which light, after being reflected, refracted, and scattered in a scene, illuminates other objects again. Colloquially, GI refers to indirect lighting effects in which light rays in a scene not only strike objects directly but also reflect, refract, and scatter between objects. GI can greatly improve the realism and fidelity of a scene because it simulates the reflection and scattering of light in the real world, making shadows, colors, and illumination in the scene more natural.
Lumen: the illusion engine 5 is a generic term for a completely new global illumination system that can implement real-time global illumination and global reflection. Compared with the traditional global illumination technology, the Lumen is more efficient, and the high-quality shadow, reflection and refraction effects can be realized in real time by using the ray tracing technology. The Lumen also supports dynamic global illumination, and illumination information can be automatically updated when characters move, so that more realistic illumination effects are provided in real-time scenes.
4. Ray Tracing: a technique for calculating the propagation of rays in a scene, simulating the interaction of rays with objects to compute each ray's final color and brightness. In computer graphics, ray tracing is widely used to render realistic images and animations.
5. Photon Mapping: an algorithm for computing illumination, often used to compute the global illumination of static scenes. It traces rays from the light source and from the camera separately and, when a termination condition is fulfilled, connects them to generate radiance values in the next step. The method can compute high-quality global illumination effects and is used for realistic simulation of the interaction between light and objects, performing particularly well for diffuse reflection and indirect illumination.
6. Character visual development: the development of the appearance and motion of characters using computer graphics and computer animation techniques. It covers character design, modeling, texture mapping, skeletal binding, animation, and other aspects. Character visual development is applied in games, film, animation, and other fields to create vivid, finely crafted virtual characters and to enhance user immersion and participation. It requires comprehensive knowledge and skills spanning art, animation, and programming, and is one of the important applications of computer graphics and computer animation.
When dynamic objects are lit inside a game engine, the character's lighting environment lacks global illumination (GI) effects. Even in the most advanced UE5 engine, which incorporates the Lumen global illumination system, a large amount of computing resources is required to achieve the GI effect. In projects where resources are tight, however, the game engine's dynamic global illumination is often prebaked in order to reduce the computational resources consumed: illumination information is precomputed and stored before the game runs, for use at play time. But because the precomputed illumination information is static, as characters move their appearance does not change with changes in ambient illumination; that is, dynamic objects have no GI, which makes the scene appear unrealistic.
The most advanced game engine solution at present is the Lumen global illumination system introduced in UE5. It adopts a ray tracing algorithm, which simulates the propagation and reflection of light more accurately and produces a more realistic lighting effect. In addition, Lumen supports dynamic global illumination, automatically updating illumination information when the character moves, thereby achieving a dynamic lighting effect.
Another widely used technique is the light probe. Its principle is to store the illumination information of the surrounding scene in light probes in the form of spherical harmonic coefficients; at runtime, an object reads the light probe closest to it to obtain the illumination information of the surrounding scene, thereby achieving a GI effect. Because scene illumination information is saved in the light probes, this approach lends itself better to GI for dynamic objects than the traditional Lightmap, so commercial engines such as UE4 and Unity use it as a complement to Lightmap, providing GI for dynamic objects in the scene.
However, the current methods all have the following problems:
1. Huge computational consumption: both real-time Lumen computation and light probe precomputation require a large amount of computing resources, so they can essentially be used only in larger games and are difficult to realize on mobile phones.
2. Large storage occupation: the illumination data of the Lightmap and the light probes occupies a large amount of storage space.
Based on the above, the illumination information processing method provided by this scheme configures simulated point light sources for the object to be rendered and uses them as emitting light sources to achieve indirect illumination rendering of that object. The illumination intensity of each simulated point light source is updated according to the principle that illumination intensity is inversely proportional to the distance between the object to be rendered and the lens. The simulated point light source performs a ray collision in a reference direction, and the color at the collision intersection is taken to update the color of the simulated point light source, which simulates the real-time environment reflection on the object to be rendered as it moves. The position of the simulated point light source is updated according to the principle that its horizontal position is tied to the lens, ensuring that the illumination effect always faces the lens at the initial angle and that the player can observe the simulated environment reflected light from any angle. Each frame thus updates, in real time, the simulated point light sources configured for the object to be rendered, and indirect illumination rendering is realized with these simulated point light sources, reducing computation while requiring no pre-stored ambient light information.
Fig. 1 is a schematic flow chart of a method for processing illumination information according to an embodiment of the present application; the method may be executed by a computer device such as a terminal device or a server. As shown in fig. 1, the method may include:
s101, acquiring reference parameter information, wherein the reference parameter information at least comprises: intensity of reference point light source, reference distance and reference angle.
The reference parameter information may differ for different scene objects. The reference parameters give the distance and angle references to be maintained between the scene object and the virtual camera while the game runs in real time, so that the illumination effect of the simulated point light sources on the scene object always faces the lens at the initial angle, ensuring that the player can observe the simulated environment reflected light from any angle.
It should be noted that acquiring the reference parameter information needs to be performed only once during real-time running of the game; the acquired reference parameter information can be reused each time a frame performs indirect illumination rendering on the scene object. That is, the reference parameter information serves as the reference standard for updating the intensity, position, and color of the simulated point light sources configured for the scene object in each frame, achieving optimal indirect illumination rendering for every frame.
The scene object may be any dynamically movable object in the scene. For example: virtual characters, virtual vehicles, virtual monsters, etc.
S102, determining the distance between the current frame scene object and the virtual camera according to the position information of the current frame scene object and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the intensity of the reference point light source and the reference distance.
The current frame can be any frame, and the position information of the scene object of the current frame can refer to the world coordinate information of the scene object. The scene object can be any scene object in the game scene, and indirect illumination rendering can be realized by adopting the method provided by the scheme aiming at any scene object.
Virtual cameras are understood to be player cameras that provide a player's field of view, i.e. a lens, and may be at any angle to a scene object. The position information of the virtual camera may also refer to world coordinate information of the virtual camera.
Alternatively, the distance between the current frame scene object and the virtual camera may be determined based on the position information of the current frame scene object and the position information of the virtual camera.
In some embodiments, the intensity of each simulated point light source corresponding to the current frame scene object may be determined according to the distance between the current frame scene object and the virtual camera, and the intensity of the reference point light source and the reference distance determined in the test stage.
The basis for determining the intensity of each simulated point light source corresponding to the scene object of the current frame is the inverse relationship between the intensity of the simulated point light source and the distance between the scene object and the virtual camera: the closer the virtual camera is to the scene object, the stronger the simulated point light source, and conversely the weaker. When the distance exceeds a certain value, the virtual camera is so far from the scene object that some illumination information of the scene object cannot be observed in the player's field of view; that is, when the scene object is far from the player's view, some illumination information can be ignored, and the intensity of the simulated point light source can be set to 0 at that point, saving computing resources.
S103, determining the positions of the simulated point light sources corresponding to the current frame scene object according to the position information of the current frame scene object, the position information of the virtual camera, the reference distance and the reference included angle.
Optionally, the position of each simulated point light source corresponding to the current frame scene object may be determined according to the position information of the current frame scene object, the position information of the virtual camera, and the determined reference distance and reference angle.
The position of each simulated point light source is determined according to the positional binding relationships established in the test stage between the scene object, the reference point light sources, and the test camera, so that no matter how the scene object moves, the included angle formed by the line connecting the scene object with the simulated point light source and the line connecting the scene object with the virtual camera remains unchanged, and the player can observe the simulated ambient light reflection from any angle. On this basis, the positions of the simulated point light sources corresponding to the scene object of the current frame can be calculated backwards from the pre-acquired reference distance and reference included angle.
S104, determining the color of each simulated point light source corresponding to the current frame scene object according to the position of each simulated point light source.
Optionally, based on the position of a simulated point light source, the simulated point light source may be placed at the corresponding location and a ray collision performed there to simulate the reflection of ambient light; the color of the simulated point light source is then determined from the collision point information.
And S105, performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
Based on the above manner, the intensity, color, and position of each simulated point light source corresponding to the scene object of the current frame are obtained; that is, the specific light source information of the simulated point light sources is determined. The simulated point light sources are then used as the light sources generating indirect illumination: rays are projected onto the scene object of the current frame, the indirect illumination information generated by the projection is collected, and indirect illumination rendering is performed on the scene object of the current frame.
Optionally, based on the determined simulated point light sources, each simulated point light source may be controlled to emit rays toward the scene object, and the intersections of the emitted rays with the scene object are determined, so as to obtain illumination parameter information such as the material, texture, and color at each intersection on the scene object. The collected illumination parameter information is used as the input data of the indirect illumination rendering algorithm, which generates the rendering result.
Rendering for subsequent frames then continues per steps S102-S105.
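For orientation, the following minimal sketch (in Python, with hypothetical names; it is not code from the patent) shows how steps S102-S105 compose into a per-frame update. The three update functions are placeholders whose concrete calculations are sketched in the sections below.

```python
import numpy as np

# Minimal per-frame skeleton of steps S102-S105. All names here are
# hypothetical illustrations, not identifiers from the patent; the three
# update functions stand in for the concrete calculations detailed below.
def update_simulated_lights(char_pos, cam_pos, lights,
                            intensity_fn, position_fn, color_fn):
    d_game = np.linalg.norm(cam_pos - char_pos)  # S102: object-camera distance
    for light in lights:
        light["intensity"] = intensity_fn(d_game)           # S102
        light["position"] = position_fn(char_pos, cam_pos)  # S103
        light["color"] = color_fn(light["position"])        # S104
    return lights  # S105: pass to the indirect-illumination render pass

# Dummy wiring for illustration only:
lights = [{"intensity": 0.0, "position": np.zeros(3), "color": np.ones(3)}]
update_simulated_lights(np.zeros(3), np.array([0.0, -3.0, 1.6]), lights,
                        intensity_fn=lambda d: 1.0 / max(d, 1e-6),
                        position_fn=lambda c, v: c + np.array([1.0, 1.0, 1.5]),
                        color_fn=lambda p: np.ones(3))
```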
In summary, in the illumination information processing method provided by this embodiment, the information of the simulated point light sources corresponding to the scene object of the current frame is generated in real time, so that the illumination effect of the simulated point light sources on the scene object simulates the light reflected onto the scene object by the game environment. In addition, based on the reference parameter information determined in the test stage, the intensity of each simulated point light source is updated on the theoretical basis that the light source intensity is inversely proportional to the distance between the scene object and the virtual camera. The horizontal position of each light source is tied to the player's lens so that its illumination effect always faces the lens at the initial angle, and the position of each simulated point light source is updated so that the included angle between the line from the scene object to the light source and the line from the scene object to the virtual camera stays at a fixed value; however the scene object moves, the player can observe the environment light reflected by the scene object from any angle. The color of each simulated point light source is updated based on its position, making the color information more reasonable. The per-frame intensity, position, and color information obtained in this way is more accurate, so the indirect illumination simulated from the obtained simulated point light sources produces a better indirect illumination rendering effect on the scene object; the method is simple to execute, consumes little computation, and is suitable for a variety of terminal devices.
Optionally, in step S101, the reference distance and the reference angle may be determined based on a preset positional relationship among the scene object, the test camera, and the reference point light source.
First, corresponding reference point light sources can be set for the scene object in advance in the test stage, with their positions and intensities chosen so that the scene object has an ideal indirect illumination effect under their irradiation.
Secondly, the positional relationships among the test camera, the reference point light sources, and the scene object can be set. The test camera can be understood as the camera used during character visual development, mainly recording the main views of a character, for example the front view or a 20-degree side view.
Satisfying the preset positional relationship among the scene object, the test camera, and the reference point light sources ties the positions of the reference point light sources to the position of the test camera, so that the illumination effect of the reference point light sources on the scene object always faces the lens at the initial angle and the player can observe the simulated environment reflected light from any angle.
Fig. 2 is a schematic diagram of a positional relationship provided by an embodiment of the present application, showing the scene object, the test camera, and the reference point light sources from a top view. As shown in fig. 2, the test camera faces the scene object from the front or from a side angle of no more than 20 degrees, and the reference point light sources are distributed on the two sides of the scene object, so that their illumination always falls on the front of the scene object. Based on this positional relationship, the player can always observe the indirect illumination produced on the scene object by the reference point light sources while the scene object moves; that is, the player can observe the simulated environment reflected light from any angle.
Optionally, the reference parameter information may first be determined based on the preset positional relationship among the scene object, the test camera, and the reference point light sources.
Fig. 3 is a flowchart of another illumination information processing method according to an embodiment of the present application; optionally, before the reference parameter information is acquired in step S101, the method may further include:
S301, respectively determining a first distance vector between the scene object and the test camera and a second distance vector between the scene object and the reference point light source according to the position information of the scene object, the position information of the test camera, and the position information of the at least one reference point light source.
In this embodiment, two reference point light sources are taken as an example; in practical applications, the number of reference point light sources can be determined, after multiple attempts, as the number that gives the scene object a better indirect lighting effect.
Optionally, once the positions of the scene object, the test camera, and the reference point light sources are determined, the position information of the scene object, the position information of the test camera, and the position information of the at least one reference point light source may be read directly, where the position information refers to coordinate information characterizing the position.
In this scheme, taking a game character as the example scene object, the position information of the scene object may be the coordinate information of its lumbar bone point, or of a preset position point near its waist. The position information of the test camera may be the coordinate information of the center point of its lens. A reference point light source is small enough to be treated as a point, so its own coordinate information can be acquired directly. For different scene objects, the position information may be characterized in different ways; for example, for a virtual vehicle, the position information may be represented by its center of gravity.
Assume the position information of the scene object is LLookChar, the position information of the test camera is LLookCam, the position information of reference point light source 1 is L1, and the position information of reference point light source 2 is L2. Then the first distance vector between the scene object and the test camera is VLookCam = LLookCam - LLookChar; the second distance vector between the scene object and reference point light source 1 is V1 = L1 - LLookChar; and the second distance vector between the scene object and reference point light source 2 is V2 = L2 - LLookChar.
S302, respectively carrying out horizontal plane projection on the first distance vector and the second distance vector, and determining a first projection vector and a second projection vector.
Optionally, the projection vector of a vector A onto the horizontal plane may be calculated using the formula B = A - N × (A·N / |N|²), where N is the unit normal of the horizontal plane (for example, N = (0, 0, 1) in a Z-up engine).
Referring to fig. 2, the first projection vector HVLookCam may be calculated by substituting the first distance vector VLookCam as A into the formula B = A - N × (A·N / |N|²); similarly, the second projection vectors HV1 and HV2 may be calculated respectively.
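A minimal sketch of this projection, assuming a Z-up convention (plane normal N = (0, 0, 1)); the patent does not fix the axis convention:

```python
import numpy as np

# Horizontal-plane projection B = A - N * (A.N / |N|^2), Z-up assumed.
def project_horizontal(a: np.ndarray) -> np.ndarray:
    n = np.array([0.0, 0.0, 1.0])  # assumed horizontal-plane normal
    return a - n * (np.dot(a, n) / np.dot(n, n))

print(project_horizontal(np.array([3.0, 5.0, 4.0])))  # -> [3. 5. 0.]
```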
S303, determining an included angle between a connecting line of the scene object and the test camera and a connecting line of the scene object and the reference point light source according to the first projection vector and the second projection vector, and obtaining a reference included angle.
Optionally, the included angle between the first projection vector and a second projection vector may be calculated using the formula θ = arcsin(|c × d| / (|c| × |d|)), where c × d denotes the cross product of the two vectors.
Substituting the first projection vector HVLookCam as parameter c and the second projection vector HV1 as parameter d into the formula θ = arcsin(|c × d| / (|c| × |d|)) gives the included angle between HV1 and HVLookCam, that is, the included angle A1 in fig. 2. Similarly, substituting HVLookCam as c and HV2 as d gives the included angle between HV2 and HVLookCam, that is, the included angle A2 in fig. 2.
A1 denotes the included angle between the line connecting the scene object with the test camera and the line connecting the scene object with reference point light source 1; A2 denotes the included angle between the line connecting the scene object with the test camera and the line connecting the scene object with reference point light source 2. The reference included angles thus comprise A1 and A2; as the number of reference point light sources increases, the number of reference included angles increases correspondingly.
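The following sketch runs steps S301-S303 end to end on assumed sample coordinates (the patent gives no concrete values), using the arcsin cross-product form of the angle formula above:

```python
import numpy as np

# Sketch of steps S301-S303 with assumed sample positions; Z-up is assumed,
# so heights are Z components.
def horizontal(a):
    return a * np.array([1.0, 1.0, 0.0])  # equals B = A - N*(A.N/|N|^2) for N=(0,0,1)

def included_angle(c, d):
    # theta = arcsin(|c x d| / (|c| * |d|)), per step S303
    return np.arcsin(np.linalg.norm(np.cross(c, d)) /
                     (np.linalg.norm(c) * np.linalg.norm(d)))

l_look_char = np.array([0.0, 0.0, 90.0])      # scene object (lumbar bone point)
l_look_cam = np.array([0.0, -300.0, 160.0])   # test camera lens center
l1 = np.array([-120.0, -80.0, 150.0])         # reference point light source 1
l2 = np.array([120.0, -80.0, 150.0])          # reference point light source 2

hv_look_cam = horizontal(l_look_cam - l_look_char)  # first projection vector
hv1 = horizontal(l1 - l_look_char)                  # second projection vectors
hv2 = horizontal(l2 - l_look_char)
a1 = included_angle(hv_look_cam, hv1)  # reference included angle A1
a2 = included_angle(hv_look_cam, hv2)  # reference included angle A2
```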
FIG. 4 is a flowchart of another method for processing illumination information according to an embodiment of the present application; optionally, before the reference parameter information is acquired in step S101, the method may further include:
S401, taking the modulus of the first projection vector to obtain the distance between the scene object and the test camera.
Optionally, the modulus of the first projection vector HVLookCam is calculated to obtain the distance between the scene object and the test camera, recorded as DLook.
S402, taking the modulus of each second projection vector to obtain the distance between the scene object and the corresponding reference point light source.
The modulus of the second projection vector HV1 is calculated to obtain the distance between the scene object and reference point light source 1, recorded as D1; the modulus of the second projection vector HV2 is calculated to obtain the distance between the scene object and reference point light source 2, recorded as D2.
S403, the distance between the scene object and the test camera and the distance between the scene object and the reference point light source are set as reference distances.
The reference distances then include DLook, D1, and D2; as the number of reference point light sources increases, the number of reference distances increases correspondingly.
In some embodiments, the intensity of each reference point light source may be set to an intensity that gives the scene object a better indirect lighting effect; the intensity of reference point light source 1 is recorded as I1, and the intensity of reference point light source 2 as I2.
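Continuing the assumed sample values above, a sketch of steps S401-S403; the reference distances are simply the moduli of the projection vectors:

```python
import numpy as np

# Sketch of steps S401-S403 (projection vectors carried over from the
# previous sketch; all values are assumptions, not from the patent).
hv_look_cam = np.array([0.0, -300.0, 0.0])
hv1 = np.array([-120.0, -80.0, 0.0])
hv2 = np.array([120.0, -80.0, 0.0])

d_look = np.linalg.norm(hv_look_cam)  # DLook: scene object to test camera
d1 = np.linalg.norm(hv1)              # D1: scene object to reference light 1
d2 = np.linalg.norm(hv2)              # D2: scene object to reference light 2
# I1 and I2, the reference intensities, are tuned by hand in the test stage.
```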
Next, a determination manner of the intensity, the position and the color of the simulated point light source corresponding to the scene object in each frame during the game running will be described.
Optionally, in step S102, determining the intensity of each simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the intensity of the reference point light source, and the reference distance may include: if the distance between the current frame scene object and the virtual camera is larger than the distance between the scene object and the test camera and smaller than the preset distance threshold, determining the intensity of the simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the distance between the scene object and the test camera, the intensity of the reference point light source corresponding to the simulated point light source and the preset distance threshold.
Optionally, the distance between the scene object of the current frame and the virtual camera can be determined according to the position information of the scene object of the current frame and the position information of the virtual camera, and is recorded as DGame; referring to step S301, the position information of the scene object of the current frame and the position information of the virtual camera may be obtained in the same way.
In some embodiments, when DGame is determined to be greater than the reference distance DLook between the scene object and the test camera and less than the preset distance threshold Y1, the distance between the scene object and the virtual camera is considered to exceed the reference distance determined in the test stage without exceeding the threshold Y1. In this case, the intensity of the simulated point light source is adjusted from the intensity of the reference point light source based on the principle that intensity is inversely proportional to the distance between the character and the lens: the more DGame exceeds DLook, the smaller the intensity of the simulated point light source; the closer DGame is to DLook, the larger the intensity.
It should be noted that when DGame exceeds the distance threshold Y1, the scene object is considered far from the lens and there is no need to simulate the reflection of global illumination; the intensity of the simulated point light source may then be set to 0, that is, the simulated point light source is turned off. The distance threshold Y1 may be a threshold determined in practice.
Optionally, in the case where DGame is greater than DLook and less than Y1, the intensity of a simulated point light source corresponding to the scene object of the current frame may be calculated using the formula I' = (1/2) × (cos(clamp(0, 1, (DGame - DLook) / (Y1 - DLook)) × π) + 1) × I, where I is the intensity of the corresponding reference point light source.
Fig. 5 is a schematic diagram of another positional relationship according to an embodiment of the present application. It should be noted that, in the game running stage, the simulated point light sources set for the scene object correspond to the reference point light sources set for it in the test stage. As shown in fig. 5, on the basis of the two reference point light sources above, fig. 5 likewise includes two simulated point light sources, and the positional relationships among simulated point light sources 1 and 2, the virtual camera, and the scene object follow the positional distribution among the test camera, the scene object, and the reference point light sources in fig. 2. Simulated point light source 1 corresponds to reference point light source 1, and simulated point light source 2 corresponds to reference point light source 2, but the intensity, color, and specific position of each simulated point light source must be calculated in real time every frame.
Then the intensity of simulated point light source 1 corresponding to the scene object of the current frame is I1' = (1/2) × (cos(clamp(0, 1, (DGame - DLook) / (Y1 - DLook)) × π) + 1) × I1, where I1 is the intensity of reference point light source 1.
The intensity of simulated point light source 2 corresponding to the scene object of the current frame is I2' = (1/2) × (cos(clamp(0, 1, (DGame - DLook) / (Y1 - DLook)) × π) + 1) × I2, where I2 is the intensity of reference point light source 2.
The expression clamp(0, 1, (DGame - DLook) / (Y1 - DLook)) means that the value of (DGame - DLook) / (Y1 - DLook) is limited to between 0 and 1: if it is greater than 1 it is changed to 1, if it is less than 0 it is changed to 0, and if it lies between 0 and 1 it is kept unchanged.
Through the above calculation, the intensity of each simulated point light source corresponding to the scene object of the current frame conforms to the near-strong, far-weak principle, so that the indirect illumination effect of the simulated point light sources on the scene object better matches the actual situation.
Optionally, in step S102, determining the intensity of each simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the intensity of the reference point light source, and the reference distance may include: if the distance between the scene object of the current frame and the virtual camera is smaller than or equal to the distance between the scene object and the test camera, determining the intensity of the simulated point light source corresponding to the scene object of the current frame according to the intensity of the reference point light source corresponding to the simulated point light source.
In other embodiments, if the distance DGame between the scene object of the current frame and the virtual camera is less than or equal to the distance DLook between the scene object and the test camera, the intensity of the reference point light source may be used directly as the intensity of the simulated point light source. The intensity of the reference point light source is already the standard intensity that gives the scene object a better indirect illumination effect and acts like a lower bound: if the simulated point light source were made weaker than the reference point light source, the expected illumination effect might not be achieved. Therefore, when DGame is less than or equal to DLook, the intensity of the simulated point light source defaults to the intensity of the reference point light source.
In this case, the intensity of simulated point light source 1 is the intensity I1 of reference point light source 1, and the intensity of simulated point light source 2 is the intensity I2 of reference point light source 2.
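The three intensity cases (at or inside DLook, between DLook and Y1, beyond Y1) can be combined into a single sketch; the values in the example call are assumptions, not values from the patent:

```python
import math

# Sketch of the per-frame intensity update for one simulated point light
# source; Y1 is the preset far-distance threshold.
def simulated_intensity(d_game: float, d_look: float, y1: float, i_ref: float) -> float:
    if d_game <= d_look:   # at or inside the reference distance:
        return i_ref       # keep the reference intensity (lower bound)
    if d_game >= y1:       # beyond the far threshold:
        return 0.0         # the simulated light is switched off
    t = (d_game - d_look) / (y1 - d_look)
    t = max(0.0, min(1.0, t))  # clamp(0, 1, ...)
    return 0.5 * (math.cos(t * math.pi) + 1.0) * i_ref

# Example: halfway between DLook and Y1 the light is at half intensity.
print(simulated_intensity(d_game=600.0, d_look=300.0, y1=900.0, i_ref=8.0))  # 4.0
```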
Fig. 6 is a schematic flow chart of a method for processing illumination information according to an embodiment of the present application; optionally, in step S103, determining the position of each simulated point light source corresponding to the current frame scene object according to the position information of the current frame scene object, the position information of the virtual camera, the reference distance and the reference included angle may include:
S601, determining a third distance vector between the current frame scene object and the virtual camera according to the position information of the current frame scene object and the position information of the virtual camera, performing horizontal plane projection on the third distance vector, and determining a third projection vector.
Optionally, assuming the position information of the scene object of the current frame is LGameChar and the position information of the virtual camera is LGameCam, the third distance vector between the scene object of the current frame and the virtual camera may be determined as VGameCam = LGameCam - LGameChar. Like the first and second distance vectors, the third distance vector carries direction information.
The third projection vector HVGameCam can be calculated by taking the third distance vector VGameCam as A in the formula B = A - N × (A·N / |N|²).
S602, taking the modulus of the third projection vector to obtain the distance between the scene object of the current frame and the virtual camera.
The modulus of the third projection vector HVGameCam is calculated to obtain the distance between the scene object of the current frame and the virtual camera, recorded as D3.
S603, determining the position of the simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the third projection vector, the included angle between the connecting line of the scene object and the test camera and the connecting line of the reference point light source corresponding to the scene object and the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the current frame scene object.
Optionally, based on the principle that the included angle in fig. 5 between the line connecting the scene object with a simulated point light source and the line connecting the scene object with the virtual camera equals the included angle in fig. 2 between the line connecting the scene object with the corresponding reference point light source and the line connecting the scene object with the test camera, the included angle between the line from the scene object to simulated point light source 1 and the line from the scene object to the virtual camera takes the value A1, and the included angle between the line from the scene object to simulated point light source 2 and the line from the scene object to the virtual camera takes the value A2.
Then, from the distance D3 between the scene object of the current frame and the virtual camera and the included angle values A1 and A2, the position of simulated point light source 1 and the position of simulated point light source 2 can be calculated backwards.
Fig. 7 is a flowchart of another illumination information processing method according to an embodiment of the present application; optionally, in step S603, determining the position of the simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the third projection vector, the included angle between the connecting line of the scene object and the test camera and the connecting line of the reference point light source corresponding to the scene object and the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the current frame scene object may include:
S701, determining a fourth projection vector of a distance vector between the simulated point light source and the scene object of the current frame on a horizontal plane according to the distance between the scene object of the current frame and the virtual camera, the third projection vector, and the included angle between the connecting line of the scene object and the test camera and the connecting line of the scene object and the reference point light source corresponding to the simulated point light source.
Optionally, according to the distance D3 between the scene object of the current frame and the virtual camera, the third projection vector HVGameCam, and the included angle (A1 or A2) between the line connecting the scene object with the test camera and the line connecting the scene object with the reference point light source corresponding to the simulated point light source, the fourth projection vector of the distance vector between the simulated point light source and the scene object of the current frame on the horizontal plane is determined using the formula HVGameCam + tan(A) × D3.
Then the fourth projection vector, on the horizontal plane, of the distance vector between simulated point light source 1 and the scene object of the current frame is HV1' = HVGameCam + tan(A1) × D3; that for simulated point light source 2 is HV2' = HVGameCam + tan(A2) × D3.
S702, carrying out normalization processing on the fourth projection vector to obtain a movement direction vector of the simulated point light source in the scene object direction.
Optionally, the fourth projection vector HV1' is normalized, that is, converted to a unit vector, to obtain the movement direction vector HVNormal1 of simulated point light source 1 toward the scene object; likewise, the fourth projection vector HV2' is normalized to obtain the movement direction vector HVNormal2 of simulated point light source 2 toward the scene object.
S703, determining the position of the simulated point light source corresponding to the current frame scene object according to the moving direction vector, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the current frame scene object.
Then, the position of the simulated point light source 1 corresponding to the scene object of the current frame can be calculated by the formula HVNormal1 × D1 + LGameChar, where HVNormal1 is the moving direction vector of simulated point light source 1 in the scene object direction, D1 is the distance between the scene object and reference point light source 1, and LGameChar is the position information of the scene object.
The position of the simulated point light source 2 corresponding to the scene object of the current frame can be calculated by the formula HVNormal2 × D2 + LGameChar, where HVNormal2 is the moving direction vector of simulated point light source 2 in the scene object direction, D2 is the distance between the scene object and reference point light source 2, and LGameChar is the position information of the scene object.
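As an illustration of steps S701 to S703, the following Python sketch derives one simulated point light source position from the quantities above. It is a minimal sketch, not the implementation of this application: the function and parameter names are assumptions, the horizontal plane is taken as the x-z plane with y up, and reading the scalar term tan(A) × D3 as an offset along the horizontal perpendicular of HVGameCam (so that HV' makes angle A with the camera line) is an interpretation introduced here.

```python
import math

def simulated_light_position(hv_game_cam, d3, angle_a, d_ref, l_game_char):
    """Sketch of S701-S703 for one simulated point light source.

    hv_game_cam -- third projection vector (x, z components of object->camera)
    d3          -- its modulus: distance between the current frame scene object
                   and the virtual camera on the horizontal plane
    angle_a     -- reference included angle A (radians) for this light source
    d_ref       -- distance D between the scene object and its reference point light
    l_game_char -- current frame scene object position (x, y, z), y being up
    """
    # Horizontal unit vector perpendicular to the object->camera direction
    # (assumed direction of the tan(A) * D3 offset in this sketch).
    px, pz = -hv_game_cam[1] / d3, hv_game_cam[0] / d3

    # S701: fourth projection vector HV' = HVGameCam + tan(A) * D3.
    hv4x = hv_game_cam[0] + math.tan(angle_a) * d3 * px
    hv4z = hv_game_cam[1] + math.tan(angle_a) * d3 * pz

    # S702: normalize HV' to obtain the moving direction vector HVNormal.
    norm = math.hypot(hv4x, hv4z)
    nx, nz = hv4x / norm, hv4z / norm

    # S703: position = HVNormal * D + LGameChar (height kept at the object's).
    return (l_game_char[0] + nx * d_ref,
            l_game_char[1],
            l_game_char[2] + nz * d_ref)
```

Calling this once with (A1, D1) and once with (A2, D2) yields the positions of simulated point light sources 1 and 2, respectively.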
Based on the above method, the intensity and the position of each simulated point light source corresponding to the current frame scene object are respectively obtained.
Optionally, in step S104, determining the color of each simulated point light source corresponding to the current frame scene object according to the position of each simulated point light source may include: and carrying out light collision according to the positions of the simulated point light sources, and determining the colors of the simulated point light sources corresponding to the current frame scene object according to collision intersection points after the light collision.
In some embodiments, based on the determined position of the simulated point light source, the simulated point light source may be placed at the position, and a light collision detection may be performed to obtain a collision intersection point of the light emitted by the simulated point light source and the collided object, so that the color of the simulated point light source may be determined according to the information of the collision intersection point.
Fig. 8 is a flowchart of another illumination information processing method according to an embodiment of the present application. Optionally, in the above step, performing light collision according to the positions of the simulated point light sources and determining the color of each simulated point light source corresponding to the current frame scene object according to the collision intersection point after the light collision may include:
S801, controlling the simulated point light source to project light toward the reference direction, and acquiring the collision intersection point after the light from the simulated point light source collides with a virtual object in the reference direction.
Optionally, the simulated point light source may be placed at the determined position and then controlled to project light in the reference direction from its current position. The reference direction may include, but is not limited to, downward (toward the ground) or any other direction, for example toward a wall surface in a predetermined direction. The projected light collides with a virtual object in the reference direction as it advances, yielding a collision intersection point: light projected downward may collide with the ground, and the intersection with the ground serves as the collision intersection; light projected straight ahead may collide with an object such as a wall surface in the scene, and the intersection with the wall surface serves as the collision intersection.
In some embodiments, a projection distance threshold Y2 may also be set. When the simulated point light source projects light in the reference direction, if the projected light does not collide with any object within the projection distance threshold Y2, the simulated point light source may be considered far away from the objects in the scene; in this case the intensity of the simulated point light source may be set to 0, that is, the simulated point light source is turned off, and the indirect illumination on the scene object is considered negligible.
For example, if the simulated point light source projects light toward the ground and no collision with the ground occurs within the projection distance threshold Y2, the simulated point light source is considered far from the ground; the scene object may then be regarded as being in the air, where the reflected light from the ground is weak enough to be neglected.
S802, acquiring the color of the collision intersection point.
Alternatively, the color of the collision intersection may be read from a map in which color information of the object is stored, for example: the color of the collision intersection point of the light ray and the ground is read from the map storing the color information of the ground.
S803, determining the color of the collision intersection point as the color of the simulated point light source corresponding to the scene object of the current frame.
Optionally, the color of the collision intersection point may be taken as the color of the simulated point light source corresponding to the current frame scene object. Since different simulated point light sources generally correspond to different collision intersection points, the colors determined for them may also differ. With this method, the color of each simulated point light source can be obtained quickly, and the color meets the scene's requirement for the light reflected onto the scene object.
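The following sketch shows one engine-agnostic way steps S801 to S803 could be wired up. It is a sketch under stated assumptions: the `raycast` callback, its signature, and the default threshold value stand in for an engine's own physics query and color-map lookup, and all names are introduced here for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

Color = Tuple[float, float, float]

@dataclass
class SimulatedPointLight:
    position: Tuple[float, float, float]
    intensity: float
    color: Color = (1.0, 1.0, 1.0)

def update_light_color(light: SimulatedPointLight,
                       raycast: Callable[..., Optional[Color]],
                       direction: Tuple[float, float, float] = (0.0, -1.0, 0.0),
                       y2: float = 500.0) -> None:
    """Sketch of S801-S803 for one simulated point light source.

    raycast(origin, direction, max_distance) is assumed to return the color
    stored at the first collision intersection (read from the hit object's
    color map), or None if nothing is hit within max_distance.
    """
    # S801: project a ray from the light's position in the reference direction
    # (downward by default) and look for a collision within the threshold Y2.
    hit_color = raycast(light.position, direction, y2)
    if hit_color is None:
        # No collision within Y2: the light is far from any reflecting surface,
        # so it is switched off and indirect illumination is neglected.
        light.intensity = 0.0
        return
    # S802/S803: the collision intersection's color becomes the light's color.
    light.color = hit_color
```

Injecting the ray query as a callback keeps the sketch independent of any particular engine; in practice the query and the color-map read would be the engine's own facilities.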
In summary, according to the illumination information processing method provided by this embodiment, the information of the simulated point light source corresponding to the current frame scene object is generated in real time, so that the illumination effect of the simulated point light source on the scene object simulates the reflected light of the game environment on the scene object. In addition, based on the reference parameter information determined in the test stage, the intensity of the simulated point light source is updated on the theoretical basis that light source intensity is inversely proportional to the distance between the scene object and the virtual camera. The horizontal position of the light source is tied to the player's lens, so that the illumination effect of the light source always faces the lens at the initial angle, and the position of the simulated point light source is updated so that the included angle between the line connecting the light source and the virtual character and the line connecting the light source and the virtual camera remains a fixed angle; thus, no matter how the scene object moves, the player can observe the simulated environment reflected light on the scene object from any angle. The color of the simulated point light source is updated based on its position, making the color information of the simulated point light source more reasonable. The per-frame intensity, position, and color information obtained in this way is more accurate, so indirect illumination simulated with the resulting simulated point light sources renders better on scene objects; moreover, the method is simple to execute, has low computational cost, and is suitable for various terminal devices.
The following describes the apparatus, device, storage medium, and the like for executing the illumination information processing method provided by the present application; for their specific implementation processes and technical effects, refer to the above description, which is not repeated below.
Fig. 9 is a schematic diagram of an illumination information processing apparatus according to an embodiment of the present application, where the functions implemented by the illumination information processing apparatus correspond to the steps executed by the above-described method. The apparatus may be understood as a terminal device, a server, or a processor of a server, or as a component that is independent of the server or processor and implements the functions of the present application under the control of the server. As shown in Fig. 9, the apparatus may include: an acquisition module 910, a determination module 920, and a rendering module 930;
the obtaining module 910 is configured to obtain reference parameter information, where the reference parameter information at least includes: intensity of the reference point light source, reference distance and reference included angle;
the determining module 920 is configured to determine a distance between the current frame scene object and the virtual camera according to the position information of the current frame scene object and the position information of the virtual camera, and determine an intensity of each simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the intensity of the reference point light source, and the reference distance;
The determining module 920 is configured to determine the positions of the simulated point light sources corresponding to the current frame scene object according to the position information of the current frame scene object, the position information of the virtual camera, and the reference distance and the reference angle;
the determining module 920 is configured to determine, according to the positions of the simulated point light sources, the colors of the simulated point light sources corresponding to the scene objects of the current frame;
and the rendering module 930 is used for performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
Optionally, the reference distance and the reference angle are determined based on a preset positional relationship among the scene object, the test camera, and the reference point light source.
Optionally, the determining module 920 is further configured to determine a first distance vector between the scene object and the test camera and a second distance vector between the scene object and the reference point light source according to the position information of the scene object, the position information of the test camera, and the position information of the at least one reference point light source, respectively;
respectively carrying out horizontal plane projection on the first distance vector and the second distance vector, and determining a first projection vector and a second projection vector;
and determining an included angle between the connecting line of the scene object and the test camera and the connecting line of the scene object and the reference point light source according to the first projection vector and the second projection vector to obtain a reference included angle.
Optionally, the determining module 920 is further configured to perform a modulo operation on the first projection vector to obtain a distance between the scene object and the test camera;
performing modular operation on the second projection vector to obtain the distance between the scene object and the reference point light source;
the distance between the scene object and the test camera and the distance between the scene object and the reference point light source are taken as reference distances.
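As a concrete reading of these two preparation steps, the sketch below derives the reference included angle and the two reference distances from the test-stage positions. It is a minimal sketch under assumptions introduced here: positions are (x, y, z) with y as the up axis, so horizontal-plane projection simply drops the y component, and all names are illustrative.

```python
import math

def reference_parameters(obj_pos, test_cam_pos, ref_light_pos):
    """Sketch: reference angle and reference distances from test-stage positions."""
    # First and second distance vectors, projected onto the horizontal plane
    # (the first and second projection vectors).
    v_cam = (test_cam_pos[0] - obj_pos[0], test_cam_pos[2] - obj_pos[2])
    v_light = (ref_light_pos[0] - obj_pos[0], ref_light_pos[2] - obj_pos[2])

    # Reference distances: the moduli of the two projection vectors.
    d_cam = math.hypot(*v_cam)
    d_light = math.hypot(*v_light)

    # Reference included angle between the line to the test camera and the
    # line to the reference point light source, via the normalized dot product.
    cos_a = (v_cam[0] * v_light[0] + v_cam[1] * v_light[1]) / (d_cam * d_light)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    return angle, d_cam, d_light
```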
Optionally, the determining module 920 is specifically configured to determine, if the distance between the current frame scene object and the virtual camera is greater than the distance between the scene object and the test camera and less than a preset distance threshold, the intensity of the simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the distance between the scene object and the test camera, the intensity of the reference point light source corresponding to the simulated point light source, and the preset distance threshold.
Optionally, the determining module 920 is specifically configured to determine the intensity of the simulated point light source corresponding to the current frame scene object according to the intensity of the reference point light source corresponding to the simulated point light source if the distance between the current frame scene object and the virtual camera is less than or equal to the distance between the scene object and the test camera.
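A minimal sketch of the two intensity branches just described, assuming the names below. The exact attenuation curve between the test distance and the preset threshold is not fixed by the text; the linear falloff used here, reaching zero at the threshold (called `y1` in this sketch), is only one plausible choice consistent with intensity decreasing as the camera moves away.

```python
def simulated_light_intensity(d3, d_test, i_ref, y1):
    """Sketch of the intensity update for one simulated point light source.

    d3     -- distance between the current frame scene object and the virtual camera
    d_test -- reference distance between the scene object and the test camera
    i_ref  -- intensity of the reference point light source for this simulated light
    y1     -- preset distance threshold (name assumed for this sketch)
    """
    if d3 <= d_test:
        # At or inside the test distance: keep the reference intensity.
        return i_ref
    if d3 < y1:
        # Between the test distance and the threshold: fade the intensity out.
        # Linear falloff is an assumption; the text fixes only the inputs.
        return i_ref * (y1 - d3) / (y1 - d_test)
    # At or beyond the threshold: the light contributes nothing.
    return 0.0
```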
Optionally, the determining module 920 is specifically configured to determine a third distance vector between the current frame scene object and the virtual camera according to the position information of the current frame scene object and the position information of the virtual camera, and perform horizontal plane projection on the third distance vector to determine a third projection vector;
performing modular operation on the third projection vector to obtain the distance between the scene object of the current frame and the virtual camera;
and determining the position of the simulated point light source corresponding to the current frame scene object according to the distance between the current frame scene object and the virtual camera, the third projection vector, the included angle between the line connecting the scene object and the test camera and the line connecting the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the current frame scene object.
Optionally, the determining module 920 is specifically configured to determine a fourth projection vector of the distance vector between the simulated point light source and the current frame scene object on a horizontal plane according to the distance between the current frame scene object and the virtual camera, the third projection vector, and an included angle between a connection line between the scene object and the test camera and a connection line between the scene object and the reference point light source corresponding to the simulated point light source;
Normalizing the fourth projection vector to obtain a moving direction vector of the simulated point light source in the scene object direction;
and determining the position of the simulated point light source corresponding to the current frame scene object according to the moving direction vector, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the current frame scene object.
Optionally, the determining module 920 is specifically configured to perform light collision according to the positions of the simulated point light sources, and determine the color of each simulated point light source corresponding to the current frame scene object according to the collision intersection point after the light collision.
Optionally, the determining module 920 is specifically configured to control the simulated point light source to project light in the reference direction, and obtain a collision intersection point after the simulated point light source collides with the virtual object in the reference direction;
acquiring the color of the collision intersection point;
and determining the color of the collision intersection point as the color of the simulated point light source corresponding to the scene object of the current frame.
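Putting the module responsibilities together, the following sketch chains the helper functions from the earlier sketches into one per-frame update. It is illustrative only; the `RefParams` container and all names are assumptions of this sketch, and the final shading pass belongs to the engine's renderer.

```python
import math
from collections import namedtuple

# Per-light reference parameters gathered in the test stage (names assumed).
RefParams = namedtuple("RefParams", "i_ref angle d_test d_ref y1 y2")

def process_frame(obj_pos, cam_pos, lights, params, raycast):
    """Sketch of one per-frame update chaining the earlier helper sketches."""
    # Third projection vector of the object->camera distance vector on the
    # horizontal plane, and its modulus D3.
    hv = (cam_pos[0] - obj_pos[0], cam_pos[2] - obj_pos[2])
    d3 = math.hypot(*hv)
    for light, p in zip(lights, params):
        light.intensity = simulated_light_intensity(d3, p.d_test, p.i_ref, p.y1)
        light.position = simulated_light_position(hv, d3, p.angle, p.d_ref, obj_pos)
        update_light_color(light, raycast, y2=p.y2)
    # The rendering module then shades the current frame scene object with
    # these point lights to approximate indirect illumination (engine-specific).
```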
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (Application Specific Integrated Circuit, abbreviated as ASIC), one or more digital signal processors (Digital Signal Processor, abbreviated as DSP), or one or more field programmable gate arrays (Field Programmable Gate Array, abbreviated as FPGA), and the like. For another example, when one of the above modules is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU), or another processor that can invoke the program code. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The modules may be connected or communicate with each other via wired or wireless connections. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a connection through a LAN, WAN, Bluetooth, ZigBee, or NFC, or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the above-described system and apparatus may refer to the corresponding procedures in the method embodiments, and are not repeated in the present disclosure.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application, including: a processor 801, a storage medium 802, and a bus 803. The storage medium 802 stores machine-readable instructions executable by the processor 801. When the electronic device runs the illumination information processing method of the embodiments, the processor 801 and the storage medium 802 communicate via the bus 803, and the processor 801 executes the machine-readable instructions to perform the following steps:
obtaining reference parameter information, wherein the reference parameter information at least comprises: intensity of the reference point light source, reference distance and reference included angle;
Determining the distance between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the intensity of the reference point light source and the reference distance;
determining the position of each simulated point light source corresponding to the scene object of the current frame according to the position information of the scene object of the current frame, the position information of the virtual camera, the reference distance and the reference included angle;
determining the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source;
and performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
In one possible embodiment, the reference distance and the reference angle are determined based on a preset positional relationship among the scene object, the test camera, and the reference point light source.
In one possible embodiment, the processor 801, prior to executing the acquiring of the reference parameter information, is further configured to: according to the position information of the scene object, the position information of the test camera and the position information of at least one reference point light source, respectively determining a first distance vector between the scene object and the test camera and a second distance vector between the scene object and the reference point light source;
Respectively carrying out horizontal plane projection on the first distance vector and the second distance vector to determine a first projection vector and a second projection vector;
and determining an included angle between the connecting line of the scene object and the test camera and the connecting line of the scene object and the reference point light source according to the first projection vector and the second projection vector, and obtaining the reference included angle.
In one possible embodiment, the processor 801, prior to executing the acquiring of the reference parameter information, is further configured to: performing modular operation on the first projection vector to obtain the distance between the scene object and the test camera;
performing modular operation on the second projection vector to obtain the distance between the scene object and the reference point light source;
and taking the distance between the scene object and the test camera and the distance between the scene object and the reference point light source as the reference distance.
In a possible embodiment, the processor 801 is specifically configured to, when executing determining the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance: if the distance between the scene object and the virtual camera in the current frame is greater than the distance between the scene object and the test camera and is smaller than a preset distance threshold, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the distance between the scene object and the test camera, the intensity of the reference point light source corresponding to the simulated point light source and the preset distance threshold.
In a possible embodiment, the processor 801 is specifically configured to, when executing determining the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance: and if the distance between the scene object and the virtual camera in the current frame is smaller than or equal to the distance between the scene object and the test camera, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the intensity of the reference point light source corresponding to the simulated point light source.
In a possible embodiment, the processor 801 is specifically configured to, when executing determining the positions of the simulated point light sources corresponding to the scene object in the current frame according to the position information of the scene object in the current frame, the position information of the virtual camera, the reference distance, and the reference included angle,: determining a third distance vector between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, performing horizontal plane projection on the third distance vector, and determining a third projection vector;
Performing modular operation on the third projection vector to obtain the distance between the scene object and the virtual camera in the current frame;
and determining the position of the simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera in the current frame, the third projection vector, the included angle between the line connecting the scene object and the test camera and the line connecting the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object in the current frame.
In a possible embodiment, the processor 801 is specifically configured to, when executing determining the position of the simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the third projection vector, the included angle between the line between the scene object and the test camera and the line between the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object in the current frame: determining a fourth projection vector of a distance vector between the simulated point light source and the scene object of the current frame on a horizontal plane according to the distance between the scene object of the current frame and the virtual camera, the third projection vector, and an included angle between a connecting line of the scene object and the test camera and a connecting line of the scene object and a datum point light source corresponding to the simulated point light source;
Normalizing the fourth projection vector to obtain a moving direction vector of the simulated point light source in the scene object direction;
and determining the position of the simulated point light source corresponding to the scene object in the current frame according to the moving direction vector, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the scene object in the current frame.
In a possible embodiment, the processor 801 is specifically configured to, when executing determining the color of each simulated point light source corresponding to the scene object in the current frame according to the position of each simulated point light source: and carrying out light collision according to the positions of the simulated point light sources, and determining the colors of the simulated point light sources corresponding to the scene objects in the current frame according to collision intersection points after the light collision.
In a possible embodiment, the processor 801 is specifically configured to, when performing light collision according to the positions of the simulated point light sources, determine the color of each simulated point light source corresponding to the scene object in the current frame according to the collision intersection point after the light collision: controlling the simulated point light source to project light rays to a reference direction, and acquiring collision intersection points after the simulated point light source collides with the virtual object in the reference direction;
Acquiring the color of the collision intersection point;
and determining the color of the collision intersection point as the color of the simulated point light source corresponding to the scene object of the current frame.
By the above method, the electronic device generates the information of the simulated point light source corresponding to the current frame scene object in real time, so that the illumination effect of the simulated point light source on the scene object simulates the reflected light of the game environment on the scene object. In addition, based on the reference parameter information determined in the test stage, the intensity of the simulated point light source is updated on the theoretical basis that light source intensity is inversely proportional to the distance between the scene object and the virtual camera. The horizontal position of the light source is tied to the player's lens, so that the illumination effect of the light source always faces the lens at the initial angle, and the position of the simulated point light source is updated so that the included angle between the line connecting the light source and the virtual character and the line connecting the light source and the virtual camera remains a fixed angle; thus, no matter how the scene object moves, the player can observe the simulated environment reflected light on the scene object from any angle. The color of the simulated point light source is updated based on its position, making the color information of the simulated point light source more reasonable. The per-frame intensity, position, and color information obtained in this way is more accurate, so indirect illumination simulated with the resulting simulated point light sources renders better on scene objects; moreover, the method is simple to execute, has low computational cost, and is suitable for various terminal devices.
The storage medium 802 stores program code that, when executed by the processor 801, causes the processor 801 to execute the steps of the illumination information processing method according to the various exemplary embodiments of the present application described in the "exemplary method" section of this specification.
The processor 801 may be a general purpose processor such as a Central Processing Unit (CPU), digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
The storage medium 802 is a non-volatile computer-readable storage medium that can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory may include at least one type of storage medium, for example: flash memory, hard disk, multimedia card, card memory, random access memory (Random Access Memory, RAM), static random access memory (Static Random Access Memory, SRAM), programmable read-only memory (Programmable Read Only Memory, PROM), read-only memory (Read-Only Memory, ROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), magnetic memory, magnetic disk, optical disk, and the like. The memory may also be, without limitation, any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The storage medium 802 of the present application may also be a circuit or any other device capable of implementing a storage function, for storing program instructions and/or data.
Optionally, an embodiment of the present application further provides a computer readable storage medium having stored thereon a computer program, which when executed by a processor, performs the steps of:
obtaining reference parameter information, wherein the reference parameter information at least comprises: intensity of the reference point light source, reference distance and reference included angle;
determining the distance between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the intensity of the reference point light source and the reference distance;
determining the position of each simulated point light source corresponding to the scene object of the current frame according to the position information of the scene object of the current frame, the position information of the virtual camera, the reference distance and the reference included angle;
determining the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source;
and performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
In one possible embodiment, the reference distance and the reference angle are determined based on a preset positional relationship among the scene object, the test camera, and the reference point light source.
In one possible embodiment, the processor, prior to executing the acquiring the reference parameter information, is further configured to: according to the position information of the scene object, the position information of the test camera and the position information of at least one reference point light source, respectively determining a first distance vector between the scene object and the test camera and a second distance vector between the scene object and the reference point light source;
respectively carrying out horizontal plane projection on the first distance vector and the second distance vector to determine a first projection vector and a second projection vector;
and determining an included angle between the connecting line of the scene object and the test camera and the connecting line of the scene object and the reference point light source according to the first projection vector and the second projection vector, and obtaining the reference included angle.
In one possible embodiment, the processor, prior to executing the acquiring the reference parameter information, is further configured to: performing modular operation on the first projection vector to obtain the distance between the scene object and the test camera;
Performing modular operation on the second projection vector to obtain the distance between the scene object and the reference point light source;
and taking the distance between the scene object and the test camera and the distance between the scene object and the reference point light source as the reference distance.
In one possible embodiment, the processor is specifically configured to, when executing determining the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance: if the distance between the scene object and the virtual camera in the current frame is greater than the distance between the scene object and the test camera and is smaller than a preset distance threshold, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the distance between the scene object and the test camera, the intensity of the reference point light source corresponding to the simulated point light source and the preset distance threshold.
In one possible embodiment, the processor is specifically configured to, when executing determining the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance: and if the distance between the scene object and the virtual camera in the current frame is smaller than or equal to the distance between the scene object and the test camera, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the intensity of the reference point light source corresponding to the simulated point light source.
In a possible implementation manner, the processor is specifically configured to, when executing determining the positions of the simulated point light sources corresponding to the scene object in the current frame according to the position information of the scene object in the current frame, the position information of the virtual camera, the reference distance and the reference included angle: determining a third distance vector between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, performing horizontal plane projection on the third distance vector, and determining a third projection vector;
performing modular operation on the third projection vector to obtain the distance between the scene object and the virtual camera in the current frame;
and determining the position of the simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera in the current frame, the third projection vector, the included angle between the line connecting the scene object and the test camera and the line connecting the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object in the current frame.
In a possible implementation manner, the processor is specifically configured to, when executing the determining the position of the simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the third projection vector, the included angle between the line between the scene object and the test camera and the line between the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object in the current frame: determining a fourth projection vector of a distance vector between the simulated point light source and the scene object of the current frame on a horizontal plane according to the distance between the scene object of the current frame and the virtual camera, the third projection vector, and an included angle between a connecting line of the scene object and the test camera and a connecting line of the scene object and a datum point light source corresponding to the simulated point light source;
normalizing the fourth projection vector to obtain a moving direction vector of the simulated point light source in the scene object direction;
and determining the position of the simulated point light source corresponding to the scene object in the current frame according to the moving direction vector, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the scene object in the current frame.
In a possible embodiment, the processor is specifically configured to, when executing determining the color of each simulated point light source corresponding to the scene object in the current frame according to the position of each simulated point light source: and carrying out light collision according to the positions of the simulated point light sources, and determining the colors of the simulated point light sources corresponding to the scene objects in the current frame according to collision intersection points after the light collision.
In a possible implementation manner, the processor is specifically configured to, when executing light collision according to the positions of the simulated point light sources and determining the color of each simulated point light source corresponding to the scene object in the current frame according to the collision intersection point after the light collision: controlling the simulated point light source to project light rays to a reference direction, and acquiring collision intersection points after the simulated point light source collides with the virtual object in the reference direction;
acquiring the color of the collision intersection point;
and determining the color of the collision intersection point as the color of the simulated point light source corresponding to the scene object of the current frame.
By the above method, the electronic device generates the information of the simulated point light source corresponding to the current frame scene object in real time, so that the illumination effect of the simulated point light source on the scene object simulates the reflected light of the game environment on the scene object. In addition, based on the reference parameter information determined in the test stage, the intensity of the simulated point light source is updated on the theoretical basis that light source intensity is inversely proportional to the distance between the scene object and the virtual camera. The horizontal position of the light source is tied to the player's lens, so that the illumination effect of the light source always faces the lens at the initial angle, and the position of the simulated point light source is updated so that the included angle between the line connecting the light source and the virtual character and the line connecting the light source and the virtual camera remains a fixed angle; thus, no matter how the scene object moves, the player can observe the simulated environment reflected light on the scene object from any angle. The color of the simulated point light source is updated based on its position, making the color information of the simulated point light source more reasonable. The per-frame intensity, position, and color information obtained in this way is more accurate, so indirect illumination simulated with the resulting simulated point light sources renders better on scene objects; moreover, the method is simple to execute, has low computational cost, and is suitable for various terminal devices.
In an embodiment of the present application, the computer program, when executed by a processor, may further execute other machine-readable instructions to perform the methods described in the other embodiments; for the specific implementation of the method steps and principles, refer to the description of those embodiments, which is not repeated in detail here.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or the like.

Claims (13)

1. A lighting information processing method, characterized by comprising:
obtaining reference parameter information, wherein the reference parameter information at least comprises: intensity of the reference point light source, reference distance and reference included angle;
Determining the distance between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, and determining the intensity of each simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the intensity of the reference point light source and the reference distance;
determining the position of each simulated point light source corresponding to the scene object of the current frame according to the position information of the scene object of the current frame, the position information of the virtual camera, the reference distance and the reference included angle;
determining the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source;
and performing indirect illumination rendering on the scene object of the current frame according to the intensity, the color and the position of the simulated point light source.
2. The method of claim 1, wherein the reference distance and the reference angle are determined based on a preset positional relationship among a scene object, a test camera, and a reference point light source.
3. The method of claim 1, wherein prior to the obtaining the reference parameter information, further comprising:
According to the position information of the scene object, the position information of the test camera and the position information of at least one reference point light source, respectively determining a first distance vector between the scene object and the test camera and a second distance vector between the scene object and the reference point light source;
respectively carrying out horizontal plane projection on the first distance vector and the second distance vector to determine a first projection vector and a second projection vector;
and determining an included angle between the connecting line of the scene object and the test camera and the connecting line of the scene object and the reference point light source according to the first projection vector and the second projection vector, and obtaining the reference included angle.
4. A method according to claim 3, wherein prior to said obtaining the reference parameter information, further comprising:
performing modular operation on the first projection vector to obtain the distance between the scene object and the test camera;
performing modular operation on the second projection vector to obtain the distance between the scene object and the reference point light source;
and taking the distance between the scene object and the test camera and the distance between the scene object and the reference point light source as the reference distance.
5. The method according to any one of claims 1 to 4, wherein determining the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance includes:
if the distance between the scene object and the virtual camera in the current frame is greater than the distance between the scene object and the test camera and is smaller than a preset distance threshold, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the distance between the scene object and the virtual camera in the current frame, the distance between the scene object and the test camera, the intensity of the reference point light source corresponding to the simulated point light source and the preset distance threshold.
6. The method of claim 5, wherein determining the intensity of each simulated point light source corresponding to the scene object of the current frame based on the distance between the scene object and the virtual camera of the current frame, the intensity of the reference point light source, and the reference distance comprises:
and if the distance between the scene object and the virtual camera in the current frame is smaller than or equal to the distance between the scene object and the test camera, determining the intensity of the simulated point light source corresponding to the scene object in the current frame according to the intensity of the reference point light source corresponding to the simulated point light source.
7. The method according to any one of claims 1-4, wherein determining the position of each simulated point light source corresponding to the scene object in the current frame according to the position information of the scene object in the current frame, the position information of the virtual camera, the reference distance, and the reference included angle includes:
determining a third distance vector between the scene object and the virtual camera in the current frame according to the position information of the scene object in the current frame and the position information of the virtual camera, performing horizontal plane projection on the third distance vector, and determining a third projection vector;
performing modular operation on the third projection vector to obtain the distance between the scene object and the virtual camera in the current frame;
and determining the position of the simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object and the virtual camera of the current frame, the third projection vector, the included angle between the line connecting the scene object and the test camera and the line connecting the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object of the current frame.
8. The method of claim 7, wherein determining the position of the simulated point light source corresponding to the scene object for the current frame based on the distance between the scene object and the virtual camera for the current frame, the third projection vector, an angle between a line connecting the scene object and the test camera and a line connecting the scene object and the reference point light source corresponding to the simulated point light source, the distance between the scene object and the reference point light source corresponding to the simulated point light source, and the position information of the scene object for the current frame, comprises:
determining a fourth projection vector of a distance vector between the simulated point light source and the scene object of the current frame on a horizontal plane according to the distance between the scene object of the current frame and the virtual camera, the third projection vector, and an included angle between a connecting line of the scene object and the test camera and a connecting line of the scene object and a datum point light source corresponding to the simulated point light source;
normalizing the fourth projection vector to obtain a moving direction vector of the simulated point light source in the scene object direction;
and determining the position of the simulated point light source corresponding to the scene object in the current frame according to the moving direction vector, the distance between the scene object and the reference point light source corresponding to the simulated point light source and the position information of the scene object in the current frame.
9. The method according to claim 1, wherein determining the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source comprises:
and carrying out light collision according to the positions of the simulated point light sources, and determining the colors of the simulated point light sources corresponding to the scene objects in the current frame according to collision intersection points after the light collision.
10. The method of claim 9, wherein the performing light collision according to the positions of the simulated point light sources, and determining the color of each simulated point light source corresponding to the scene object of the current frame according to the collision intersection point after the light collision, comprises:
controlling the simulated point light source to project light rays to a reference direction, and acquiring collision intersection points after the simulated point light source collides with the virtual object in the reference direction;
acquiring the color of the collision intersection point;
and determining the color of the collision intersection point as the color of the simulated point light source corresponding to the scene object of the current frame.
11. An illumination information processing apparatus, comprising: an acquisition module, a determining module, and a rendering module;
wherein the acquisition module is configured to acquire reference parameter information, the reference parameter information at least including: an intensity of a reference point light source, a reference distance, and a reference included angle;
the determining module is configured to determine the distance between the scene object of the current frame and the virtual camera according to the position information of the scene object of the current frame and the position information of the virtual camera, and to determine the intensity of each simulated point light source corresponding to the scene object of the current frame according to the distance between the scene object of the current frame and the virtual camera, the intensity of the reference point light source, and the reference distance;
the determining module is further configured to determine the position of each simulated point light source corresponding to the scene object of the current frame according to the position information of the scene object of the current frame, the position information of the virtual camera, the reference distance, and the reference included angle;
the determining module is further configured to determine the color of each simulated point light source corresponding to the scene object of the current frame according to the position of each simulated point light source;
and the rendering module is configured to perform indirect illumination rendering on the scene object of the current frame according to the intensity, the color, and the position of each simulated point light source.
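
The claim-11 apparatus maps onto three small components. In the sketch below, the intensity step scales the reference intensity by the ratio of the current object-to-camera distance to the reference distance, which is only one plausible reading, since the claim names the inputs but not the formula; `ReferenceParams`, `engine.add_point_light`, and all other names are illustrative assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ReferenceParams:           # output of the acquisition module
    ref_intensity: float         # intensity of the reference point light source
    ref_distance: float          # reference distance
    ref_angle: float             # reference included angle, in radians

def light_intensity(params: ReferenceParams, obj_pos, cam_pos) -> float:
    # Determining module: distance from the scene object to the virtual camera,
    # then an assumed linear rescaling of the reference intensity.
    d = float(np.linalg.norm(np.asarray(obj_pos) - np.asarray(cam_pos)))
    return params.ref_intensity * d / params.ref_distance

def render_indirect(engine, lights):
    # Rendering module: hand the intensity, color, and position of every
    # simulated point light to the engine's indirect-lighting pass.
    for intensity, color, position in lights:
        engine.add_point_light(position=position, color=color, intensity=intensity)
```
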
12. An electronic device, comprising: a processor, a storage medium, and a bus, the storage medium storing program instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is running, the processor executing the program instructions to perform the illumination information processing method according to any one of claims 1 to 10.
13. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when executed by a processor, performs the illumination information processing method according to any one of claims 1 to 10.

Priority Applications (1)

Application number: CN202311056697.XA
Priority date: 2023-08-21; filing date: 2023-08-21
Title: Illumination information processing device, electronic device, and storage medium

Publications (1)

Publication number: CN116993896A
Publication date: 2023-11-03

Family ID: 88526630

Family Applications (1)

Application number: CN202311056697.XA (pending), published as CN116993896A
Title: Illumination information processing device, electronic device, and storage medium

Country Status (1)

CN: CN116993896A

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination