CN112473135A - Real-time illumination simulation method, device, equipment and storage medium for mobile game - Google Patents

Real-time illumination simulation method, device, equipment and storage medium for mobile game

Info

Publication number
CN112473135A
CN112473135A (application CN202011227001.1A)
Authority
CN
China
Prior art keywords
model
scene
data
mobile terminal
light source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011227001.1A
Other languages
Chinese (zh)
Other versions
CN112473135B (en)
Inventor
李进
柴毅哲
代天麒
彭通
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202011227001.1A priority Critical patent/CN112473135B/en
Publication of CN112473135A publication Critical patent/CN112473135A/en
Application granted granted Critical
Publication of CN112473135B publication Critical patent/CN112473135B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a real-time illumination simulation method, device, equipment, and storage medium for a mobile game, relating to the technical field of game design. Throughout this document, the "mobile terminal" denotes a movable entity model in the game scene, i.e., an entity model whose model attribute is a dynamic model. The method comprises the following steps: determining a mobile terminal and loading a scene map of the target scene where the mobile terminal is located; generating light source simulation data according to a scene light source of the target scene at the current time, the model attributes of a plurality of entity models, and preset processing parameters; drawing a model shadow for the mobile terminal based on the direct light data and indirect light data of the scene light source; and performing illumination processing on the mobile terminal in the scene map using the direct light data, the model shadow, and the indirect light data according to the position attribute of the mobile terminal in the scene map.

Description

Real-time illumination simulation method, device, equipment and storage medium for mobile game
Technical Field
The present application relates to the field of game design technologies, and in particular, to a real-time illumination simulation method, device, apparatus, and storage medium for a mobile game.
Background
In recent years, game design technology has developed rapidly, and players' demands on game visuals keep increasing. To improve picture quality and strengthen the immersive feel of a game, simulating different lighting effects in a game scene has therefore become a key point of game design.
In the related art, when a game scene is subjected to illumination simulation, light source information such as the position, direction, color and the like of a light source arranged in the scene is generally determined, an illumination map and a shadow map of the scene are baked according to the light source information, and the illumination map and the shadow map are further processed, so that the overall illumination simulation of the scene is realized.
In carrying out the present application, the applicant has found that the related art has at least the following problems:
Some scenes in a game are outdoor scenes, whose light sources are real-time light sources that must change as time passes; in some mobile games, models in the scene can also move. Performing illumination simulation on such scenes in the above manner generates a large number of light maps and shadow maps. However, the running memory of the game is limited and cannot carry a large number of maps, so the precision of shadows in the scene is constrained by the running memory. As a result, the shadow precision of models in the mobile game is reduced, the image quality of the scene is poor, and the realism of the mobile game's illumination simulation is difficult to guarantee.
Disclosure of Invention
In view of this, the present application provides a real-time illumination simulation method, apparatus, device, and storage medium for a mobile game, mainly aiming to solve the problems that the shadow precision of models in current mobile games is reduced, the image quality of scenes is poor, and the realism of mobile-game illumination simulation is difficult to guarantee.
According to a first aspect of the present application, there is provided a real-time illumination simulation method for a mobile game, the method comprising:
determining a mobile terminal, and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as a unit, and the model attribute of the mobile terminal is a dynamic model;
generating light source simulation data according to a scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for performing illumination simulation on the entity models;
drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
and according to the corresponding position attribute of the mobile terminal in the scene map, adopting the direct light data, the model shadow and the indirect light data to carry out illumination processing on the mobile terminal in the scene map, wherein the position attribute is at least a long-range attribute or a short-range attribute.
In another embodiment, before determining the mobile terminal and loading the scene map of the target scene where the mobile terminal is located, the method further includes:
determining the scene light source at the current time, and setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates a shadow when generating the light source simulation data;
querying an entity type of each entity model in the entity models, extracting a first entity model of which the entity type is a preset entity type from the entity models, and setting a model attribute of the first entity model as a dynamic model;
and counting the length-width ratio of a second entity model in the plurality of entity models, and setting model attributes for the second entity model based on the length-width ratio, wherein the second entity model is the other entity model except the first entity model in the plurality of entity models.
In another embodiment, said setting model attributes for said second solid model based on said aspect ratio comprises:
acquiring a preset model proportion, and comparing the length-width proportion with the preset model proportion;
when the length-width ratio is larger than or equal to the preset model ratio, setting the model attribute of the second entity model as a static model;
and when the length-width ratio is smaller than the preset model ratio, setting the model attribute of the second entity model as the dynamic model.
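As a concrete illustration of the comparison above, the decision can be sketched in a few lines. The function name, the 1.8 reference value, and the reading that both length and width are checked against the preset model proportion (as in the detailed embodiment later in this document) are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical sketch of setting a model attribute from the length-width
# comparison; "static" models later contribute baked indirect light, while
# "dynamic" models only receive it. The 1.8 reference value is illustrative.
def set_model_attribute(length: float, width: float,
                        preset_ratio: float = 1.8) -> str:
    # Both dimensions must reach the preset model proportion to count
    # as a static model (e.g. rocks, buildings).
    if length >= preset_ratio and width >= preset_ratio:
        return "static"
    # Smaller models (grass, small stones) are dynamic.
    return "dynamic"
```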
In another embodiment, the generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the plurality of solid models, and the preset processing parameters includes:
baking the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, wherein the first indirect light baking data are used for performing illumination simulation on entity models of which the model attributes are static models in the entity models;
baking the scene light source, the first indirect light baking data and the entity models of which the model attributes are static models in the entity models by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for performing illumination simulation on the entity models of which the model attributes are dynamic models in the entity models;
using the first indirect light bake data and the second indirect light bake data as the light source simulation data.
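The two bake outputs described above can be pictured as one container: lightmap-form data for the static models and spherical-harmonic data for the dynamic models. This is an illustrative sketch only; the field names are invented, and the 9-coefficient probe follows the common second-order spherical-harmonic layout rather than anything stated in the patent:

```python
from dataclasses import dataclass

@dataclass
class LightSourceSimulationData:
    # First indirect-light bake data: per-static-model lightmaps (map form).
    lightmaps: dict
    # Second indirect-light bake data: spherical-harmonic probes sampled
    # by dynamic models (the mobile terminal among them) at runtime.
    sh_probes: list

def build_simulation_data(lightmaps: dict,
                          sh_probes: list) -> LightSourceSimulationData:
    # Package both bake results together as the light source simulation data.
    return LightSourceSimulationData(lightmaps, sh_probes)
```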
In another embodiment, the drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source includes:
extracting second indirect light baking data from the indirect light data;
and performing illumination simulation on the mobile terminal by adopting the direct light data and the second indirect light baking data, and drawing the model shadow of the mobile terminal.
In another embodiment, after the drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source, the method further includes:
counting the display occupied area of the mobile terminal in the scene map, and determining the actual display area of the mobile terminal in the target scene;
when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold, setting the position attribute of the mobile terminal as the distant-view attribute;
and when the ratio of the display occupied area to the actual display area is greater than a ratio threshold, setting the position attribute of the mobile terminal as the close-range attribute.
In another embodiment, the performing, by using the direct light data, the model shadow, and the indirect light data according to the corresponding position attribute of the mobile terminal in the scene map, illumination processing on the mobile terminal in the scene map includes:
determining the position attribute of the mobile terminal;
when the position attribute is a distant view attribute, performing illumination processing on the mobile terminal in the scene map by using the direct light data, a first-layer shadow in the model shadow and the indirect light data;
and when the position attribute is a close-range attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data and the second-layer shadow and the third-layer shadow in the model shadow.
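Put together, the attribute-dependent choice of lighting inputs might look like the following sketch; the dictionary layout is invented, while the layer indices (first layer for distant views, second and third for close range) follow the claim text:

```python
# Choose which lighting terms and which shadow layers to apply,
# given the mobile terminal's position attribute.
def select_lighting_inputs(position_attr: str, shadow_layers: list) -> dict:
    if position_attr == "distant":
        # Distant view: direct light + coarsest (first-layer) shadow
        # + indirect light.
        return {"direct": True, "indirect": True,
                "shadows": shadow_layers[:1]}
    # Close range: direct light + the finer second- and third-layer shadows.
    return {"direct": True, "indirect": False,
            "shadows": shadow_layers[1:3]}
```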
According to a second aspect of the present application, there is provided a real-time illumination simulation apparatus for a mobile game, the apparatus comprising:
the loading module is used for determining a mobile terminal and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as a unit, and the model attribute of the mobile terminal is a dynamic model;
the generating module is used for generating light source simulation data according to a scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for performing illumination simulation on the entity models;
the drawing module is used for drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
and the processing module is used for performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the corresponding position attribute of the mobile terminal in the scene map, wherein the position attribute is at least a long-range attribute or a short-range attribute.
In another embodiment, the apparatus further comprises:
the first setting module is used for determining the scene light source at the current time and setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates a shadow when generating the light source simulation data;
the query module is used for querying the entity type of each entity model in the entity models, extracting a first entity model of which the entity type is a preset entity type from the entity models, and setting the model attribute of the first entity model as a dynamic model;
and the statistical module is used for counting the length-width ratio of a second entity model in the entity models, and setting model attributes for the second entity model based on the length-width ratio, wherein the second entity model is other entity models except the first entity model in the entity models.
In another embodiment, the statistical module is configured to obtain a preset model ratio, and compare the length-width ratio with the preset model ratio; when the length-width ratio is larger than or equal to the preset model ratio, setting the model attribute of the second entity model as a static model; and when the length-width ratio is smaller than the preset model ratio, setting the model attribute of the second entity model as the dynamic model.
In another embodiment, the generating module is configured to bake the scene light source based on the preset processing parameter to obtain first indirect light baking data in a map form, where the first indirect light baking data is used to perform illumination simulation on an entity model with static model attributes in the entity models; baking the scene light source, the first indirect light baking data and the entity models of which the model attributes are static models in the entity models by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for performing illumination simulation on the entity models of which the model attributes are dynamic models in the entity models; using the first indirect light bake data and the second indirect light bake data as the light source simulation data.
In another embodiment, the rendering module is configured to extract second indirect light baking data from the indirect light data; and performing illumination simulation on the mobile terminal by adopting the direct light data and the second indirect light baking data, and drawing the model shadow of the mobile terminal.
In another embodiment, the apparatus further comprises:
the determining module is used for counting the display occupied area of the mobile terminal in the scene map and determining the actual display area of the mobile terminal in the target scene;
the second setting module is used for setting the position attribute of the mobile terminal as the long-range view attribute when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold;
the second setting module is further configured to set the position attribute of the mobile terminal as the close-range attribute when the ratio of the display occupied area to the actual display area is greater than the ratio threshold.
In another embodiment, the processing module is configured to determine a location attribute of the mobile terminal; when the position attribute is a distant view attribute, performing illumination processing on the mobile terminal in the scene map by using the direct light data, a first-layer shadow in the model shadow and the indirect light data; and when the position attribute is a close-range attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data and the second-layer shadow and the third-layer shadow in the model shadow.
According to a third aspect of the present application, there is provided an apparatus comprising a memory storing a computer program and a processor implementing the steps of the method of the first aspect when the processor executes the computer program.
According to a fourth aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of the first aspect described above.
By the above technical scheme, the present application provides a real-time illumination simulation method, device, equipment, and storage medium for a mobile game. Whether an entity model generates indirect light is determined according to its model attribute in the target scene of the mobile game. Light source simulation data, including indirect light data for performing illumination simulation on the plurality of entity models, is then generated by baking the scene light source. The shadow of the mobile terminal in the target scene is drawn using the light source simulation data together with the direct light data generated by the scene light source, so that illumination simulation is performed on the mobile terminal with parameters chosen according to its position attribute in the target scene. Because no large set of maps has to be drawn for illumination and shadows, and the corresponding parameters are selected directly according to the position of the mobile terminal, the shadow precision of models in the mobile game is no longer limited by the game's running memory, the image quality of the scene is improved, and the realism of the mobile game's illumination simulation is guaranteed.
The foregoing is only an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of the description, and in order to make the above and other objects, features, and advantages of the present application more readily apparent, the detailed description of the present application is given below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flow chart of a real-time illumination simulation method for a mobile game according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of a real-time illumination simulation method for a mobile game according to an embodiment of the present disclosure;
fig. 3A is a schematic structural diagram illustrating a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present application;
FIG. 3B is a schematic structural diagram illustrating a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present disclosure;
fig. 3C is a schematic structural diagram illustrating a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present application;
fig. 4 shows a schematic structural diagram of an apparatus provided in an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The embodiment of the application provides a real-time illumination simulation method for a mobile game, as shown in fig. 1, the method includes:
101. determining a mobile terminal, loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as a unit, and the model attribute of the mobile terminal is a dynamic model.
102. And generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data for performing illumination simulation on the entity models.
103. And drawing a model shadow for the mobile terminal based on the direct light data and indirect light data of the scene light source.
104. And according to the corresponding position attribute of the mobile terminal in the scene map, adopting direct light data, model shadow and indirect light data to carry out illumination processing on the mobile terminal in the scene map, wherein the position attribute is at least a distant view attribute or a close view attribute.
The method provided by the embodiment of the application determines whether an entity model generates indirect light according to its model attribute in the target scene of the mobile game, and then generates light source simulation data, including indirect light data for performing illumination simulation on the plurality of entity models, by baking the scene light source. The shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, and illumination simulation is performed on the mobile terminal with parameters chosen according to its position attribute in the target scene. No large set of maps needs to be drawn for illumination and shadows; the corresponding parameters are selected directly according to the position of the mobile terminal. Therefore the shadow precision of models in the mobile game is not limited by the game's running memory, the image quality of the scene is improved, and the realism of the mobile game's illumination simulation is guaranteed.
An embodiment of the present application provides a real-time illumination simulation method for a mobile game, as shown in fig. 2, the method includes:
201. and determining a mobile terminal, and loading a scene map of a target scene where the mobile terminal is located.
In recent years, the design of game scenes has become increasingly realistic. Illumination simulation is required in both outdoor and indoor game scenes, and its quality directly affects the realism of the scene. At present, when performing illumination simulation on a game scene, light source information such as the position, direction, and color of the light sources set in the scene is usually determined; a light map and a shadow map of the scene are baked according to this information; and the maps are processed to realize global illumination of the scene. However, the applicant recognizes that illumination in many game scenes is real-time (for example, outdoor natural light changes continuously over time), and many models in the scene can move. Real-time illumination and moving models demand finer illumination simulation, which in turn demands more running memory, and many games cannot afford the increased memory. The effect of scene illumination simulation in many games is therefore limited by memory: the shadow precision of moving models in the scene is reduced, the image quality of the scene is poor, and the realism of the illumination simulation of moving models is difficult to guarantee.
Therefore, the present application provides a real-time illumination simulation method for a moving model. Whether an entity model generates indirect light is determined according to its model attribute in the target scene; light source simulation data, including indirect light data for performing illumination simulation on the plurality of entity models, is generated by baking the scene light source; and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source. Illumination simulation is then performed on the mobile terminal with parameters chosen according to its position attribute in the target scene. Since no large set of maps has to be drawn for illumination and shadows, and the corresponding parameters are selected directly according to the position of the mobile terminal, the shadow precision of the mobile terminal is not limited by the game's running memory, the image quality of the scene is improved, and the realism of the illumination simulation of the moving model is guaranteed.
To implement the technical solution described in the present application, a scene map of the target scene needs to be prepared in advance, so that it can be loaded directly and the corresponding baking processing performed on its basis. The scene map is generated by taking the plurality of entity models included in the target scene as a unit; it may be a map generated offline, and it is a full scene map of the target scene. Specifically, since game development tools such as UE4 (Unreal Engine 4) already exist, the scene map may be obtained by baking global illumination into the low-quality (LQ) lightmap generated by the game development tool, and within the tool the global illumination in the target scene can be made finer through the global illumination options. It should be noted that, in practice, the technical solutions described in the embodiments of the present application can be implemented by setting a series of parameters in a game development tool; UE4 is used as the example tool in the present application. In addition, the present application takes a mobile terminal as the example for explanation: the model attribute of the mobile terminal is a dynamic model, that is, the mobile terminal is capable of moving, and any entity model in the target scene whose model attribute is a dynamic model can serve as the mobile terminal for illumination simulation using the scheme in the embodiments of the present application.
202. Setting preset processing parameters for a scene light source in a target scene and setting model attributes for a plurality of solid models in a scene map.
In the embodiment of the present application, before simulated illumination is performed on the target scene, the light sources, models, and other parameters in the target scene need to be set in the game development tool. When setting preset processing parameters for a scene light source in the target scene: for a real-time light source, that is, a Directional Light in the game development tool, its mobility needs to be set to Stationary during baking and Movable at runtime. The game development tool also supports Cascaded Shadow Maps, which may be set to a distance of 500 with 2 shadow cascades. Further, for a natural light source in the target scene, that is, the Sky Light in the game development tool, the mobility needs to be set to Stationary both during baking and at runtime; the Sky Light generates sky occlusion (ambient light shielding); and the Source Type is set to SLS Specified Cubemap, that is, a cubemap baked with a SceneCaptureCube after the light baking is completed. In addition, the intensity and color of the natural light source can be modified in the game development tool for use over a 24-hour cycle. Further, a point light source in the target scene, that is, a Point/Spot Light in the game development tool, needs to be set to Movable with shadows turned off. Some scenes may also contain a Rect Light (rectangular area light), which may be set to Static in the game development tool. Through the above process, the scene light source is determined and preset processing parameters are set for it; the scene light source comprises a real-time light source, a natural light source, and a point light source, and the preset processing parameters indicate whether the scene light source moves and whether it generates shadows when the light source simulation data is generated.
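The per-light settings just listed can be collected in one place for reference. The key names below are informal summaries of the text, not UE4 property names:

```python
# Informal summary of the light-source presets described in this embodiment.
LIGHT_SETTINGS = {
    "DirectionalLight": {          # real-time light source
        "bake_mobility": "Stationary",
        "runtime_mobility": "Movable",
        "cascaded_shadow_maps": {"distance": 500, "cascades": 2},
    },
    "SkyLight": {                  # natural light source
        "bake_mobility": "Stationary",
        "runtime_mobility": "Stationary",
        "sky_occlusion": True,
        "source_type": "SLS Specified Cubemap",  # baked via SceneCaptureCube
    },
    "PointSpotLight": {            # point light source
        "runtime_mobility": "Movable",
        "cast_shadows": False,
    },
    "RectLight": {"mobility": "Static"},
}
```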
Moreover, after the scene light source parameters are set, the degree of light overlap can be checked in the game development tool, avoiding excessively deep colors appearing in the target scene.
Next, considering that some larger entity models not only receive illumination from the light sources, as in a real scene, but also generate indirect light that is diffusely reflected onto other models, model attributes need to be set for the plurality of entity models in the scene map to determine which entity models generate indirect light and which only receive light without generating it, so that the shadows and illumination of the entity models can be processed subsequently. The principle followed when setting model attributes is to mark the larger models in a scene, such as mountain rocks, buildings, and landmark buildings, as static models. Specifically, the human-body proportion can be used as a reference: a model whose length and width are both greater than or equal to the human-body proportion is set as a static model, and a model whose length and width are both smaller than the human-body proportion is set as a dynamic model, such as grass or small stones. However, among the entity models exceeding the human-body proportion there are tree models, which vary in size and whose light-blocking capability is far inferior to that of rocks and buildings; the indirect light they generate is practically negligible, so tree models need to be treated as dynamic models. In order to treat tree entity models differently, the tree is set in advance as a preset entity type. Thus, when setting model attributes for the plurality of entity models in the scene map, the entity type of each entity model is first queried, the first entity models whose entity type is the preset entity type, that is, trees, are extracted, and the model attribute of each first entity model is set directly to a dynamic model.
Then, the length-width ratio of each second entity model among the plurality of entity models is counted, and a model attribute is set for the second entity model based on that ratio, where the second entity models are the entity models other than the first entity models.
When setting the model attribute for a second entity model, a preset model proportion, that is, a standard human-body proportion, is first obtained; this preset proportion may vary with the size of the scene. The length-width ratio is then compared with the preset model proportion: when the length-width ratio is greater than or equal to the preset model proportion, the second entity model is judged to be a larger model such as a mountain rock or building, and its model attribute is set as a static model. Conversely, when the length-width ratio is smaller than the preset model proportion, the second entity model is judged to be a small model such as a pebble or grass, and its model attribute is set as a dynamic model.
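The classification rule described above can be sketched as follows. This is a minimal illustration, not part of the disclosure: the function name, the `PRESET_ENTITY_TYPES` set, and the human-proportion reference value are all assumptions made here for clarity.

```python
# Sketch of the model-attribute assignment rule: preset entity types
# (e.g. trees) are always dynamic; other models are classified by
# comparing their dimensions with a human-body reference proportion.

PRESET_ENTITY_TYPES = {"tree"}  # illustrative; the patent presets "tree"

def set_model_attribute(entity_type, length, width, preset_ratio):
    """Return 'dynamic' or 'static' for a solid model.

    preset_ratio is the human-body reference proportion; per the text,
    it may vary with the size of the scene.
    """
    # First entity models: preset types are forced to dynamic, because
    # the indirect light they generate is practically negligible.
    if entity_type in PRESET_ENTITY_TYPES:
        return "dynamic"
    # Second entity models: length and width both at or above the
    # reference proportion -> static (rocks, buildings, landmarks).
    if length >= preset_ratio and width >= preset_ratio:
        return "static"
    return "dynamic"  # grass, small stones, and other small models

print(set_model_attribute("tree", 10.0, 8.0, 1.8))   # dynamic
print(set_model_attribute("rock", 10.0, 8.0, 1.8))   # static
print(set_model_attribute("grass", 0.3, 0.2, 1.8))   # dynamic
```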
It should be noted that, in practice, the materials in the target scene also need to be set, so developers can configure the corresponding materials in the game development tool according to the requirements of the game. In addition, when setting a material, the AO (Ambient Occlusion) of that material in the scene map should be strengthened individually, rather than applied globally across the target scene.
In practice, some game development tools may not support parameter settings for a diversified environment. In the embodiment of the present application, therefore, a configuration file for environment settings is provided: it is written through an editor, imported into the directory of the game development tool, and provides additional functions within the tool. This ensures that skylight produces no direct light during illumination simulation, and allows more environment parameters to be set on the basis of the additional functions.
203. And generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters.
In this embodiment of the application, after receiving illumination, some solid models in the target scene may produce indirect illumination for other solid models, such as reflected light. This indirect light is static, but real-time adjustment of its color and brightness still needs to be simulated through materials. Light source simulation data is therefore generated according to the scene light source of the target scene at the current time, the model attributes of the plurality of solid models, and the preset processing parameters. The light source simulation data includes indirect light data used for performing illumination simulation on the plurality of solid models; this indirect light data is then used to perform illumination simulation on the solid models in the target scene.
Specifically, the scene light source is first baked based on the preset processing parameters to obtain first indirect light baking data in the form of a map (lightmap); this data is used to perform illumination simulation on the solid models whose model attribute is static, and the bake of the scene light source with the preset processing parameters is carried out by the game development tool. Then, the scene light source, the first indirect light baking data, and the static solid models are baked together with the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data; this data is used to perform illumination simulation on the solid models whose model attribute is dynamic, and can likewise be baked by the game development tool. Finally, the first indirect light baking data and the second indirect light baking data together constitute the light source simulation data.
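The two-stage bake above can be sketched as follows. The data structures are illustrative stand-ins for what a game development tool actually produces (a lightmap texture and a set of spherical-harmonic probes); the function and field names are assumptions, not any tool's API.

```python
# Sketch of the two-stage bake: stage 1 produces lightmap-form data for
# static models; stage 2 consumes stage 1's result and produces
# spherical-harmonic-form data for dynamic models.

def bake_first_indirect(scene_light, params, models):
    """Stage 1: bake the scene light source into map-form data
    (first indirect light baking data) targeting static models."""
    static_names = [m["name"] for m in models if m["attribute"] == "static"]
    return {"form": "lightmap", "source": scene_light, "targets": static_names}

def bake_second_indirect(scene_light, first_bake, params, models):
    """Stage 2: bake the light source, the stage-1 result, and the static
    models into spherical-harmonic data targeting dynamic models."""
    dynamic_names = [m["name"] for m in models if m["attribute"] == "dynamic"]
    return {"form": "spherical_harmonics", "source": scene_light,
            "inputs": [first_bake["form"]], "targets": dynamic_names}

models = [{"name": "mountain", "attribute": "static"},
          {"name": "player", "attribute": "dynamic"}]
params = {"movable": False, "casts_shadow": True}  # preset processing parameters
first = bake_first_indirect("sun", params, models)
second = bake_second_indirect("sun", first, params, models)
light_source_simulation_data = (first, second)  # both bakes together
```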
It should be noted that the game level scale is proportional to the unit of the game development tool (1 unit = 1 cm by default), and the scale determines how detailed the lighting calculation is; a smaller scale greatly increases build time. For a giant level, the unit of the game development tool can therefore be set to 2 or 4 to reduce the lighting build time: the larger the number, the faster the bake. In addition, the number of GI (Global Illumination) bounces, counted from the scene light source, needs to be set for the baking process: 0 means direct illumination only, and 1 means the first bounce, which takes the longest to compute, followed by the second bounce; subsequent bounces add little computation time. Furthermore, when the interpolation proportion of photons used during Irradiance Caching is set in the game development tool, it can be set to 66 or 75; this hides more noise while avoiding the loss of detail in indirect shadows and ambient occlusion, yielding a smooth indirect illumination effect. These parameters are only examples of the game development tool's settings for illumination baking; in practice the tool also allows enabling static ambient occlusion, setting the full-occlusion sample proportion, setting a maximum distance, and so on, which developers can configure according to the game's development requirements and which are not repeated in this application.
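The bake parameters discussed above might be collected into a configuration like the following. The key names and the last two values are illustrative assumptions, not any particular tool's API; the unit, bounce, and interpolation values come from the text.

```python
# Illustrative bake configuration gathering the parameters discussed above.
bake_settings = {
    "unit_scale": 2,             # 1 unit = 1 cm by default; 2 or 4 for giant levels (larger = faster bake)
    "gi_bounces": 2,             # 0 = direct only; bounce 1 costs the most, later bounces are cheap
    "photon_interpolation": 66,  # 66 or 75 hides noise without losing indirect-shadow detail
    "static_ambient_occlusion": True,     # enabling static AO, as mentioned in the text
    "full_occlusion_sample_ratio": 0.8,   # illustrative value
    "max_distance": 100.0,                # illustrative value
}
```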
204. And drawing a model shadow for the mobile terminal based on the direct light data and indirect light data of the scene light source.
In the embodiment of the application, after the light source simulation data is generated, a model shadow is drawn for the mobile terminal based on the direct light data and the indirect light data of the scene light source. Since the model attribute of the model to be processed is a dynamic model, the second indirect light baking data is extracted from the indirect light data, illumination simulation is performed on the mobile terminal using the direct light data and the second indirect light baking data, and the model shadow of the mobile terminal is drawn. In fact, this process can be used to draw model shadows for all solid models in the target scene whose model attribute is dynamic. The target scene also contains solid models whose model attribute is static; for these, illumination simulation is performed using the direct light data and the first indirect light baking data in the indirect light data, and their model shadows are drawn.
In practice, whether the model is dynamic or static, the generated model shadow can be a layered shadow; specifically, developers can set the colors and shadow distances of the different layers in the game development tool. The model shadow comprises three layers, which are used respectively in the illumination simulation of solid models with different position attributes.
205. And determining the position attribute of the mobile terminal.
In the embodiment of the present application, because both distant and nearby solid models exist in the scene map, and they differ in how their model shadows are represented and in the degree of illumination simulation, the position attribute of the mobile terminal is determined first, and illumination simulation is then performed on the mobile terminal according to that attribute. The position attribute is divided into a distant view attribute and a close-range attribute.
When the position attribute is determined, the display occupied area of the mobile terminal in the scene map is counted, and the actual display area of the mobile terminal in the target scene is determined. When the ratio of the display occupied area to the actual display area is less than or equal to a ratio threshold, the mobile terminal is actually large but appears small in the scene map, so it can be judged to be far away, and its position attribute is set as the distant view attribute. Conversely, when the ratio is greater than the threshold, the actual size of the mobile terminal differs little from its size in the map, so it can be judged to be close, and its position attribute is set as the close-range attribute. The same process can be used to set the position attribute of each solid model in the target scene.
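The position-attribute test above amounts to one ratio comparison; a minimal sketch follows. The function name, return labels, and threshold value are illustrative assumptions.

```python
# Sketch of the position-attribute decision: the ratio of the on-screen
# (scene-map) footprint to the actual display area is compared against a
# ratio threshold.

def position_attribute(display_area, actual_area, ratio_threshold=0.5):
    """Return 'distant_view' or 'close_range' for a model.

    display_area: area the model occupies in the scene map.
    actual_area:  the model's actual display area in the target scene.
    """
    # A large object rendered small on screen is far away.
    if display_area / actual_area <= ratio_threshold:
        return "distant_view"
    return "close_range"

print(position_attribute(2.0, 100.0))  # distant_view
print(position_attribute(80.0, 100.0))  # close_range
```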
206. And according to the corresponding position attribute of the mobile terminal in the scene map, adopting direct light data, model shadow and indirect light data to carry out illumination processing on the mobile terminal in the scene map.
In the embodiment of the present application, after the position attribute of the mobile terminal is determined, the mobile terminal in the scene map may be subjected to illumination processing by using direct light data, model shadow, and indirect light data according to the corresponding position attribute of the mobile terminal in the scene map.
Mobile terminals with different position attributes require model shadows of different layers for illumination simulation. The specific process is as follows: the position attribute of the mobile terminal is determined. When the position attribute is the distant view attribute, the mobile terminal in the scene map is illuminated using the direct light data, the first layer of the model shadow, and the indirect light data, that is, these three are combined for the illumination processing. When the position attribute is the close-range attribute, the mobile terminal in the scene map is illuminated using the direct light data together with the second and third layers of the model shadow.
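The layer-selection rule above can be sketched as a small dispatch function; names and the tuple representation are illustrative assumptions, with the three shadow layers indexed 0 to 2.

```python
# Sketch of choosing the lighting inputs per position attribute:
# distant view -> direct light + shadow layer 1 + indirect light;
# close range  -> direct light + shadow layers 2 and 3.

def lighting_inputs(position_attr, direct_light, shadow_layers, indirect_light):
    """Return the inputs combined for illumination processing.

    shadow_layers is the three-layer model shadow, ordered first to third.
    """
    if position_attr == "distant_view":
        return (direct_light, shadow_layers[0], indirect_light)
    # close range: indirect light is not combined; two shadow layers are.
    return (direct_light, shadow_layers[1], shadow_layers[2])

layers = ("layer1", "layer2", "layer3")
print(lighting_inputs("distant_view", "direct", layers, "indirect"))
print(lighting_inputs("close_range", "direct", layers, "indirect"))
```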
The method provided by the embodiment of the present application determines, from the model attribute of each solid model in the target scene of a mobile game, whether that model generates indirect light; generates, by baking the scene light source, light source simulation data including indirect light data for performing illumination simulation on the plurality of solid models; draws the shadow of the mobile terminal in the target scene using the light source simulation data and the direct light data produced by the scene light source; and performs illumination simulation on the mobile terminal with the parameters corresponding to its position attribute in the target scene. Because a large number of images no longer need to be drawn for illumination and shadow, and the appropriate parameters are instead selected directly according to the position of the mobile terminal, the shadow precision of models in the mobile game is not limited by the game's runtime memory, the image quality of the scene is improved, and the realism of the mobile game's illumination simulation is guaranteed.
Further, as a specific implementation of the method shown in fig. 1, an embodiment of the present application provides a real-time illumination simulation apparatus for a mobile game, as shown in fig. 3A, the apparatus includes: a loading module 301, a generating module 302, a drawing module 303 and a processing module 304.
The loading module 301 is configured to determine a mobile terminal, and load a scene map of a target scene where the mobile terminal is located, where the scene map is generated by using a plurality of entity models included in the target scene as a unit, and a model attribute of the mobile terminal is a dynamic model;
the generating module 302 is configured to generate light source simulation data according to a scene light source of the target scene at a current time, the model attributes of the entity models, and preset processing parameters, where the light source simulation data includes indirect light data used for performing illumination simulation on the entity models;
the drawing module 303 is configured to draw a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
the processing module 304 is configured to perform illumination processing on the mobile terminal in the scene map by using the direct light data, the model shadow, and the indirect light data according to a position attribute corresponding to the mobile terminal in the scene map, where the position attribute is at least a long-range attribute or a short-range attribute.
In a specific application scenario, as shown in fig. 3B, the apparatus further includes: a first setup module 305, a query module 306 and a statistics module 307.
The first setting module 305 is configured to determine the scene light source at the current time, and set the preset processing parameter for the scene light source, where the scene light source includes a real-time light source, a natural light source, and a point light source, and the preset processing parameter indicates whether the scene light source moves and generates a shadow when generating the light source simulation data;
the query module 306 is configured to query an entity type of each entity model in the plurality of entity models, extract a first entity model of which the entity type is a preset entity type from the plurality of entity models, and set a model attribute of the first entity model as a dynamic model;
the statistical module 307 is configured to count a length-width ratio of a second entity model of the plurality of entity models, and set a model attribute for the second entity model based on the length-width ratio, where the second entity model is another entity model of the plurality of entity models except the first entity model.
In a specific application scenario, the statistical module 307 is configured to obtain a preset model ratio, and compare the length-width ratio with the preset model ratio; when the length-width ratio is larger than or equal to the preset model ratio, setting the model attribute of the second entity model as a static model; and when the length-width ratio is smaller than the preset model ratio, setting the model attribute of the second entity model as the dynamic model.
In a specific application scenario, the generating module 302 is configured to bake the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, where the first indirect light baking data is used to perform illumination simulation on the entity models whose model attributes are static models; bake the scene light source, the first indirect light baking data, and the entity models whose model attributes are static models with the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, where the second indirect light baking data is used to perform illumination simulation on the entity models whose model attributes are dynamic models; and use the first indirect light baking data and the second indirect light baking data as the light source simulation data.
In a specific application scenario, the rendering module 303 is configured to extract second indirect light baking data from the indirect light data; and performing illumination simulation on the mobile terminal by adopting the direct light data and the second indirect light baking data, and drawing the model shadow of the mobile terminal.
In a specific application scenario, as shown in fig. 3C, the apparatus further includes: a determination module 308 and a second setting module 309.
The determining module 308 is configured to count a display occupied area of the mobile terminal in the scene map, and determine an actual display area of the mobile terminal in the target scene;
the second setting module 309 is configured to set the position attribute of the mobile terminal as the distant view attribute when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold;
the second setting module 309 is further configured to set the position attribute of the mobile terminal as the close-range attribute when the ratio of the display occupied area to the actual display area is greater than the ratio threshold.
In a specific application scenario, the processing module 304 is configured to determine a location attribute of the mobile terminal; when the position attribute is a distant view attribute, performing illumination processing on the mobile terminal in the scene map by using the direct light data, a first-layer shadow in the model shadow and the indirect light data; and when the position attribute is a close-range attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data and the second-layer shadow and the third-layer shadow in the model shadow.
The device provided by the embodiment of the present application determines, from the model attribute of each solid model in the target scene of a mobile game, whether that model generates indirect light; generates, by baking the scene light source, light source simulation data including indirect light data for performing illumination simulation on the plurality of solid models; draws the shadow of the mobile terminal in the target scene using the light source simulation data and the direct light data produced by the scene light source; and performs illumination simulation on the mobile terminal with the parameters corresponding to its position attribute in the target scene. Because a large number of images no longer need to be drawn for illumination and shadow, and the appropriate parameters are instead selected directly according to the position of the mobile terminal, the shadow precision of models in the mobile game is not limited by the game's runtime memory, the image quality of the scene is improved, and the realism of the mobile game's illumination simulation is guaranteed.
It should be noted that other corresponding descriptions of the functional units related to the real-time illumination simulation apparatus for a mobile game provided in the embodiment of the present application may refer to the corresponding descriptions in fig. 1 and fig. 2, and are not described herein again.
In an exemplary embodiment, referring to fig. 4, there is further provided a device 400 that includes a communication bus, a processor, a memory, and a communication interface, and may further include an input/output interface and a display device, where these functional units communicate with each other through the bus. The memory stores a computer program, and the processor executes the program stored in the memory to perform the real-time illumination simulation method for a mobile game of the above embodiments.
A computer-readable storage medium is also provided, on which a computer program is stored; when executed by a processor, the program implements the steps of the real-time illumination simulation method for a mobile game described above.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by hardware, or by software plus a necessary general hardware platform. Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and includes several instructions for enabling a computer device (such as a personal computer, a server, or a network device) to execute the method according to the implementation scenarios of the present application.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application.
Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios.
The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (10)

1. A real-time illumination simulation method for a mobile game, comprising:
determining a mobile terminal, and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as a unit, and the model attribute of the mobile terminal is a dynamic model;
generating light source simulation data according to a scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for performing illumination simulation on the entity models;
drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
and according to the corresponding position attribute of the mobile terminal in the scene map, adopting the direct light data, the model shadow and the indirect light data to carry out illumination processing on the mobile terminal in the scene map, wherein the position attribute is at least a long-range attribute or a short-range attribute.
2. The method according to claim 1, wherein before determining the mobile terminal and loading the scene map of the target scene where the mobile terminal is located, the method further comprises:
determining the scene light source at the current time, and setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates a shadow when generating the light source simulation data;
querying an entity type of each entity model in the entity models, extracting a first entity model of which the entity type is a preset entity type from the entity models, and setting a model attribute of the first entity model as a dynamic model;
and counting the length-width ratio of a second entity model in the plurality of entity models, and setting model attributes for the second entity model based on the length-width ratio, wherein the second entity model is the other entity models except the first entity model in the plurality of entity models.
3. The method of claim 2, wherein setting model attributes for the second solid model based on the aspect ratio comprises:
acquiring a preset model proportion, and comparing the length-width proportion with the preset model proportion;
when the length-width ratio is larger than or equal to the preset model ratio, setting the model attribute of the second entity model as a static model;
and when the length-width ratio is smaller than the preset model ratio, setting the model attribute of the second entity model as the dynamic model.
4. The method of claim 1, wherein generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the plurality of solid models, and preset processing parameters comprises:
baking the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, wherein the first indirect light baking data are used for performing illumination simulation on entity models of which the model attributes are static models in the entity models;
baking the scene light source, the first indirect light baking data and the entity models of which the model attributes are static models in the entity models by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for performing illumination simulation on the entity models of which the model attributes are dynamic models in the entity models;
using the first indirect light bake data and the second indirect light bake data as the light source simulation data.
5. The method of claim 1, wherein the drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source comprises:
extracting second indirect light baking data from the indirect light data;
and performing illumination simulation on the mobile terminal by adopting the direct light data and the second indirect light baking data, and drawing the model shadow of the mobile terminal.
6. The method of claim 1, wherein after the drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source, the method further comprises:
counting the display occupied area of the mobile terminal in the scene map, and determining the actual display area of the mobile terminal in the target scene;
when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold, setting the position attribute of the mobile terminal as the distant view attribute;
and when the ratio of the display occupied area to the actual display area is greater than a ratio threshold, setting the position attribute of the mobile terminal as the close-range attribute.
7. The method according to claim 1, wherein the performing illumination processing on the mobile terminal in the scene map by using the direct light data, the model shadow, and the indirect light data according to the corresponding position attribute of the mobile terminal in the scene map comprises:
determining the position attribute of the mobile terminal;
when the position attribute is a distant view attribute, performing illumination processing on the mobile terminal in the scene map by using the direct light data, a first-layer shadow in the model shadow and the indirect light data;
and when the position attribute is a close-range attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data and the second-layer shadow and the third-layer shadow in the model shadow.
8. A real-time lighting simulation apparatus for a mobile game, comprising:
the loading module is used for determining a mobile terminal and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as a unit, and the model attribute of the mobile terminal is a dynamic model;
the generating module is used for generating light source simulation data according to a scene light source of the target scene at the current time, the model attributes of the entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for performing illumination simulation on the entity models;
the drawing module is used for drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
and the processing module is used for performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the corresponding position attribute of the mobile terminal in the scene map, wherein the position attribute is at least a long-range attribute or a short-range attribute.
9. An apparatus comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A storage medium having a computer program stored thereon, the computer program, when being executed by a processor, realizing the steps of the method of any one of claims 1 to 7.
CN202011227001.1A 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium Active CN112473135B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011227001.1A CN112473135B (en) 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium


Publications (2)

Publication Number Publication Date
CN112473135A true CN112473135A (en) 2021-03-12
CN112473135B CN112473135B (en) 2024-05-10

Family

ID=74928686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011227001.1A Active CN112473135B (en) 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium

Country Status (1)

Country Link
CN (1) CN112473135B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113786616A (en) * 2021-09-30 2021-12-14 天津亚克互动科技有限公司 Indirect illumination implementation method and device, storage medium and computing equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722901A (en) * 2011-03-29 2012-10-10 腾讯科技(深圳)有限公司 Method and apparatus for processing images
CN104008563A (en) * 2014-06-07 2014-08-27 长春理工大学 Method for achieving global illumination drawing of animation three-dimensional scene with virtual point light sources
CN104268923A (en) * 2014-09-04 2015-01-07 无锡梵天信息技术股份有限公司 Illumination method based on picture level images
WO2018028669A1 (en) * 2016-08-12 2018-02-15 腾讯科技(深圳)有限公司 Illumination processing method in 3d scenario, terminal, server, and storage medium
CN109934904A (en) * 2019-03-15 2019-06-25 网易(杭州)网络有限公司 Static light is according to baking processing method, device, equipment and readable storage medium storing program for executing
CN110152291A (en) * 2018-12-13 2019-08-23 腾讯科技(深圳)有限公司 Rendering method, device, terminal and the storage medium of game picture
CN111127624A (en) * 2019-12-27 2020-05-08 珠海金山网络游戏科技有限公司 Illumination rendering method and device based on AR scene
CN111161393A (en) * 2019-12-31 2020-05-15 威创集团股份有限公司 Real-time light effect dynamic display method and system based on 3D map


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JULY32: "Spherical Harmonic Lighting (Theory)", Retrieved from the Internet <URL:https://huailiang.github.io/blog/2019/harmonics/> *
Wu Weihe: "Global Illumination Simulation Based on Direct Illumination", Computer Engineering *
Ziyin: "Spherical Harmonic Lighting", Retrieved from the Internet <URL:https://blog.csdn.net/yinfourever/article/details/90205890> *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113786616A (en) * 2021-09-30 2021-12-14 天津亚克互动科技有限公司 Indirect illumination implementation method and device, storage medium and computing equipment
CN113786616B (en) * 2021-09-30 2024-04-12 天津亚克互动科技有限公司 Indirect illumination implementation method and device, storage medium and computing equipment

Also Published As

Publication number Publication date
CN112473135B (en) 2024-05-10

Similar Documents

Publication Publication Date Title
CN112037311B (en) Animation generation method, animation playing method and related devices
CN111275797B (en) Animation display method, device, equipment and storage medium
CN111768473B (en) Image rendering method, device and equipment
CN113559504A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN112843704B (en) Animation model processing method, device, equipment and storage medium
CN114119818A (en) Rendering method, device and equipment of scene model
KR102603609B1 (en) Method, device, terminal, and storage medium for selecting virtual objects
US9483873B2 (en) Easy selection threshold
CN111583378B (en) Virtual asset processing method and device, electronic equipment and storage medium
CN112190937A (en) Illumination processing method, device, equipment and storage medium in game
CN112473135A (en) Real-time illumination simulation method, device, equipment and storage medium for mobile game
Thorn Learn unity for 2d game development
CN111681317B (en) Data processing method and device, electronic equipment and storage medium
CN114359458A (en) Image rendering method, device, equipment, storage medium and program product
WO2024082897A1 (en) Illumination control method and apparatus, and computer device and storage medium
CN116452704A (en) Method and device for generating lens halation special effect, storage medium and electronic device
CN115761105A (en) Illumination rendering method and device, electronic equipment and storage medium
CN114596403B (en) Image processing method, device, storage medium and terminal
CN113313798B (en) Cloud picture manufacturing method and device, storage medium and computer equipment
CN111462343B (en) Data processing method and device, electronic equipment and storage medium
CN116485981A (en) Three-dimensional model mapping method, device, equipment and storage medium
Lee et al. Unreal Engine: Game Development from A to Z
CN114821010A (en) Virtual scene processing method and device, storage medium and electronic equipment
CN114832375A (en) Ambient light shielding processing method, device and equipment
CN114245907A (en) Auto-exposure ray tracing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant