CN112473135B - Real-time illumination simulation method, device and equipment for mobile game and storage medium - Google Patents


Publication number: CN112473135B (granted publication of application CN202011227001.1A)
Authority: CN (China)
Prior art keywords: model, data, scene, mobile terminal, light source
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202011227001.1A
Other languages: Chinese (zh)
Other versions: CN112473135A
Inventors: 李进, 柴毅哲, 代天麒, 彭通
Current Assignee (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list): Perfect World Beijing Software Technology Development Co Ltd
Original Assignee: Perfect World Beijing Software Technology Development Co Ltd
Application filed by Perfect World Beijing Software Technology Development Co Ltd; priority to CN202011227001.1A; publication of CN112473135A; application granted; publication of CN112473135B


Classifications

    • A63F 13/52 (Human necessities; sports, games, amusements; video games) — Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/60 — Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or a game-integrated level editor
    • G06T 15/50 (Physics; computing; image data processing or generation) — 3D [three-dimensional] image rendering: lighting effects


Abstract

The application discloses a real-time illumination simulation method, apparatus, device and storage medium for a mobile game, in the technical field of game design. Light source simulation data is generated once, and the parameters used for illumination simulation of a moving model (the "mobile terminal") are selected from that data according to the model's position attribute in the target scene, preserving realism without drawing a large number of maps. The method comprises the following steps: determining the mobile terminal and loading a scene map of the target scene where it is located; generating light source simulation data from the scene light source of the target scene at the current time, the model attributes of a plurality of entity models, and preset processing parameters; drawing model shadows for the mobile terminal based on the direct light data and indirect light data of the scene light source; and performing illumination processing on the mobile terminal in the scene map, using the direct light data, the model shadows and the indirect light data, according to the mobile terminal's position attribute in the scene map.

Description

Real-time illumination simulation method, device and equipment for mobile game and storage medium
Technical Field
The present application relates to the field of game design technologies, and in particular, to a method, an apparatus, a device, and a storage medium for real-time illumination simulation of a mobile game.
Background
In recent years, game design technology has developed rapidly and players' expectations for game visuals have steadily risen, so simulating different lighting effects in a game scene has become a key point of game design for improving picture quality and enhancing the player's sense of immersion.
In the related art, when illumination simulation is performed in a game scene, light source information such as the position, direction and color of each light source placed in the scene is generally determined, and an illumination map and a shadow map of the scene are baked from that information; the illumination map and shadow map are then processed to realize overall illumination simulation of the scene.
In carrying out the present application, the applicant has found that the related art has at least the following problems:
Some scenes in a game are outdoor scenes, whose light source is a real-time light source that changes over time; in addition, in many mobile games the models in the scene can move. Performing illumination simulation on such scenes in the above manner generates a large number of illumination maps and shadow maps, but the running memory of the game is limited and cannot carry so many maps. The precision of shadows in the scene is therefore bounded by the running memory, so the shadow precision of models in the mobile game is reduced, the image quality of the scene is poor, and the realism of illumination simulation in the mobile game is difficult to guarantee.
Disclosure of Invention
In view of the above, the present application provides a real-time illumination simulation method, apparatus, device and storage medium for a mobile game, mainly aiming to solve the problems that the shadow precision of models in current mobile games is reduced, the image quality of the scene is poor, and the realism of mobile-game illumination simulation is difficult to guarantee.
According to a first aspect of the present application, there is provided a real-time illumination simulation method for a mobile game, the method comprising:
Determining a mobile terminal, and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as units, and the model attribute of the mobile terminal is a dynamic model;
Generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the plurality of entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for performing illumination simulation on the plurality of entity models;
Drawing model shadows for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
And performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the position attribute of the mobile terminal in the scene map, wherein the position attribute is at least a distant view attribute or a close view attribute.
In another embodiment, before the determining the mobile terminal and loading the scene map of the target scene where the mobile terminal is located, the method further includes:
Determining the scene light source at the current time, and setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates shadows when generating the light source simulation data;
Querying the entity type of each entity model in the plurality of entity models, extracting a first entity model whose entity type is a preset entity type from the plurality of entity models, and setting the model attribute of the first entity model as a dynamic model;
And counting the aspect ratio of a second entity model in the plurality of entity models, and setting a model attribute for the second entity model based on the aspect ratio, wherein the second entity model is any entity model in the plurality of entity models other than the first entity model.
In another embodiment, the setting the model attribute for the second solid model based on the aspect ratio includes:
Obtaining a preset model proportion, and comparing the aspect ratio with the preset model proportion;
when the aspect ratio is greater than or equal to the preset model proportion, setting the model attribute of the second entity model as a static model;
and setting the model attribute of the second entity model as the dynamic model when the aspect ratio is smaller than the preset model proportion.
In another embodiment, the generating the light source simulation data according to the scene light source of the target scene, the model attributes of the plurality of entity models and the preset processing parameters at the current time includes:
Baking the scene light source based on the preset processing parameters to obtain first indirect light baking data in map form, wherein the first indirect light baking data is used for performing illumination simulation on the entity models, among the plurality of entity models, whose model attribute is a static model;
baking the scene light source, the first indirect light baking data and the entity models whose model attribute is a static model by adopting the preset processing parameters, to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for performing illumination simulation on the entity models whose model attribute is a dynamic model;
and taking the first indirect light baking data and the second indirect light baking data as the light source simulation data.
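The two-pass bake in this embodiment can be sketched as follows. This is an illustrative Python sketch, not engine code: `bake_lightmaps` and `bake_spherical_harmonics` are hypothetical placeholders standing in for the engine's lightmap and spherical-harmonic probe bakes.

```python
def bake_lightmaps(light_source, static_models, params):
    # Placeholder for the engine's lightmap bake: returns map-form indirect
    # light data targeting the static models.
    return {"form": "lightmap", "targets": [m["name"] for m in static_models]}

def bake_spherical_harmonics(light_source, lightmaps, static_models, params):
    # Placeholder for the spherical-harmonic probe bake: samples the light
    # source, the already-baked lightmaps and the static models.
    return {"form": "spherical_harmonics",
            "sources": [light_source] + lightmaps["targets"]}

def generate_light_source_simulation_data(light_source, entity_models, params):
    static_models = [m for m in entity_models if m["attribute"] == "static"]
    # Pass 1: map-form data, used to light the static models.
    first = bake_lightmaps(light_source, static_models, params)
    # Pass 2: spherical-harmonic data, used to light the dynamic models.
    second = bake_spherical_harmonics(light_source, first, static_models, params)
    return {"first_indirect": first, "second_indirect": second}
```

Note the dependency direction: the second bake consumes the output of the first, so the dynamic models receive indirect light bounced off the static geometry.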
In another embodiment, the drawing a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source includes:
extracting second indirect light bake data from the indirect light data;
and adopting the direct light data and the second indirect light baking data to perform illumination simulation on the mobile terminal, and drawing the model shadow of the mobile terminal.
In another embodiment, after the drawing the model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source, the method further includes:
Counting the display occupied area of the mobile terminal in the scene map, and determining the actual display area of the mobile terminal in the target scene;
when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold, setting the position attribute of the mobile terminal as the perspective attribute;
And when the ratio of the display occupied area to the actual display area is larger than a ratio threshold, setting the position attribute of the mobile terminal as the close-range attribute.
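The distant/close classification above amounts to a single ratio test; a minimal sketch (the function name and the default threshold value are illustrative assumptions, not values from the patent):

```python
def position_attribute(display_area, actual_area, ratio_threshold=0.5):
    # Small on-screen coverage relative to the model's actual display area
    # means the model is far from the camera.
    ratio = display_area / actual_area
    return "distant" if ratio <= ratio_threshold else "close"
```

The boundary case (ratio exactly equal to the threshold) is classified as distant, matching the "smaller than or equal to" wording of the claim.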
In another embodiment, the performing, according to the position attribute of the mobile terminal in the scene map, illumination processing on the mobile terminal in the scene map by using the direct light data, the model shadow and the indirect light data includes:
Determining the position attribute of the mobile terminal;
when the position attribute is the distant view attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, a first layer shadow in the model shadow and the indirect light data;
And when the position attribute is the close view attribute, performing illumination processing on the mobile terminal in the scene map by adopting the direct light data and a second layer shadow and a third layer shadow in the model shadow.
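The selection of lighting inputs by position attribute can be summarized as a small dispatch. This is a hedged sketch: the layer numbering follows the claim text, while the dictionary shape and field names are illustrative.

```python
def select_lighting_inputs(position_attr):
    if position_attr == "distant":
        # Distant models: direct light, the coarse first-layer shadow,
        # and the indirect light data.
        return {"direct_light": True, "shadow_layers": [1], "indirect_light": True}
    # Close models: direct light with the finer second- and third-layer shadows.
    return {"direct_light": True, "shadow_layers": [2, 3], "indirect_light": False}
```

The design intent is that a distant model spends its budget on cheap, coarse inputs, while a close model spends it on fine shadow detail.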
According to a second aspect of the present application, there is provided a real-time illumination simulation apparatus for a mobile game, the apparatus comprising:
The loading module is used for determining a mobile terminal and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as units, and the model attribute of the mobile terminal is a dynamic model;
the generation module is used for generating light source simulation data according to the scene light source of the target scene, the model attributes of the plurality of entity models and preset processing parameters at the current time, wherein the light source simulation data comprises indirect light data used for carrying out illumination simulation on the plurality of entity models;
the drawing module is used for drawing model shadows for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
and the processing module is used for carrying out illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the position attribute corresponding to the mobile terminal in the scene map, wherein the position attribute is at least a distant view attribute or a close view attribute.
In another embodiment, the apparatus further comprises:
The first setting module is used for determining the scene light source at the current time, setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates shadows when generating the light source simulation data;
The query module is used for querying the entity type of each entity model in the plurality of entity models, extracting a first entity model with the entity type being a preset entity type from the plurality of entity models, and setting the model attribute of the first entity model as a dynamic model;
the statistics module is used for counting the aspect ratio of a second entity model in the plurality of entity models, and setting model attributes for the second entity model based on the aspect ratio, wherein the second entity model is other entity models in the plurality of entity models except the first entity model.
In another embodiment, the statistics module is configured to obtain a preset model proportion and compare the aspect ratio with the preset model proportion; when the aspect ratio is greater than or equal to the preset model proportion, set the model attribute of the second entity model as a static model; and when the aspect ratio is smaller than the preset model proportion, set the model attribute of the second entity model as the dynamic model.
In another embodiment, the generating module is configured to bake the scene light source based on the preset processing parameters to obtain first indirect light baking data in map form, where the first indirect light baking data is used for performing illumination simulation on the entity models, among the plurality of entity models, whose model attribute is a static model; bake the scene light source, the first indirect light baking data and the entity models whose model attribute is a static model by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, where the second indirect light baking data is used for performing illumination simulation on the entity models whose model attribute is a dynamic model; and take the first indirect light baking data and the second indirect light baking data as the light source simulation data.
In another embodiment, the rendering module is configured to extract second indirect light bake data from the indirect light data; and adopting the direct light data and the second indirect light baking data to perform illumination simulation on the mobile terminal, and drawing the model shadow of the mobile terminal.
In another embodiment, the apparatus further comprises:
The determining module is used for counting the display occupied area of the mobile terminal in the scene map and determining the actual display area of the mobile terminal in the target scene;
the second setting module is used for setting the position attribute of the mobile terminal as the distant view attribute when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold value;
The second setting module is further configured to set a location attribute of the mobile terminal to the close-range attribute when a ratio of the display occupied area to the actual display area is greater than a ratio threshold.
In another embodiment, the processing module is configured to determine a location attribute of the mobile terminal; when the position attribute is a distant view attribute, adopting the direct light data, a first layer shadow in the model shadow and the indirect light data to carry out illumination treatment on the mobile terminal in the scene map; and when the position attribute is a close-range attribute, adopting the direct light data, a second layer shadow and a third layer shadow in the model shadow to carry out illumination treatment on the mobile terminal in the scene map.
According to a third aspect of the present application, there is provided a device comprising a memory and a processor, the memory storing a computer program, and the processor implementing the steps of the method of the first aspect described above when executing the computer program.
According to a fourth aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of the first aspect described above.
By means of the above technical scheme, the real-time illumination simulation method, apparatus, device and storage medium for a mobile game provided by the present application first determine, according to the model attributes of the entity models in the target scene of the mobile game, whether each entity model generates indirect light. Light source simulation data, comprising indirect light data for performing illumination simulation on the plurality of entity models, is then generated by baking the scene light source, and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, so that illumination simulation is performed on the mobile terminal according to its position attribute in the target scene. A large number of maps for illumination, shadows and the like therefore need not be drawn; illumination simulation is performed directly according to the position of the mobile terminal, the shadow precision of models in the mobile game is no longer limited by the game's running memory, the image quality of the scene is improved, and the realism of illumination simulation in the mobile game is guaranteed.
The foregoing description is only an overview of the technical scheme of the present application. In order that the technical means of the present application may be more clearly understood and implemented in accordance with the contents of the specification, and to make the above and other objects, features and advantages of the present application more apparent, specific embodiments of the present application are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
Fig. 1 shows a flow chart of a real-time illumination simulation method for a mobile game according to an embodiment of the present application;
Fig. 2 shows a flow chart of a real-time illumination simulation method for a mobile game according to an embodiment of the present application;
Fig. 3A shows a schematic structural diagram of a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present application;
Fig. 3B shows a schematic structural diagram of a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present application;
Fig. 3C shows a schematic structural diagram of a real-time illumination simulation apparatus for a mobile game according to an embodiment of the present application;
Fig. 4 shows a schematic structural diagram of a device according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The embodiment of the application provides a real-time illumination simulation method of a mobile game, which is shown in fig. 1 and comprises the following steps:
101. determining a mobile terminal, loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as units, and the model attribute of the mobile terminal is a dynamic model.
102. Generating light source simulation data according to a scene light source of a target scene at the current time, model attributes of a plurality of entity models and preset processing parameters, wherein the light source simulation data comprises indirect light data used for carrying out illumination simulation on the plurality of entity models.
103. And drawing model shadows for the mobile terminal based on the direct light data and the indirect light data of the scene light source.
104. And performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadows and the indirect light data according to the position attribute of the mobile terminal in the scene map, wherein the position attribute is at least a distant view attribute or a close view attribute.
According to the method provided by the embodiment of the application, whether an entity model generates indirect light is determined according to its model attribute in the target scene of the mobile game. Light source simulation data, comprising indirect light data for performing illumination simulation on a plurality of entity models, is then generated by baking the scene light source, and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, so that illumination simulation is performed on the mobile terminal according to its position attribute in the target scene. A large number of maps for illumination, shadows and the like need not be drawn; the corresponding parameters are selected directly according to the position of the mobile terminal, so the shadow precision of models in the mobile game is not limited by the game's running memory, the image quality of the scene is improved, and the realism of illumination simulation in the mobile game is guaranteed.
The embodiment of the application provides a real-time illumination simulation method of a mobile game, which is shown in fig. 2 and comprises the following steps:
201. and determining the mobile terminal, and loading a scene map of a target scene where the mobile terminal is positioned.
In recent years, game scenes have been designed to be increasingly realistic, and illumination simulation is required whether the scene is outdoor or indoor; the quality of the illumination simulation directly affects how convincing the scene is. Currently, when illumination simulation is performed on a game scene, light source information such as the position, direction and color of each light source placed in the scene is generally determined, and an illumination map and a shadow map of the scene are baked from that information, after which the maps are processed to realize overall illumination simulation of the scene. However, the applicant has realized that the illumination in many game scenes is real-time (for example, outdoor natural light changes continuously over time) and many models in a scene can move. Real-time illumination and moving models require finer illumination simulation, which in turn requires a larger running memory footprint; many games cannot afford such an increase, so the illumination simulation effect in many games is limited by memory, the shadow precision of moving models in the scene is reduced, the image quality of the scene is poor, and the realism of illumination simulation on moving models is difficult to guarantee.
Therefore, the present application provides a real-time illumination simulation method for moving models. Whether an entity model generates indirect light is determined according to its model attribute in the target scene; light source simulation data comprising indirect light data for performing illumination simulation on a plurality of entity models is then generated by baking the scene light source; and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, so that the mobile terminal is illuminated with the parameters corresponding to its position attribute in the target scene. A large number of maps for illumination, shadows and the like need not be drawn; the corresponding parameters are selected directly according to the position of the mobile terminal, so the shadow precision of the mobile terminal is not limited by the game's running memory, the image quality of the scene is improved, and the realism of the illumination simulation of moving models is guaranteed.
In order to implement the technical scheme described in this application, a scene map of the target scene must first be prepared; the scene map of the target scene is then loaded directly, and the corresponding baking processing is performed on the basis of the scene map. The scene map is generated by taking the plurality of entity models included in the target scene as units; it can be generated offline and is a full-scene map of the target scene. Specifically, since game development tools such as UE4 (Unreal Engine 4) already exist, the scene map may be the LQ Lightmap (light map) generated by the game development tool when global illumination is baked for the scene to be processed, so that within the tool the global illumination in the target scene can be controlled more finely. It should be noted that, in practice, the technical schemes described in the embodiments of the present application may be implemented by setting a series of parameters in a game development tool; UE4 is used as the example game development tool throughout this application. In addition, a mobile terminal is taken as the example in this application. The model attribute of the mobile terminal is a dynamic model, that is, the mobile terminal is movable, and every entity model in the target scene whose model attribute is a dynamic model can be treated as a mobile terminal and illuminated using the scheme of the embodiments of the present application.
202. Preset processing parameters are set for a scene light source in a target scene, and model attributes are set for a plurality of entity models in a scene map.
In the embodiment of the present application, before simulated illumination of the target scene is performed, parameters such as the light sources and models in the target scene need to be set in the game development tool. When setting the preset processing parameters for the scene light sources, the real-time light source in the target scene, i.e., the Directional Light in the game development tool, needs to be set to Stationary when baking and to Movable at runtime. The game development tool also provides settings for Cascaded Shadow Maps; here the Cascaded Shadow Maps distance is set to 500, with 2 shadow cascades. Further, the natural light source in the target scene, i.e., the Sky Light in the game development tool, needs to be set to Static both when baking and at runtime; the Sky Light will generate Sky Occlusion (ambient light occlusion), and its SourceType is set to SLS Specified Cubemap (a specified cubemap), that is, the SourceType uses a Cubemap baked with a SceneCaptureCube (scene capture cube) after light baking is completed. In addition, the intensity and color of the natural light source can be modified in the game development tool over a 24-hour cycle. Further, the point light sources in the target scene, i.e., Point/Spot Lights in the game development tool, need to be set to Movable with Shadows off. In some cases a Rect Light (rectangular area light) may be present; the Rect Light can be set to Static in the game development tool. Through the above process, the scene light sources are determined and the preset processing parameters are set for them; the scene light sources comprise a real-time light source, a natural light source and point light sources, and the preset processing parameters indicate whether each scene light source moves and generates shadows when the light source simulation data is generated.
After the scene light source parameters have been set, the degree of light overlap can be checked in the game development tool to avoid overly dark areas appearing in the target scene.
Then, considering that some larger solid models in a real scene not only receive the illumination from a light source but also produce indirect light that is diffusely reflected onto other models, model attributes need to be set for the plurality of solid models in the scene map to determine which solid models generate indirect light and which only receive light without generating any, so that the shadows and illumination of the solid models can later be processed in a targeted manner. When setting the model attributes, the guiding principle is to set the larger models in a scene, such as rocks, buildings and landmark structures, as static models. Specifically, human body proportion can be used as the reference: a model whose length and width are greater than or equal to the human body proportion is set as a static model, and a model whose length and width are smaller than the human body proportion, such as grass or small stones, is set as a dynamic model. However, among the solid models at or above human body proportion there are tree models; trees vary in size, their light-blocking ability is far less than that of rocks, buildings and the like, and the indirect light they generate is practically negligible, so tree models likewise need to be treated as dynamic models.
To treat tree models differently, the tree is set in advance as a preset entity type. When setting model attributes for the plurality of entity models in the scene map, the entity type of each entity model is first queried, a first entity model whose entity type is the preset entity type, i.e., a tree, is extracted from the plurality of entity models, and the model attribute of the first entity model is directly set as a dynamic model. Then the aspect ratio of each second entity model among the plurality of entity models is counted, and model attributes are set for the second entity models based on the aspect ratio, where the second entity models are the entity models other than the first entity model.
When setting the model attribute for a second solid model, a preset model proportion, namely the standard human body proportion, is first obtained; for different scenes, the preset model proportion may vary with the scene size. The aspect ratio is then compared with the preset model proportion: when the aspect ratio is greater than or equal to the preset model proportion, the second solid model is determined to be a larger model such as a rock or a building, and its model attribute is set as a static model. Conversely, when the aspect ratio is smaller than the preset model proportion, the second solid model is determined to be a smaller model such as a small stone or grass, and its model attribute is set as a dynamic model.
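The attribute-assignment rules in the two paragraphs above reduce to a small classifier. The sketch below is illustrative: the function name, the use of a single scalar proportion per model, and the numeric reference value in the usage are assumptions, not part of the patent's method.

```python
def classify_model(entity_type, model_proportion, preset_proportion):
    """Assign a model attribute per the rules above: trees are always
    dynamic; otherwise a model at or above the preset (human-body)
    proportion is static, and a smaller one is dynamic."""
    if entity_type == "tree":  # the preset entity type -> first entity model
        return "dynamic"
    if model_proportion >= preset_proportion:
        return "static"   # e.g. rocks, buildings, landmark structures
    return "dynamic"      # e.g. grass, small stones
```

For example, with a hypothetical reference proportion of 1.8, a large building classifies as static while grass classifies as dynamic, and a tree is dynamic regardless of its size.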
In practical application, the materials in the target scene also need to be set, and the developer can configure the corresponding materials in the game development tool according to the requirements of game development. In addition, when setting a material, the AO (Ambient Occlusion) of the material in the scene map needs to be individually strengthened, rather than relying on the global setting for the whole scene.
In practical application, some game development tools may not support setting parameters for a diversified environment. Therefore, in the embodiment of the application, a configuration file related to the environment settings is provided; the configuration file is written via an editor, imported into a directory of the game development tool, and provides additional functions in the game development tool. This ensures that no direct sunlight is produced during illumination simulation, and more environment parameters can be set based on these additional functions.
203. And generating light source simulation data according to the scene light source of the target scene at the current time, the model attributes of the plurality of entity models and the preset processing parameters.
In the embodiment of the application, some solid models in the target scene, after receiving illumination, generate indirect light for other solid models, such as light produced by reflection. Although this indirect light is static, its color and brightness still need to be adjusted in real time through material simulation. Therefore, the light source simulation data is generated according to the scene light source of the target scene at the current time, the model attributes of the plurality of solid models and the preset processing parameters; the light source simulation data includes the indirect light data used to perform illumination simulation on the plurality of solid models, and this indirect light data is then relied upon to perform illumination simulation on the solid models in the target scene.
Specifically, the scene light source is first baked based on the preset processing parameters to obtain first indirect light baking data in map form, where the first indirect light baking data is used to perform illumination simulation on the solid models whose model attribute is a static model; this baking of the scene light source with the preset processing parameters can be performed by the game development tool. Then the scene light source, the first indirect light baking data, and the solid models whose model attribute is a static model are baked using the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, where the second indirect light baking data is used to perform illumination simulation on the solid models whose model attribute is a dynamic model; this baking can likewise be performed by the game development tool. Finally, the first indirect light baking data and the second indirect light baking data are together taken as the light source simulation data.
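The two-pass bake above can be sketched to make the data flow explicit: pass 2 consumes the pass-1 result together with the static models. `bake_to_lightmap` and `bake_to_sh` are hypothetical stand-ins for the engine's actual bake steps, not real API calls.

```python
def bake_to_lightmap(scene_light, params):
    # Stand-in for pass 1: indirect light baked into map (lightmap)
    # form, consumed by static models.
    return {"form": "map", "source": scene_light, "params": params}

def bake_to_sh(scene_light, first_pass, static_models, params):
    # Stand-in for pass 2: indirect light projected to spherical-harmonic
    # data, consumed by dynamic models.
    return {"form": "spherical_harmonics",
            "inputs": (scene_light, first_pass["form"], tuple(static_models))}

def generate_light_source_simulation_data(scene_light, static_models, params):
    """Both bake results together form the light source simulation data."""
    first = bake_to_lightmap(scene_light, params)
    second = bake_to_sh(scene_light, first, static_models, params)
    return {"first_indirect": first, "second_indirect": second}
```

The ordering matters: the spherical-harmonic pass depends on the map-form pass, so the two cannot be baked independently.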
It should be noted that the game level scale is proportional to the unit of the game development tool, with 1 unit corresponding to 1 cm, and determines how detailed the light calculation is; a smaller scale will greatly increase the build time. Thus, for a giant level, the unit of the game development tool may be set to 2 or 4 to reduce the time taken to build the illumination; the larger the value, the faster the baking. In addition, the number of GI (Global Illumination) bounces, starting from the scene light source, needs to be set when baking: 0 means direct illumination only; bounce 1 takes the longest time to calculate, followed by bounce 2, and subsequent bounces are not very time-consuming. Furthermore, when setting the photon interpolation proportion for Irradiance Caching in the game development tool, the proportion can be set to 66 or 75, which hides a small amount of noise while avoiding the loss of detail in indirect shadows and ambient occlusion, giving a smooth indirect illumination effect. The above parameters merely illustrate the parameter settings for illumination baking in the game development tool; in practical application, enabling static ambient occlusion, setting the proportion of fully occluded samples, setting the maximum distance, and the like can be configured by the developer in the game development tool according to the development requirements of the game, and are not repeated in the application.
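The bake-quality trade-offs listed above can be collected into one illustrative settings dictionary; the key names below are descriptive placeholders, not real engine settings.

```python
# Illustrative bake settings drawn from the text; keys are descriptive only.
BAKE_SETTINGS = {
    "level_scale": 1,            # 1 unit = 1 cm; use 2 or 4 for giant levels
    "gi_bounces": 2,             # 0 = direct only; bounce 1 is the most costly
    "photon_interpolation": 66,  # 66 or 75 for irradiance caching
}

def relative_build_speed(level_scale):
    """Coarser light calculation (larger scale) bakes faster."""
    return "faster" if level_scale > 1 else "baseline"
```

Keeping these knobs in one place makes the scale/quality trade-off explicit when switching between normal and giant levels.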
204. And drawing model shadows for the mobile terminal based on the direct light data and the indirect light data of the scene light source.
In the embodiment of the application, after the light source simulation data is generated, model shadows are drawn for the mobile terminal based on the direct light data and the indirect light data of the scene light source. Since the model attribute of the model to be processed is a dynamic model, the second indirect light baking data is extracted from the indirect light data, illumination simulation is performed on the mobile terminal using the direct light data and the second indirect light baking data, and the model shadow of the mobile terminal is drawn. In fact, all solid models in the target scene whose model attribute is a dynamic model can have their model shadows drawn by this process. For the solid models in the target scene whose model attribute is a static model, illumination simulation is performed using the direct light data and the first indirect light baking data in the indirect light data, and the model shadows of those solid models are drawn.
In practical application, whether for a dynamic model or a static model, the generated model shadow can be a layered shadow, and in particular can be generated by the developer setting the colors of the different layers and the shadow distances in the game development tool. The model shadow comprises three layers, which are used respectively in the subsequent illumination simulation of solid models with different position attributes.
205. The location attribute of the mobile terminal is determined.
In the embodiment of the present application, since both distant and near solid models exist in the scene map, and distant and near solid models differ in the degree of model shadow and illumination simulation, the position attribute of the mobile terminal is determined and illumination simulation is performed on the mobile terminal according to that position attribute, where the position attribute is classified as either a distant view attribute or a close-range attribute.
When determining the position attribute, the display occupied area of the mobile terminal in the scene map is counted, and the actual display area of the mobile terminal in the target scene is determined. When the ratio of the display occupied area to the actual display area is less than or equal to a ratio threshold, the mobile terminal is large in reality but appears small in the scene map, so it can be determined to be far away, and its position attribute is set as the distant view attribute. Conversely, when the ratio of the display occupied area to the actual display area is greater than the ratio threshold, the actual size of the mobile terminal does not differ greatly from its size in the scene map, so it can be determined to be close, and its position attribute is set as the close-range attribute. The above procedure may be used to set a position attribute for each solid model in the target scene.
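The distant/close decision above reduces to a single area-ratio test. In this sketch the default threshold value is an illustrative assumption; the text only requires some fixed ratio threshold.

```python
def position_attribute(display_area, actual_area, ratio_threshold=0.5):
    """Distant-view if the on-map display footprint is small relative
    to the model's actual display area; close-range otherwise."""
    ratio = display_area / actual_area
    return "distant" if ratio <= ratio_threshold else "close"
```

A model that occupies 5% of its real display area on the map classifies as distant, while one at 80% classifies as close.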
206. And carrying out illumination treatment on the mobile terminal in the scene map by adopting direct light data, model shadows and indirect light data according to the corresponding position attribute of the mobile terminal in the scene map.
In the embodiment of the application, after the position attribute of the mobile terminal is determined, illumination processing is performed on the mobile terminal in the scene map using the direct light data, the model shadow and the indirect light data according to the position attribute of the mobile terminal in the scene map.
For mobile terminals with different position attributes, model shadows of different layers need to be used for illumination simulation, as follows. The position attribute of the mobile terminal is determined. When the position attribute is the distant view attribute, the direct light data, the first layer shadow in the model shadow of the mobile terminal, and the indirect light data are used to perform illumination processing on the mobile terminal in the scene map, i.e., the direct light data, the first layer shadow and the indirect light data are combined. When the position attribute is the close-range attribute, the direct light data and the second and third layer shadows in the model shadow of the mobile terminal are used to perform illumination processing on the mobile terminal in the scene map, i.e., the direct light data, the second layer shadow and the third layer shadow are combined.
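The selection of inputs for the final lighting pass can be sketched as follows, where `shadow_layers` is the three-layer model shadow `[layer1, layer2, layer3]` drawn earlier; representing the combined inputs as a list is an illustrative assumption.

```python
def lighting_inputs(position_attribute, direct_light, shadow_layers, indirect_light):
    """Distant models combine direct light, the first shadow layer and
    indirect light; close models combine direct light with the second
    and third shadow layers."""
    if position_attribute == "distant":
        return [direct_light, shadow_layers[0], indirect_light]
    return [direct_light, shadow_layers[1], shadow_layers[2]]
```

Note that indirect light only contributes to distant models in this scheme; close models rely on the finer second and third shadow layers instead.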
According to the method provided by the embodiment of the application, whether an entity model generates indirect light is determined according to the model attribute of the entity model in the target scene of the mobile game. Light source simulation data, including indirect light data for performing illumination simulation on the plurality of entity models, is then generated by baking the scene light source, and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, so that illumination simulation is performed on the mobile terminal according to its position attribute in the target scene. A large number of maps no longer need to be drawn for illumination, shadow and the like; the corresponding parameters are selected directly according to the position of the mobile terminal for illumination simulation, so that the shadow precision of models in the mobile game is not limited by the game's running memory, the image quality of the scene is improved, and the realism of illumination simulation in the mobile game is ensured.
Further, as a specific implementation of the method shown in fig. 1, an embodiment of the present application provides a real-time illumination simulation apparatus for a mobile game, as shown in fig. 3A, where the apparatus includes: a loading module 301, a generating module 302, a drawing module 303 and a processing module 304.
The loading module 301 is configured to determine a mobile terminal, load a scene map of a target scene where the mobile terminal is located, where the scene map is generated by using a plurality of entity models included in the target scene as units, and a model attribute of the mobile terminal is a dynamic model;
The generating module 302 is configured to generate light source simulation data according to a scene light source of the target scene at the current time, model attributes of the plurality of entity models, and preset processing parameters, where the light source simulation data includes indirect light data for performing illumination simulation on the plurality of entity models;
the drawing module 303 is configured to draw a model shadow for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
The processing module 304 is configured to perform illumination processing on the mobile terminal in the scene map by using the direct light data, the model shadow and the indirect light data according to a position attribute corresponding to the mobile terminal in the scene map, where the position attribute is at least a distant view attribute or a close view attribute.
In a specific application scenario, as shown in fig. 3B, the apparatus further includes: a first setting module 305, a query module 306 and a statistics module 307.
The first setting module 305 is configured to determine the scene light source at the current time, set the preset processing parameters for the scene light source, where the scene light source includes a real-time light source, a natural light source, and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates shadows when generating the light source simulation data;
The query module 306 is configured to query an entity type of each entity model in the plurality of entity models, extract a first entity model with the entity type being a preset entity type from the plurality of entity models, and set a model attribute of the first entity model as a dynamic model;
The statistics module 307 is configured to count an aspect ratio of a second entity model of the plurality of entity models, and set model attributes for the second entity model based on the aspect ratio, where the second entity model is other entity models of the plurality of entity models than the first entity model.
In a specific application scenario, the statistics module 307 is configured to obtain a preset model proportion, and compare the aspect ratio proportion with the preset model proportion; when the aspect ratio is greater than or equal to the preset model proportion, setting the model attribute of the second entity model as a static model; and setting the model attribute of the second entity model as the dynamic model when the aspect ratio is smaller than the preset model proportion.
In a specific application scenario, the generating module 302 is configured to bake the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, where the first indirect light baking data is used to perform illumination simulation on the solid models among the plurality of solid models whose model attribute is a static model; bake the scene light source, the first indirect light baking data and the solid models whose model attribute is a static model using the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, where the second indirect light baking data is used to perform illumination simulation on the solid models whose model attribute is a dynamic model; and take the first indirect light baking data and the second indirect light baking data as the light source simulation data.
In a specific application scenario, the drawing module 303 is configured to extract second indirect light baked data from the indirect light data; and adopting the direct light data and the second indirect light baking data to perform illumination simulation on the mobile terminal, and drawing the model shadow of the mobile terminal.
In a specific application scenario, as shown in fig. 3C, the apparatus further includes: a determination module 308 and a second setting module 309.
The determining module 308 is configured to count a display occupied area of the mobile terminal in the scene map, and determine an actual display area of the mobile terminal in the target scene;
The second setting module 309 is configured to set the location attribute of the mobile terminal to the perspective attribute when the ratio of the display occupied area to the actual display area is less than or equal to a ratio threshold;
The second setting module 309 is further configured to set the location attribute of the mobile terminal to the close-range attribute when the ratio of the display occupied area to the actual display area is greater than a ratio threshold.
In a specific application scenario, the processing module 304 is configured to determine a location attribute of the mobile terminal; when the position attribute is a distant view attribute, adopting the direct light data, a first layer shadow in the model shadow and the indirect light data to carry out illumination treatment on the mobile terminal in the scene map; and when the position attribute is a close-range attribute, adopting the direct light data, a second layer shadow and a third layer shadow in the model shadow to carry out illumination treatment on the mobile terminal in the scene map.
According to the device provided by the embodiment of the application, whether an entity model generates indirect light is determined according to the model attribute of the entity model in the target scene of the mobile game. Light source simulation data, including indirect light data for performing illumination simulation on the plurality of entity models, is then generated by baking the scene light source, and the shadow of the mobile terminal in the target scene is drawn using the light source simulation data and the direct light data generated by the scene light source, so that illumination simulation is performed on the mobile terminal according to its position attribute in the target scene. A large number of maps no longer need to be drawn for illumination, shadow and the like; the corresponding parameters are selected directly according to the position of the mobile terminal for illumination simulation, so that the shadow precision of models in the mobile game is not limited by the game's running memory, the image quality of the scene is improved, and the realism of illumination simulation in the mobile game is ensured.
It should be noted that, other corresponding descriptions of each functional unit related to the real-time illumination simulation device for a mobile game provided by the embodiment of the present application may refer to corresponding descriptions in fig. 1 and fig. 2, and are not repeated herein.
In an exemplary embodiment, referring to fig. 4, there is further provided a device 400 including a communication bus, a processor, a memory, and a communication interface, and may further include an input-output interface, and a display device, wherein the functional units may communicate with each other via the bus. The memory stores a computer program and a processor for executing the program stored in the memory to perform the real-time illumination simulation method of the mobile game in the above embodiment.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the real-time illumination simulation method of a mobile game.
From the above description of the embodiments, it will be clear to those skilled in the art that the present application may be implemented in hardware, or may be implemented by means of software plus necessary general hardware platforms. Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.), and includes several instructions for causing a computer device (may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective implementation scenario of the present application.
Those skilled in the art will appreciate that the drawing is merely a schematic illustration of a preferred implementation scenario and that the modules or flows in the drawing are not necessarily required to practice the application.
Those skilled in the art will appreciate that modules in an apparatus in an implementation scenario may be distributed in an apparatus in an implementation scenario according to an implementation scenario description, or that corresponding changes may be located in one or more apparatuses different from the implementation scenario. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above-mentioned inventive sequence numbers are merely for description and do not represent advantages or disadvantages of the implementation scenario.
The foregoing disclosure is merely illustrative of some embodiments of the application, and the application is not limited thereto, as modifications may be made by those skilled in the art without departing from the scope of the application.

Claims (8)

1. A real-time lighting simulation method for a mobile game, comprising:
Determining a mobile terminal, and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as units, and the model attribute of the mobile terminal is a dynamic model;
Generating light source simulation data according to the scene light source of the target scene, the model attributes of the plurality of entity models and preset processing parameters in the current time, wherein the light source simulation data comprises indirect light data for performing illumination simulation on the plurality of entity models, and the generating light source simulation data according to the scene light source of the target scene, the model attributes of the plurality of entity models and the preset processing parameters in the current time comprises the following steps: baking the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, wherein the first indirect light baking data is used for carrying out illumination simulation on a solid model with model properties being static models in the plurality of solid models; baking the scene light source, the first indirect light baking data and the solid models with model attributes of static models in the plurality of solid models by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for carrying out illumination simulation on the solid models with model attributes of dynamic models in the plurality of solid models; taking the first indirect light baking data and the second indirect light baking data as the light source simulation data;
Drawing model shadows for the mobile terminal based on the direct light data and the indirect light data of the scene light source;
Performing illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the position attribute corresponding to the mobile terminal in the scene map, wherein the position attribute is at least a far view attribute or a near view attribute, and performing the illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow and the indirect light data according to the position attribute corresponding to the mobile terminal in the scene map comprises the following steps: determining the position attribute of the mobile terminal; when the position attribute is a distant view attribute, adopting the direct light data, a first layer shadow in the model shadow and the indirect light data to carry out illumination treatment on the mobile terminal in the scene map; and when the position attribute is a close-range attribute, adopting the direct light data, a second layer shadow and a third layer shadow in the model shadow to carry out illumination treatment on the mobile terminal in the scene map.
2. The method of claim 1, wherein the determining the mobile terminal, before loading a scene map of a target scene in which the mobile terminal is located, further comprises:
Determining the scene light source at the current time, and setting the preset processing parameters for the scene light source, wherein the scene light source comprises a real-time light source, a natural light source and a point light source, and the preset processing parameters indicate whether the scene light source moves and generates shadows when generating the light source simulation data;
Inquiring the entity type of each entity model in the plurality of entity models, extracting a first entity model with the entity type being a preset entity type from the plurality of entity models, and setting the model attribute of the first entity model as a dynamic model;
And counting the length-width ratio of a second solid model in the plurality of solid models, and setting model attributes for the second solid model based on the length-width ratio, wherein the second solid model is other solid models except the first solid model in the plurality of solid models.
3. The method according to claim 2, wherein the setting model properties for the second solid model based on the aspect ratio comprises:
Obtaining a preset model proportion, and comparing the aspect ratio with the preset model proportion;
when the aspect ratio is greater than or equal to the preset model proportion, setting the model attribute of the second entity model as a static model;
and setting the model attribute of the second entity model as the dynamic model when the aspect ratio is smaller than the preset model proportion.
4. The method of claim 1, wherein the rendering model shadows for the mobile end based on the direct light data and the indirect light data of the scene light source comprises:
extracting second indirect light bake data from the indirect light data;
and adopting the direct light data and the second indirect light baking data to perform illumination simulation on the mobile terminal, and drawing the model shadow of the mobile terminal.
5. The method of claim 1, wherein the method further comprises, after the model shading for the mobile end based on the direct light data and the indirect light data of the scene light source:
Counting the display occupied area of the mobile terminal in the scene map, and determining the actual display area of the mobile terminal in the target scene;
when the ratio of the display occupied area to the actual display area is smaller than or equal to a ratio threshold, setting the position attribute of the mobile terminal as the perspective attribute;
And when the ratio of the display occupied area to the actual display area is larger than a ratio threshold, setting the position attribute of the mobile terminal as the close-range attribute.
6. A real-time lighting simulation device for a mobile game, comprising:
The loading module is used for determining a mobile terminal and loading a scene map of a target scene where the mobile terminal is located, wherein the scene map is generated by taking a plurality of entity models included in the target scene as units, and the model attribute of the mobile terminal is a dynamic model;
The generating module is used for generating light source simulation data according to a scene light source of the target scene, model attributes of the plurality of entity models and preset processing parameters at the current time, wherein the light source simulation data comprises indirect light data used for carrying out illumination simulation on the plurality of entity models, the generating module is used for baking the scene light source based on the preset processing parameters to obtain first indirect light baking data in a map form, and the first indirect light baking data is used for carrying out illumination simulation on entity models with model attributes of static models in the plurality of entity models; baking the scene light source, the first indirect light baking data and the solid models with model attributes of static models in the plurality of solid models by adopting the preset processing parameters to obtain second indirect light baking data in the form of spherical harmonic data, wherein the second indirect light baking data is used for carrying out illumination simulation on the solid models with model attributes of dynamic models in the plurality of solid models; taking the first indirect light baking data and the second indirect light baking data as the light source simulation data;
the drawing module is configured to draw a model shadow for the mobile terminal based on direct light data and the indirect light data of the scene light source;
The processing module is configured to perform illumination processing on the mobile terminal in the scene map by adopting the direct light data, the model shadow, and the indirect light data according to the position attribute corresponding to the mobile terminal in the scene map, wherein the position attribute is at least a distant view attribute or a close-range attribute; the processing module is configured to determine the position attribute of the mobile terminal; when the position attribute is the distant view attribute, perform illumination processing on the mobile terminal in the scene map by adopting the direct light data, a first layer shadow in the model shadow, and the indirect light data; and when the position attribute is the close-range attribute, perform illumination processing on the mobile terminal in the scene map by adopting the direct light data, and a second layer shadow and a third layer shadow in the model shadow.
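Claim 6 states that the second indirect light baking data for dynamic models is stored as spherical harmonic data. A minimal sketch of how such data could be evaluated at runtime is shown below. The coefficient layout (one L0 band plus three L1 bands, RGB per band) and the normalization constants follow the standard real spherical-harmonics convention; the function name and the stored coefficients themselves are hypothetical, not taken from the patent.

```python
def eval_sh_irradiance(coeffs, normal):
    """Evaluate first-order (L0 + L1) spherical-harmonic lighting data.

    coeffs: four RGB triples, ordered (L0, L1m-1, L1m0, L1m+1).
    normal: unit surface normal (x, y, z) of the dynamic model.
    Returns the approximated indirect RGB radiance along the normal.
    """
    x, y, z = normal
    # Real SH basis functions evaluated at the normal direction.
    basis = [
        0.282095,      # Y(0, 0)  - constant band
        0.488603 * y,  # Y(1,-1)
        0.488603 * z,  # Y(1, 0)
        0.488603 * x,  # Y(1, 1)
    ]
    # Dot the per-band RGB coefficients with the basis, per channel.
    return tuple(
        sum(c[ch] * b for c, b in zip(coeffs, basis))
        for ch in range(3)
    )
```

For a purely constant (L0-only) baked environment, the result is the same in every direction, which matches the intuition that L1 bands encode the directional variation of the indirect light.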
7. An apparatus comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when executing the computer program.
8. A storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 5.
CN202011227001.1A 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium Active CN112473135B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011227001.1A CN112473135B (en) 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011227001.1A CN112473135B (en) 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium

Publications (2)

Publication Number Publication Date
CN112473135A CN112473135A (en) 2021-03-12
CN112473135B true CN112473135B (en) 2024-05-10

Family

ID=74928686

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011227001.1A Active CN112473135B (en) 2020-11-06 2020-11-06 Real-time illumination simulation method, device and equipment for mobile game and storage medium

Country Status (1)

Country Link
CN (1) CN112473135B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113786616B (en) * 2021-09-30 2024-04-12 天津亚克互动科技有限公司 Indirect illumination implementation method and device, storage medium and computing equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722901A (en) * 2011-03-29 2012-10-10 腾讯科技(深圳)有限公司 Method and apparatus for processing images
CN104008563A (en) * 2014-06-07 2014-08-27 长春理工大学 Method for achieving global illumination drawing of animation three-dimensional scene with virtual point light sources
CN104268923A (en) * 2014-09-04 2015-01-07 无锡梵天信息技术股份有限公司 Illumination method based on picture level images
WO2018028669A1 (en) * 2016-08-12 2018-02-15 腾讯科技(深圳)有限公司 Illumination processing method in 3d scenario, terminal, server, and storage medium
CN109934904A (en) * 2019-03-15 2019-06-25 网易(杭州)网络有限公司 Static lighting baking processing method, device, equipment and readable storage medium
CN110152291A (en) * 2018-12-13 2019-08-23 腾讯科技(深圳)有限公司 Rendering method, device, terminal and the storage medium of game picture
CN111127624A (en) * 2019-12-27 2020-05-08 珠海金山网络游戏科技有限公司 Illumination rendering method and device based on AR scene
CN111161393A (en) * 2019-12-31 2020-05-15 威创集团股份有限公司 Real-time light effect dynamic display method and system based on 3D map


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wu Weihe. "Global Illumination Simulation Based on Direct Lighting". Computer Engineering. 2009, full text. *

Also Published As

Publication number Publication date
CN112473135A (en) 2021-03-12

Similar Documents

Publication Publication Date Title
CN114419240B (en) Illumination rendering method and device, computer equipment and storage medium
CN108043027B (en) Storage medium, electronic device, game screen display method and device
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
Lee Learning unreal engine game development
CN114119818A (en) Rendering method, device and equipment of scene model
US20140292754A1 (en) Easy selection threshold
CN112473135B (en) Real-time illumination simulation method, device and equipment for mobile game and storage medium
Tracy et al. CryENGINE 3 Game Development: Beginner's Guide
WO2024082897A1 (en) Illumination control method and apparatus, and computer device and storage medium
CN105957133A (en) Method and device for loading maps
Thorn Learn unity for 2d game development
CN111583378B (en) Virtual asset processing method and device, electronic equipment and storage medium
CN113610955A (en) Object rendering method and device and shader
Lee et al. Unreal Engine: Game Development from A to Z
CN108280887B (en) Shadow map determination method and device
CN115761105A (en) Illumination rendering method and device, electronic equipment and storage medium
CN116485981A (en) Three-dimensional model mapping method, device, equipment and storage medium
CN113313798B (en) Cloud picture manufacturing method and device, storage medium and computer equipment
CN112465941B (en) Volume cloud processing method and device, electronic equipment and storage medium
US20060033736A1 (en) Enhanced Color and Lighting Model for Computer Graphics Productions
CN114821010A (en) Virtual scene processing method and device, storage medium and electronic equipment
CN114832375A (en) Ambient light shielding processing method, device and equipment
CN112843704A (en) Animation model processing method, device, equipment and storage medium
Li Research and Analysis of 3D games
Mamgain Autodesk 3ds Max 2021: A Detailed Guide to Arnold Renderer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant