CN113470169B - Game scene generation method and device, computer equipment and readable storage medium - Google Patents


Info

Publication number
CN113470169B
CN113470169B (application CN202110735353.6A)
Authority
CN
China
Prior art keywords
scene
height
target
model
block
Prior art date
Legal status
Active
Application number
CN202110735353.6A
Other languages
Chinese (zh)
Other versions
CN113470169A
Inventor
张凌云 (Zhang Lingyun)
Current Assignee
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202110735353.6A
Publication of CN113470169A
Application granted
Publication of CN113470169B
Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a game scene generation method and apparatus, a computer device, and a readable storage medium, and relates to the technical field of image processing. Based on a scene height map and model local areas, the method accurately determines whether the surface of an entity model is occluded and the range that scene elements should cover, and then adds the corresponding scene elements directly, without making a large number of maps. This improves scene generation efficiency, makes it convenient to obtain a correct game scene, and improves the realism of the generated game scene. The method comprises the following steps: generating a scene height map of an original game scene; creating at least one model local area for each scene model in the original game scene; adding target scene elements to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area; and determining element effect parameters of the element areas, and writing the element effect parameters into the element areas by interpolation to generate a target game scene baked with the target scene elements.

Description

Game scene generation method and device, computer equipment and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for generating a game scene, a computer device, and a readable storage medium.
Background
In recent years, network games have become an essential part of people's lives. With the rapid development of image processing technology, both developers and players have ever higher expectations for game scenes. Generating more realistic game scenes has therefore become a key point of the game design process, in order to improve the quality of game pictures and enhance the player's sense of immersion.
In the related art, some games involve snow scenes. When a snow scene is generated, maps (textures) representing the snow need to be made by art drawing and overlaid on the entity models of the scene, so as to achieve the effect of snow covering the model surfaces.
In carrying out the present application, the applicant has found that the related art has at least the following problems:
Some entity models in the game scene occlude one another; for example, a large tree may occlude part of the ground. Implementing this occlusion relationship requires ray detection, but the models cannot correctly transmit rays, so it is difficult to render a correct game scene, and the game scene lacks realism.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for generating a game scene, a computer device, and a readable storage medium, and mainly aims to solve the problem that it is currently difficult to render a correct game scene and that the game scene lacks realism.
According to a first aspect of the present application, there is provided a game scene generating method, including:
generating a scene height map of an original game scene, wherein the scene height map describes the height relation of the scene models in the original game scene;
creating at least one model local area for each scene model in the original game scene;
adding target scene elements to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area;
determining element effect parameters of the element areas, and writing the element effect parameters into the element areas in an interpolation mode to generate a target game scene baked with the target scene elements.
Optionally, the generating a scene height map of the original game scene includes:
dividing the original game scene into a plurality of scene blocks according to a preset division size;
performing collision detection on each scene block in the plurality of scene blocks in the original game scene to generate a height parameter of each scene block;
baking the scene models based on the height parameters corresponding to the scene blocks to generate a scene height map of the original game scene.
Optionally, the performing collision detection on each scene block in the plurality of scene blocks in the original game scene to generate a height parameter of each scene block includes:
for each scene block in the plurality of scene blocks, performing collision detection on the scene block, determining a plurality of target collision volumes of the scene block in the original game scene, and querying the plurality of target scene blocks in which the plurality of target collision volumes are located;
determining a first target scene block and a second target scene block among the plurality of target scene blocks, the height of the first target scene block being the largest of the plurality of target scene blocks and the height of the second target scene block being the smallest;
dividing the first target scene block into a preset number of block units, and determining a target block unit in the preset number of block units, wherein the height of the target block unit is the largest among the preset number of block units;
comparing the height of the target block unit with a preset height, and generating a height parameter of the scene block according to a comparison result;
and repeating the process of generating the height parameters, and respectively generating the height parameters for each scene block.
Optionally, the comparing the height of the target block unit with a preset height, and generating the height parameter of the scene block according to the comparison result includes:
when the comparison result indicates that the height of the target block unit is smaller than the preset height, calculating a first difference value between the height of the first target scene block and the height of the second target scene block, and generating the height parameter according to the first difference value;
when the comparison result indicates that the height of the target block unit is greater than or equal to the preset height, continuing to divide the target block unit into the preset number of secondary block units, and judging whether the height of the highest secondary block unit among the preset number of secondary block units is smaller than the preset height; the highest unit is divided repeatedly in this way until a designated unit whose height is smaller than the preset height is determined; a third difference value between the height of the designated unit and the height of the second target scene block is then calculated, and the height parameter is generated according to the third difference value.
Optionally, the creating at least one model local area surface for each scene model in the original game scene includes:
for each scene model in the original game scene, acquiring at least one local area surface creation parameter, wherein the local area surface creation parameter indicates the direction, the angle and the area of a model local area surface created for the scene model;
creating the at least one model local area surface on the top of the scene model according to the at least one local area surface creation parameter;
and repeatedly executing the creation process of the model local area surface, and respectively creating the at least one model local area surface for each scene model.
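As a concrete illustration, the local-area-surface creation step above could be sketched as follows. The parameter fields (direction, angle, area) come from the description; the square-quad construction and every name in the sketch are assumptions made for illustration only.

```python
import math

# Hypothetical sketch: build one local area surface on top of a scene model
# from a creation parameter (direction, angle, area). The tilt model chosen
# here (rotate the upward normal by `angle_deg` toward `direction`) is an
# assumption, not something the text specifies.

def create_local_area(model_top, direction, angle_deg, area):
    """Return centre, normal and side length of a square local area surface.

    model_top: (x, y, z) point on top of the scene model.
    direction: unit vector in the XZ plane toward which the surface tilts.
    """
    side = math.sqrt(area)  # side length of a square surface with that area
    a = math.radians(angle_deg)
    # Tilt the straight-up normal (0, 1, 0) by `a` toward `direction`.
    normal = (direction[0] * math.sin(a), math.cos(a), direction[1] * math.sin(a))
    return {"centre": model_top, "normal": normal, "side": side}
```

With an angle of 0 the surface lies flat on the model top with its normal pointing straight up; larger angles tilt it toward the given direction.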
Optionally, the adding a target scene element to the original game scene based on the scene height map and the at least one model local surface of each scene model to form at least one element region includes:
for each scene model of the plurality of scene models, constructing the target scene element on at least one model local surface of the scene model;
simulating emission of the target scene elements toward the scene model along the normal direction of the at least one model local area surface, and controlling the target scene elements to hit the scene model to form at least one initial area;
querying a designated scene block of the at least one initial region in the scene height map;
increasing the block size of the designated scene block to a preset block size in the scene height map, and detecting the sampling results of the at least one initial area and the designated scene block after the size is increased;
and according to the sampling result, determining an occluded partial area in the at least one initial area, and clearing the target scene elements rendered in the partial area in the at least one initial area to obtain the at least one element area.
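The occlusion-clearing step above can be sketched in Python as follows. This is a minimal sketch under stated assumptions: the baked height map is modelled as a dictionary keyed by block coordinates, "increasing the block size" is modelled as sampling neighbouring blocks as well, and all names are illustrative rather than taken from the patent.

```python
# Hypothetical sketch: keep only the element-region samples whose surface is
# not occluded according to the scene height map.

def clear_occluded(initial_regions, height_map, grow=1):
    """height_map maps (x, z) block coordinates to the baked height parameter;
    each region is a list of (x, z, surface_height) samples."""
    element_regions = []
    for region in initial_regions:
        kept = []
        for (x, z, surface_h) in region:
            # Enlarge the designated scene block: sample the block itself
            # plus its neighbours within `grow` blocks in each direction.
            samples = [
                height_map.get((x + dx, z + dz), float("-inf"))
                for dx in range(-grow, grow + 1)
                for dz in range(-grow, grow + 1)
            ]
            # If the height map records something above this surface point,
            # the point is occluded, so the element there is cleared.
            if surface_h >= max(samples):
                kept.append((x, z, surface_h))
        element_regions.append(kept)
    return element_regions
```

For example, a ground sample lying below the baked height of a neighbouring tree block would be dropped, while an unoccluded rooftop sample survives into the element area.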
Optionally, the determining element effect parameters of the plurality of element regions includes:
for each element area in the plurality of element areas, determining a target scene model covered by the element area, and inquiring the angle difference between the surface normal of the target scene model and the space longitudinal axis;
calculating the area top parameter, the area bottom parameter and the angle difference of the element area, and outputting the influence degree parameter of the element area;
determining an area range corresponding to the angle difference, calculating the angle difference and the area range, and outputting an element coverage range of the element area;
calculating the influence degree parameter, the element coverage range and the element color data of the target scene element to obtain an element thickness parameter;
performing product calculation on the influence degree parameter, the element coverage range and the element thickness parameter to obtain a product result, and calculating a product between the product result and a preset coefficient to serve as an element effect parameter of the element area;
and calculating element effect parameters for each element area in the element areas respectively to obtain the element effect parameters of the element areas.
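The text above names which quantities feed each step but not the concrete formulas, so the sketch below uses simple placeholder blends; only the final structure, the product of the influence degree, the element coverage, the element thickness and a preset coefficient, follows the description. All names and formulas are assumptions.

```python
import math

# Hypothetical sketch of the element-effect computation for one element area.

def element_effect(top, bottom, normal, coeff=1.0, color_luma=1.0):
    """top/bottom: area top and bottom parameters; normal: surface normal of
    the covered target scene model; color_luma: element colour data."""
    # Angle difference between the surface normal and the spatial vertical axis.
    up = (0.0, 1.0, 0.0)
    dot = sum(a * b for a, b in zip(normal, up))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))

    # Influence degree from the top/bottom parameters and the angle difference.
    influence = max(0.0, top - bottom) * math.cos(math.radians(angle))

    # Element coverage shrinks as the surface tilts away from horizontal.
    area_range = 90.0
    coverage = max(0.0, 1.0 - angle / area_range)

    # Element thickness from influence, coverage and element colour data.
    thickness = influence * coverage * color_luma

    # Element effect parameter: product result times the preset coefficient.
    return influence * coverage * thickness * coeff
```

An upward-facing surface (normal straight up) gets full coverage, while a vertical wall gets none, which matches the intuition that snow accumulates on tops of objects rather than on their sides.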
Optionally, the method further comprises:
splitting indirect light in the target game scene to obtain a reflection coefficient and an illumination coefficient, determining weather color data of scene weather related to the target scene element, calculating a first product of the weather color data and the reflection coefficient, calculating a second product of the element color data of the target scene element and the illumination coefficient, calculating a sum of the first product and the second product, and rendering scene illumination for the target game scene according to the sum; or, alternatively,
determining the influence degree parameters of the element areas, fitting the indirect light according to the influence degree parameters to obtain appointed color brightness, and performing interpolation rendering on the appointed color brightness to the target game scene.
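The first of the two lighting options above reduces to a simple blend. The sketch below (Python, RGB colours as 3-tuples, all names assumed) shows only that blend; how the indirect light is actually split into the two coefficients is engine-specific and not sketched here, nor is the second, fitting-based option.

```python
# Hypothetical sketch: scene illumination as
#   weather_color * reflection_coeff + element_color * illumination_coeff.

def blend_indirect_light(weather_color, element_color,
                         reflection_coeff, illumination_coeff):
    first = tuple(c * reflection_coeff for c in weather_color)     # first product
    second = tuple(c * illumination_coeff for c in element_color)  # second product
    return tuple(a + b for a, b in zip(first, second))             # their sum
```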
According to a second aspect of the present application, there is provided a game scene generating apparatus, comprising:
the first generation module is used for generating a scene height map of an original game scene, wherein the scene height map describes the height relation of the scene models in the original game scene;
the creating module is used for creating at least one model local area surface for each scene model in the original game scene;
an adding module, configured to add a target scene element to the original game scene based on the scene height map and the at least one model local surface of each scene model to form at least one element region;
and the second generation module is used for determining element effect parameters of the element areas, interpolating and writing the element effect parameters into the element areas, and generating the target game scene baked with the target scene elements.
Optionally, the first generating module is configured to divide the original game scene into a plurality of scene blocks according to a preset dividing size; performing collision detection on each scene block in the plurality of scene blocks in the original game scene to generate a height parameter of each scene block; baking the scene models based on the height parameters corresponding to the scene blocks to generate a scene height map of the original game scene.
Optionally, the first generating module is configured to perform collision detection on each scene block in the plurality of scene blocks, determine a plurality of target collision volumes of the scene block in the original game scene, and query the plurality of target scene blocks in which the plurality of target collision volumes are located; determine a first target scene block and a second target scene block among the plurality of target scene blocks, the height of the first target scene block being the largest of the plurality of target scene blocks and the height of the second target scene block being the smallest; divide the first target scene block into a preset number of block units, and determine a target block unit in the preset number of block units, wherein the height of the target block unit is the largest among the preset number of block units; compare the height of the target block unit with a preset height, and generate a height parameter of the scene block according to the comparison result; and repeat the process of generating the height parameters, respectively generating the height parameters for each scene block.
Optionally, the first generating module is configured to calculate a first difference between the height of the first target scene block and the height of the second target scene block when the comparison result indicates that the height of the target block unit is smaller than the preset height, and generate the height parameter according to the first difference; and, when the comparison result indicates that the height of the target block unit is greater than or equal to the preset height, continue to divide the target block unit into the preset number of secondary block units, judge whether the height of the highest secondary block unit among the preset number of secondary block units is smaller than the preset height, divide the highest unit repeatedly in this way until a designated unit whose height is smaller than the preset height is determined, calculate a third difference value between the height of the designated unit and the height of the second target scene block, and generate the height parameter according to the third difference value.
Optionally, the creating module is configured to obtain at least one local area creation parameter for each scene model in the original game scene, where the local area creation parameter indicates a direction, an angle, and an area of a model local area created for the scene model; creating the at least one model local area surface on the top of the scene model according to the at least one local area surface creation parameter; and repeatedly executing the creation process of the model local area surface, and respectively creating the at least one model local area surface for each scene model.
Optionally, the adding module is configured to, for each scene model of the plurality of scene models, construct the target scene element on at least one model local surface of the scene model; the simulated target scene elements are transmitted to the scene model according to the normal direction of the at least one model local area surface, and the target scene elements are controlled to hit the scene model to form at least one initial area; querying a designated scene block of the at least one initial region in the scene height map; increasing the block size of the designated scene block to a preset block size in the scene height map, and detecting the sampling results of the at least one initial area and the designated scene block after the size is increased; and according to the sampling result, determining an occluded partial area in the at least one initial area, and clearing the target scene elements rendered in the partial area in the at least one initial area to obtain the at least one element area.
Optionally, the second generating module is configured to determine, for each element region in the plurality of element regions, a target scene model covered by the element region, and query an angle difference between a surface normal of the target scene model and a spatial longitudinal axis; calculating the area top parameter, the area bottom parameter and the angle difference of the element area, and outputting the influence degree parameter of the element area; determining an area range corresponding to the angle difference, calculating the angle difference and the area range, and outputting an element coverage range of the element area; calculating the influence degree parameter, the element coverage range and the element color data of the target scene element to obtain an element thickness parameter; performing product calculation on the influence degree parameter, the element coverage range and the element thickness parameter to obtain a product result, and calculating a product between the product result and a preset coefficient to serve as an element effect parameter of the element area; and calculating element effect parameters for each element area in the element areas respectively to obtain the element effect parameters of the element areas.
Optionally, the apparatus further comprises:
a rendering module, configured to split indirect light in the target game scene to obtain a reflection coefficient and an illumination coefficient, determine weather color data of scene weather related to the target scene element, calculate a first product of the weather color data and the reflection coefficient, calculate a second product of the element color data of the target scene element and the illumination coefficient, calculate a sum of the first product and the second product, and add scene illumination to the target game scene according to the sum; or determining the influence degree parameters of the element areas, fitting the indirect light according to the influence degree parameters to obtain the appointed color brightness, and writing the appointed color brightness into the target game scene through interpolation.
According to a third aspect of the present application, there is provided a computer device comprising a memory storing a computer program and a processor implementing the steps of the method of any of the first aspects when the computer program is executed.
According to a fourth aspect of the present application, there is provided a readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of any of the above-mentioned first aspects.
By means of the above technical solution, the game scene generation method and apparatus, computer device, and readable storage medium generate a scene height map describing the height relationship of a plurality of scene models in an original game scene, create at least one model local area for each scene model, add target scene elements to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area, determine element effect parameters of the element areas, and write the element effect parameters into the element areas by interpolation to generate a target game scene with the target scene elements. Whether the surface of an entity model is occluded, and the range that the elements should cover, can thus be accurately judged according to the scene height map and the model local areas, and the corresponding scene elements can be added to the entity model directly as indicated by them. No large number of maps needs to be made, so scene generation efficiency is improved, a correct game scene is conveniently obtained, and the realism of the generated game scene is improved.
The foregoing description is only an overview of the technical solutions of the present application, and the present application can be implemented according to the content of the description in order to make the technical means of the present application more clearly understood, and the following detailed description of the present application is given in order to make the above and other objects, features, and advantages of the present application more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic flow chart illustrating a game scene generation method according to an embodiment of the present disclosure;
FIG. 2A is a schematic flow chart illustrating a game scene generation method according to an embodiment of the present disclosure;
FIG. 2B is a schematic diagram illustrating a method for generating a game scene according to an embodiment of the present disclosure;
FIG. 2C is a schematic diagram illustrating a method for generating a game scene according to an embodiment of the present disclosure;
fig. 3A is a schematic structural diagram illustrating a game scene generation apparatus according to an embodiment of the present application;
fig. 3B is a schematic structural diagram illustrating a game scene generation apparatus according to an embodiment of the present application;
fig. 4 shows a schematic device structure diagram of a computer apparatus according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
An embodiment of the present application provides a game scene generation method, as shown in fig. 1, the method includes:
101. and generating a scene height map of the original game scene, wherein the scene height map describes the height relation of the scene models in the original game scene.
102. At least one model local area is created for each scene model in the original game scene.
103. Adding target scene elements to the original game scene based on the scene height map and the at least one model local surface of each scene model to form at least one element area.
104. And determining element effect parameters of the element areas, and interpolating and writing the element effect parameters into the element areas to generate a target game scene baked with target scene elements.
The method provided by the embodiment of the application generates a scene height map describing the height relationship of a plurality of scene models in an original game scene and creates at least one model local area for each scene model in the original game scene. Target scene elements are then added to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area; element effect parameters of the element areas are determined and written into the element areas by interpolation, generating a target game scene baked with the target scene elements. In this way, whether the surface of an entity model is occluded, and the range that the elements should cover, can be accurately judged according to the scene height map and the model local areas, and the corresponding scene elements can be added to the entity model directly as indicated by them. No large number of maps needs to be made, so scene generation efficiency is improved, a correct game scene can be obtained conveniently, and the realism of the generated game scene can be improved.
An embodiment of the present application provides a game scene generation method, as shown in fig. 2A, the method includes:
201. a scene height map of the original game scene is generated.
In recent years, the design of game scenes has become more and more lifelike, and many outdoor scenes in games show weather changes, such as snowy or rainy days, so that the user's in-game experience is more realistic. When an outdoor snow scene is rendered, in order to achieve the effect of snow covering object surfaces, the game scene usually needs to be drawn by artists: maps representing snow on object surfaces are made and overlaid on the entity models of the game scene, so as to achieve the effect of snow covering the model surfaces. However, the applicant recognizes that some entity models in a game scene have an occlusion relationship; for example, a large tree may occlude part of the ground. Ray detection is needed to realize this occlusion relationship, but the models cannot correctly transmit rays and only function fitting can be performed on them, so it is difficult to render a correct game scene, and the game scene lacks realism.
Therefore, the application provides a game scene generation method that generates a scene height map describing the height relationship of a plurality of scene models in an original game scene and creates at least one model local area for each scene model. Target scene elements are then added to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area; element effect parameters of the element areas are determined and written into the element areas by interpolation, generating a target game scene rendered with the target scene elements. Whether the surface of an entity model is occluded, and the range that the elements should cover, can thus be accurately judged according to the scene height map and the model local areas, and the corresponding scene elements can be added to the entity model directly as indicated by them. No large number of maps needs to be made, so the generation efficiency of the scene is improved, a correct game scene is conveniently rendered, and the realism of the generated game scene is improved.
In order to implement the scheme of the present application, a scene height map of an original game scene needs to be generated, where the scene height map describes a height relationship of a plurality of scene models in the original game scene, and a process of generating the scene height map specifically refers to the following steps one to three:
step one, dividing an original game scene into a plurality of scene blocks according to a preset dividing size.
The preset division size may be the height of a scene block; for example, the preset division size may be 2000 meters, so that the original game scene is divided into scene blocks 2000 meters high. Assuming that two entity models A and B exist in the original game scene and the preset division size is 2000 meters, the original game scene is divided, starting from the origin of its world coordinate system, into 10 scene blocks each 2000 meters high.
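The division in this example can be sketched in a few lines of Python (the function name and the representation of a block as a height range are assumptions for illustration):

```python
# Hypothetical sketch: divide a scene of total height `scene_height` into
# blocks of the preset division size, starting from the world origin.

def divide_scene(scene_height, block_size=2000.0):
    """Return (bottom, top) height ranges for each scene block."""
    blocks = []
    bottom = 0.0
    while bottom < scene_height:
        top = min(bottom + block_size, scene_height)
        blocks.append((bottom, top))
        bottom = top
    return blocks
```

A 20000-meter scene with the 2000-meter preset size yields exactly the 10 blocks of the example above.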
And step two, performing collision detection on each scene block in the plurality of scene blocks in the original game scene to generate a height parameter of each scene block.
Taking any scene block of the plurality of scene blocks as an example, the generation process of the height parameter is as follows. For each scene block in the plurality of scene blocks, collision detection is first performed on the scene block, and a plurality of target collision volumes of the scene block in the original game scene are determined. Specifically, the Physics.OverlapBox function can be called to query whether the scene block contains collision volumes in the original game scene.
Then, the plurality of target scene blocks in which the plurality of target collision volumes are located are queried, and a first target scene block and a second target scene block are determined among the plurality of target scene blocks. The height of the first target scene block is the largest of the plurality of target scene blocks, and the height of the second target scene block is the smallest; that is, the highest and lowest scene blocks are determined among the plurality of target scene blocks.
Next, the first target scene block is divided into a preset number of block units, and a target block unit is determined among the preset number of block units, the height of the target block unit being the largest among them. That is, the highest scene block is divided further; for example, the preset number may be 10, in which case the highest scene block is divided into 10 block units and the highest of the 10 block units is determined as the target block unit.
Finally, the height of the target block unit is compared with a preset height, and the height parameter of the scene block is generated according to the comparison result. Specifically, when the comparison result indicates that the height of the target block unit is smaller than the preset height, the subdivision is already fine enough, so the height of the most recently queried first target scene block is returned, a first difference between the height of the first target scene block and the height of the second target scene block is calculated, and the height parameter is generated according to the first difference. Conversely, when the comparison result indicates that the height of the target block unit is greater than or equal to the preset height, the target block unit still needs to be divided further: the target block unit is divided into the preset number of secondary block units, and it is judged whether the height of the highest secondary block unit among them is smaller than the preset height; if not, the highest secondary block unit is divided again, and so on, until a designated unit whose height is smaller than the preset height is determined. A third difference between the unit height of the designated unit and the height of the second target scene block is then calculated, and the height parameter is generated according to the third difference. In other words, the division is performed in a loop until a unit lower than the preset height is found, and the height parameter is then generated.
By repeatedly performing the above-described process of generating the height parameter, the height parameter can be generated for each scene block, respectively.
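The looped subdivision described above can be sketched as follows. This is a hand-written approximation: is_occupied stands in for the engine's collision query (for example Unity's Physics.OverlapBox), and the return value approximates the difference between the highest occupied point and the height of the lowest target scene block.

```python
SPLIT = 10  # the preset number of units per subdivision, per the example above

def height_parameter(lo, hi, lowest, is_occupied, min_height):
    """Approximate a scene block's height parameter by repeatedly splitting
    the highest occupied span [lo, hi) into SPLIT units until the highest
    occupied unit is thinner than min_height (the preset height)."""
    while hi - lo >= min_height:
        step = (hi - lo) / SPLIT
        for i in reversed(range(SPLIT)):  # scan units from the top down
            u_lo, u_hi = lo + i * step, lo + (i + 1) * step
            if is_occupied(u_lo, u_hi):
                lo, hi = u_lo, u_hi  # descend into the highest occupied unit
                break
        else:
            return None  # no collision volume anywhere in this block
    return hi - lowest  # highest occupied point minus the lowest block height
```

For a solid column reaching up to height 1234 inside a 2000-metre block, for example, the loop narrows the topmost occupied unit down to roughly height 1234 and returns approximately that value as the height parameter.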
And thirdly, baking the scene models based on the height parameters corresponding to the scene blocks to generate a scene height map of the original game scene.
After the height parameter of each scene block is determined, the plurality of scene models can be baked based on the height parameters corresponding to the plurality of scene blocks to generate the scene height map of the original game scene. The height map is thus generated by partitioning the scene and applying collision detection to the scene blocks, and is subsequently used to process the occlusion relationships between the solid models in the original game scene.
202. At least one model local area is created for each scene model in the original game scene.
In the embodiment of the present application, some scene models have self-occlusion relationships. For example, some buildings are designed elaborately: the roof is constructed from a plurality of inclined surfaces, and there are occlusion relationships between these surfaces, which constitute the self-occlusion relationship of the scene model. The self-occlusion relationship affects the addition of scene elements such as accumulated snow and accumulated rain, so that on some surfaces the scene elements accumulate little or not at all. Therefore, in the embodiment of the present application, at least one model local area surface is created for each scene model in the original game scene, and the self-occlusion relationship of the scene model is baked based on the model local area surfaces, so as to determine the positions on the scene model where scene elements need to be added and the positions where they do not. Specifically, the process of creating the at least one model local area surface is as follows:
for each scene model in the original game scene, at least one local area creation parameter is obtained, the local area creation parameter indicating the direction, angle and area of the model local area surface to be created for the scene model. Then, the at least one model local area surface is created on top of the scene model according to the at least one local area creation parameter. By repeating this creation process, at least one model local area surface is created for each scene model respectively. For example, assuming the at least one local area creation parameter indicates that 8 local area surfaces, each tilted 30° and oriented vertically downward, are to be created on top of the scene model, the at least one local area surface shown in fig. 2B may be created on top of the scene model according to that parameter.
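A hedged sketch of this creation step is given below: eight surfaces ringed around the model top, each tilted 30 degrees with its normal pointing mostly downward toward the model, as in the example above. The ring layout of a given radius and all names are assumptions of this sketch, not the patented procedure.

```python
import math

def create_local_surfaces(top_center, radius, count=8, tilt_deg=30.0):
    """Return (position, normal) pairs for `count` local area surfaces placed
    in a ring above a model top, each tilted tilt_deg away from vertical."""
    surfaces = []
    for k in range(count):
        azimuth = 2.0 * math.pi * k / count
        pos = (top_center[0] + radius * math.cos(azimuth),
               top_center[1],
               top_center[2] + radius * math.sin(azimuth))
        t = math.radians(tilt_deg)
        # unit normal tilted tilt_deg away from straight down
        normal = (math.sin(t) * math.cos(azimuth),
                  -math.cos(t),
                  math.sin(t) * math.sin(azimuth))
        surfaces.append((pos, normal))
    return surfaces
```

Each normal is unit length and its vertical component is -cos(30°), i.e. the surfaces face predominantly downward toward the model they sample.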
203. Adding target scene elements to the original game scene based on the scene height map and the at least one model local surface of each scene model to form at least one element area.
In the embodiment of the present application, after the scene height map describing the occlusion relationship between models and the at least one model local area surface describing the self-occlusion relationship of each scene model are determined, target scene elements can be added to the original game scene based on the scene height map and the at least one model local area surface of each scene model to form at least one element area.
Specifically, for each scene model of the plurality of scene models, the target scene element is first constructed on the at least one model local area surface of the scene model; assuming the target scene element is snow, snow particles are constructed on the at least one model local area surface. Then, the simulated target scene elements are emitted toward the scene model along the normal direction of the at least one model local area surface, and the target scene elements are controlled to hit the scene model to form at least one initial area. It should be noted that vertices or areas of the scene model closer to the sky may accumulate more snow or rain than vertices or areas farther from the sky; therefore, when the target scene elements are emitted from the at least one local area surface, the distance between each vertex of the scene model and the local area surface may be detected, and the number of target scene elements added may be controlled according to that distance.
Then, the designated scene block of the at least one initial area in the scene height map is queried, the block size of the designated scene block is increased to a preset block size in the scene height map, and sampling results of the at least one initial area against the enlarged designated scene block are detected. The block size of the designated scene block is increased because, when particles of the target scene element lie at the boundary of a scene block, different height maps would otherwise need to be sampled multiple times to obtain the correct occlusion relationship between models. Increasing the block size expands the designated scene block, so that when a scene model in the designated scene block extends beyond it, the block and its height map can be switched directly, and only one map needs to be sampled to obtain the correct occlusion relationship between models. Referring to fig. 2C, the dashed line represents the real division between scene blocks A and B when the block size is not increased, and the left boundary of B after the size is increased is shown in the figure. Finally, since the occlusion relationship between scene models is determined from the sampling result of the height map, while the initial areas constructed from the local area surfaces do not yet take that occlusion relationship into account, the occluded partial areas in the at least one initial area are determined according to the sampling result, and the target scene elements rendered in those partial areas are cleared to obtain the at least one element area.
Because the target scene elements on surfaces occluded by other scene models are removed, elements no longer accumulate on those surfaces, which ensures that the target scene elements conform to actual natural conditions.
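The boundary handling in the sampling step above can be sketched as follows: each designated scene block's bounds are grown by a margin, so a particle slightly past an edge still samples the same block's height map. Only one horizontal axis is shown for brevity, and the margin value is an assumption of this sketch.

```python
def expanded_bounds(index, block_width, margin):
    """Bounds of a designated scene block after its size is increased."""
    return index * block_width - margin, (index + 1) * block_width + margin

def still_in_block(x, index, block_width, margin):
    """True if a particle at x can still sample block `index`'s height map."""
    lo, hi = expanded_bounds(index, block_width, margin)
    return lo <= x < hi
```

With 100-unit blocks and a 10-unit margin, a particle at x = 105 still samples block 0's height map rather than forcing a second sample from block 1.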
204. Element effect parameters of the plurality of element areas are determined.
In the embodiment of the present application, after the plurality of element areas are determined, the depth, thickness, range and so on of the elements differ from position to position in the game scene; for example, the closer an area is to the sky of the game scene, the more obvious its element accumulation effect, so the accumulation effect at the top of the game scene is greater than at the bottom. It is therefore necessary to determine the element effect parameters of the plurality of element areas and subsequently interpolate the element effect parameters into the corresponding element areas. The element effect parameters are determined as follows:
for each element area in the plurality of element areas, the target scene model covered by the element area is first determined, the angle difference between the surface normal of the target scene model and the spatial vertical axis is queried, the area top parameter, the area bottom parameter and the angle difference of the element area are calculated together, and the influence degree parameter of the element area is output. Specifically, the influence degree parameter may be output using code of the form half level = _SnowProgress * lerp(_SnowTopAmount, _SnowBottomAmount, saturate(normalWS.y)), where half level represents the influence degree parameter, _SnowTopAmount represents the area top parameter, _SnowBottomAmount represents the area bottom parameter, and saturate(normalWS.y) represents the angle difference between the surface normal and the vertical axis.
Then, the area range corresponding to the angle difference is determined, the angle difference and the area range are calculated together, and the element coverage of the element area is output. Specifically, code of the form half snowRange = saturate(normalWS.y + _SnowRange) may be used, where half snowRange represents the element coverage, normalWS.y represents the angle difference, and _SnowRange represents the area range, so that the influence of normalWS.y on the element accumulation range is controlled through the _SnowRange variable.
Next, the influence degree parameter, the element coverage and the element color data of the target scene element are calculated together to obtain the element thickness parameter. Specifically, the element thickness parameter may be output using code of the form half snowHeight = lerp(0.0, 1.0, level * colorMask * snowRange), where half snowHeight represents the element thickness parameter, level represents the influence degree parameter, colorMask represents the element color data, and snowRange represents the element coverage.
Finally, the influence degree parameter, the element coverage and the element thickness parameter are multiplied together to obtain a product result, and the product of that result and a preset coefficient is calculated as the element effect parameter of the element area. Specifically, the element effect parameter may be output using code of the form half lerpValue = snowRange * level * snowHeight * 5.0, where half lerpValue represents the element effect parameter, snowRange represents the element coverage, level represents the influence degree parameter, snowHeight represents the element thickness parameter, and 5.0 represents the preset coefficient.
By repeatedly executing the above-described process, the element effect parameter can be calculated for each of the plurality of element regions, respectively, to obtain the element effect parameters of the plurality of element regions.
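The four calculations above can be followed more easily in plain Python. saturate and lerp are rewritten as helpers, snow_progress plays the role of the _SnowProgress variable, and the argument names mirror the shader parameters; this is an illustrative transcription under those assumptions, not the patented shader itself.

```python
def clamp01(v):          # plays the role of HLSL saturate()
    return max(0.0, min(1.0, v))

def lerp(a, b, t):       # plays the role of HLSL lerp()
    return a + (b - a) * t

def element_effect(normal_y, snow_top, snow_bottom, range_bias, color_mask,
                   snow_progress=1.0, coeff=5.0):
    """Chain the influence degree, coverage and thickness calculations into
    one element effect parameter for an element area."""
    level = snow_progress * lerp(snow_top, snow_bottom, clamp01(normal_y))
    snow_range = clamp01(normal_y + range_bias)        # element coverage
    snow_height = lerp(0.0, 1.0, level * color_mask * snow_range)  # thickness
    return snow_range * level * snow_height * coeff    # element effect param
```

Under these assumptions, an upward-facing area (normal_y = 1) with a full colour mask and bottom amount 1.0 yields the maximum effect parameter 5.0, while a vertical surface (normal_y = 0) yields 0.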
205. The element effect parameters are interpolated and written into the plurality of element areas to obtain a target game scene baked with the target scene elements.
In the embodiment of the present application, after the element effect parameters are determined, they are interpolated and written into the plurality of element areas, so that the target game scene baked with the target scene elements can be obtained. After the interpolation of the element effect parameters is determined, the required Albedo (reflectivity), Smoothness, normal and height may be generated in an element resource library such as Substance Designer according to the element effect parameters, with Smoothness stored in the a channel and height in the b channel of the former, and the corresponding values of the plurality of element areas may then be interpolated.
It should be noted that, when performing the interpolation, the Metallic option in the game development tool needs to be interpolated toward 0.0, so as to reduce the metallic appearance of the scene model under the interpolated target scene elements and avoid the distortion caused by an excessively metallic scene model.
206. Indirect light in the target game scene is adjusted.
In the embodiment of the present application, the target scene element has its own element color, and after the target scene elements are accumulated in the target game scene, the scene models in the target game scene basically show the color of the target scene element. Moreover, after the ambient light bounces off the surface of the accumulated target scene elements, its brightness is obviously higher than on the previously unrendered surface. The indirect light in the target game scene is therefore adjusted to brighten the ambient light in the target game scene. Specifically, the indirect light can be adjusted in two ways:
the first way is to split indirect light in the target game scene to obtain a reflection coefficient and an illumination coefficient, determine weather color data of scene weather related to the target scene element, calculate a first product of the weather color data and the reflection coefficient, calculate a second product of the element color data of the target scene element and the illumination coefficient, calculate a sum of the first product and the second product, and render scene illumination for the target game scene according to the sum. I.e. the ambient light, which changes in real time at runtime, the Environment Lighting is set to Gradient in the game development tool and the tristimulus values are set to grey values, thus splitting the indirect light into ambient light brightness values reflection coefficients + model surface colors illumination coefficients. And when the system runs, acquiring real-time weather color data, calculating a reflection coefficient and an element color data illumination coefficient of the weather color data as the color of the target indirect light, and performing interpolation rendering of the indirect light by using the variable _ Snowprogress mentioned in the step 204 as an interpolation variable.
In the second way, the influence degree parameters of the plurality of element areas are determined, the indirect light is fitted according to the influence degree parameters to obtain a designated color brightness, and the designated color brightness is interpolation-rendered into the target game scene. That is, a Skybox (sky box) is baked to obtain the color of the ambient light, and the indirect light is interpolated to the designated color brightness according to the accumulation degree of the target scene elements.
The method provided by the embodiment of the present application generates a scene height map describing the height relationship of a plurality of scene models in an original game scene, and creates at least one model local area surface for each scene model in the original game scene. Target scene elements are then added to the original game scene based on the scene height map and the at least one model local area surface of each scene model to form at least one element area, element effect parameters of the element areas are determined, and the element effect parameters are interpolated and written into the element areas to generate a target game scene baked with the target scene elements. Whether the surface of a solid model is occluded, and the range the elements should cover, can thus be judged accurately from the scene height map and the model local area surfaces, and the corresponding scene elements can be added to the solid model directly as they indicate. A large number of maps do not need to be made, which improves scene generation efficiency, facilitates obtaining a correct game scene, and improves the realism of the generated game scene.
Further, as a specific implementation of the method shown in fig. 1, an embodiment of the present application provides a game scene generation apparatus, and as shown in fig. 3A, the apparatus includes: a first generation module 301, a creation module 302, an addition module 303 and a second generation module 304.
The first generating module 301 is configured to generate a scene height map of an original game scene, where the scene height map describes a height relationship of the scene models in the original game scene;
the creating module 302 is configured to create at least one model local area for each scene model in the original game scene;
the adding module 303 is configured to add a target scene element to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area;
the second generating module 304 is configured to determine element effect parameters of the multiple element regions, interpolate and write the element effect parameters into the multiple element regions, and generate a target game scene baked with the target scene elements.
In a specific application scenario, the first generating module 301 is configured to divide the original game scenario into a plurality of scenario blocks according to a preset division size; performing collision detection on each scene block in the plurality of scene blocks in the original game scene to generate a height parameter of each scene block; baking the scene models based on the height parameters corresponding to the scene blocks to generate a scene height map of the original game scene.
In a specific application scenario, the first generating module 301 is configured to perform collision detection on each scene block in the plurality of scene blocks, determine a plurality of target collision volumes of the scene block in the original game scene, and query, among the plurality of scene blocks, the plurality of target scene blocks in which the plurality of target collision volumes are located; determine a first target scene block and a second target scene block among the plurality of target scene blocks, the height of the first target scene block being the largest of the plurality of target scene blocks and the height of the second target scene block being the smallest; divide the first target scene block into a preset number of block units, and determine a target block unit whose height is the largest among the preset number of block units; compare the height of the target block unit with a preset height, and generate the height parameter of the scene block according to the comparison result; and repeat the process of generating the height parameter to generate a height parameter for each scene block respectively.
In a specific application scenario, the first generating module 301 is configured to: when the comparison result indicates that the height of the target block unit is smaller than the preset height, calculate a first difference between the height of the first target scene block and the height of the second target scene block, and generate the height parameter according to the first difference; and when the comparison result indicates that the height of the target block unit is greater than or equal to the preset height, continue to divide the target block unit into the preset number of secondary block units, judge whether the height of the highest secondary block unit among them is smaller than the preset height, and divide the highest secondary block unit again if not, until a designated unit whose height is smaller than the preset height is determined, then calculate a third difference between the unit height of the designated unit and the height of the second target scene block, and generate the height parameter according to the third difference.
In a specific application scenario, the creating module 302 is configured to obtain, for each scene model in the original game scenario, at least one local area creation parameter, where the local area creation parameter indicates a direction, an angle, and an area of a model local area created for the scene model; creating the at least one model local area surface on the top of the scene model according to the at least one local area surface creation parameter; and repeatedly executing the creation process of the model local area surface, and respectively creating the at least one model local area surface for each scene model.
In a specific application scenario, the adding module 303 is configured to, for each of the plurality of scene models, construct the target scene element on the at least one model local area surface of the scene model; emit the simulated target scene elements toward the scene model along the normal direction of the at least one model local area surface, and control the target scene elements to hit the scene model to form at least one initial area; query the designated scene block of the at least one initial area in the scene height map; increase the block size of the designated scene block to a preset block size in the scene height map, and detect the sampling results of the at least one initial area against the enlarged designated scene block; and determine, according to the sampling results, the occluded partial areas in the at least one initial area, and clear the target scene elements rendered in those partial areas to obtain the at least one element area.
In a specific application scenario, the second generating module 304 is configured to, for each element region of the plurality of element regions, determine a target scene model covered by the element region, and query an angle difference between a surface normal of the target scene model and a spatial longitudinal axis; calculating the area top parameter, the area bottom parameter and the angle difference of the element area, and outputting the influence degree parameter of the element area; determining an area range corresponding to the angle difference, calculating the angle difference and the area range, and outputting an element coverage range of the element area; calculating the influence degree parameter, the element coverage range and the element color data of the target scene element to obtain an element thickness parameter; performing product calculation on the influence degree parameter, the element coverage range and the element thickness parameter to obtain a product result, and calculating a product between the product result and a preset coefficient to serve as an element effect parameter of the element area; and calculating element effect parameters for each element area in the element areas respectively to obtain the element effect parameters of the element areas.
In a specific application scenario, as shown in fig. 3B, the apparatus further includes: a rendering module 305.
The rendering module 305 is configured to split indirect light in the target game scene to obtain a reflection coefficient and an illumination coefficient, determine weather color data of the scene weather related to the target scene element, calculate a first product of the weather color data and the reflection coefficient, calculate a second product of the element color data of the target scene element and the illumination coefficient, calculate the sum of the first product and the second product, and add scene illumination to the target game scene according to the sum; or to determine the influence degree parameters of the plurality of element areas, fit the indirect light according to the influence degree parameters to obtain a designated color brightness, and write the designated color brightness into the target game scene through interpolation.
The device provided by the embodiment of the present application generates a scene height map describing the height relationship of a plurality of scene models in an original game scene, and creates at least one model local area surface for each scene model in the original game scene. Target scene elements are then added to the original game scene based on the scene height map and the at least one model local area surface of each scene model to form at least one element area, element effect parameters of the element areas are determined, and the element effect parameters are interpolated and written into the element areas to generate a target game scene baked with the target scene elements. Whether the surface of a solid model is occluded, and the range the elements should cover, can thus be judged accurately from the scene height map and the model local area surfaces, and the corresponding scene elements can be added to the solid model directly as they indicate. A large number of maps do not need to be made, which improves scene generation efficiency, facilitates obtaining a correct game scene, and improves the realism of the generated game scene.
It should be noted that other corresponding descriptions of the functional units related to the game scene generating device provided in the embodiment of the present application may refer to the corresponding descriptions in fig. 1 and fig. 2A, and are not repeated herein.
In an exemplary embodiment, referring to fig. 4, a computer device is further provided, including a communication bus, a processor, a memory and a communication interface, and further including an input/output interface and a display device, where the functional units can communicate with one another through the bus. The memory stores a computer program, and the processor executes the program stored in the memory to perform the game scene generation method in the above embodiment.
A computer-readable storage medium is also provided, on which a computer program is stored which, when executed by a processor, implements the steps of the game scene generation method.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by hardware, or by software plus a necessary general hardware platform. Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the method according to the implementation scenarios of the present application.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application.
Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios.
The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (9)

1. A game scene generation method, comprising:
generating a scene height map of an original game scene, wherein the scene height map describes the height relation of a plurality of scene models in the original game scene;
creating at least one model local area for each scene model in the original game scene;
adding target scene elements to the original game scene based on the scene height map and the at least one model local area of each scene model to form at least one element area;
determining element effect parameters of a plurality of element areas, and interpolating and writing the element effect parameters into the plurality of element areas to generate a target game scene baked with the target scene elements;
wherein the generating of the scene height map of the original game scene comprises: dividing the original game scene into a plurality of scene blocks according to a preset division size; for each scene block in the plurality of scene blocks, performing collision detection on the scene block, determining a plurality of target collision volumes of the scene block in the original game scene, and querying the plurality of target scene blocks in which the plurality of target collision volumes are located; determining a first target scene block with a largest height and a second target scene block with a smallest height among the plurality of target scene blocks; dividing the first target scene block into a preset number of block units, and determining a target block unit with the maximum height among the preset number of block units; comparing the height of the target block unit with a preset height, and generating a height parameter of the scene block according to a comparison result; and baking the plurality of scene models based on the height parameters corresponding to the plurality of scene blocks to generate the scene height map of the original game scene.
2. The method of claim 1, wherein the comparing the height of the target block unit with a preset height and generating the height parameter of the scene block according to the comparison result comprises:
when the comparison result indicates that the height of the target block unit is smaller than the preset height, calculating a first difference value between the height of the first target scene block and the height of the second target scene block, and generating the height parameter according to the first difference value;
when the comparison result indicates that the height of the target block unit is greater than or equal to the preset height, further dividing the target block unit into the preset number of secondary block units, judging whether the height of the tallest secondary block unit among the preset number of secondary block units is smaller than the preset height, and continuing to divide the tallest secondary block unit until a designated unit whose height is smaller than the preset height is determined; and calculating a third difference value between the unit height of the designated unit and the height of the second target scene block, and generating the height parameter according to the third difference value.
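The repeated subdivision in claim 2 is, in effect, a loop that shrinks the tallest unit until it drops below the preset height. A minimal sketch, assuming each subdivision yields a tallest secondary unit that is a scaled-down copy of its parent (the scale factor is an assumption for illustration only):

```python
def height_parameter(unit_height, second_block_height, preset_height, shrink=0.5):
    """Keep subdividing the tallest block unit until the tallest secondary
    unit falls below the preset height, then return its difference against
    the second (lowest) target scene block."""
    while unit_height >= preset_height:
        # Subdivide: model the tallest secondary unit as a scaled-down
        # copy of its parent (a simplifying assumption for this sketch).
        unit_height *= shrink
    # The "third difference value" between the designated unit and the
    # second target scene block becomes the height parameter.
    return unit_height - second_block_height
```

With a starting unit height of 16, a lowest block of 1, and a preset height of 5, the loop halves 16 to 8 and then to 4 before returning 3.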
3. The method of claim 1, wherein creating at least one model local area surface for each scene model in the original game scene comprises:
for each scene model in the original game scene, acquiring at least one local area surface creation parameter, wherein the local area surface creation parameter indicates the direction, the angle and the area of a model local area surface created for the scene model;
creating the at least one model local area surface on the top of the scene model according to the at least one local area surface creation parameter;
and repeatedly executing the creation process of the model local area surface, and respectively creating the at least one model local area surface for each scene model.
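Claim 3's creation parameter (direction, angle, area) can be read as defining an oriented planar patch on the model top. A hedged sketch of one such surface, where the representation as an origin/normal/area record and the spherical-coordinate construction of the normal are my assumptions, not the patent's:

```python
import math

def create_local_surface(model_top, direction_deg, angle_deg, area):
    """Create one model local area surface on top of a scene model from a
    creation parameter giving its direction, tilt angle, and area."""
    a = math.radians(angle_deg)      # tilt away from the vertical axis
    d = math.radians(direction_deg)  # rotation about the vertical axis
    normal = (math.sin(a) * math.cos(d),
              math.sin(a) * math.sin(d),
              math.cos(a))
    return {"origin": model_top, "normal": normal, "area": area}
```

A zero tilt angle yields a surface whose normal points straight up, i.e. a flat patch on the model top.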
4. The method of claim 1, wherein adding a target scene element to the original game scene based on the scene height map and the at least one model local area surface of each scene model to form at least one element region comprises:
for each scene model of the plurality of scene models, constructing the target scene element on the at least one model local area surface of the scene model;
simulating emission of the target scene elements toward the scene model along the normal direction of the at least one model local area surface, and controlling the target scene elements to hit the scene model to form at least one initial area;
querying a designated scene block of the at least one initial region in the scene height map;
increasing the block size of the designated scene block to a preset block size in the scene height map, and detecting a sampling result of the at least one initial area against the designated scene block after the size is increased;
and according to the sampling result, determining an occluded partial area in the at least one initial area, and clearing the target scene elements rendered in the partial area in the at least one initial area to obtain the at least one element area.
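The occlusion-clearing step of claim 4 can be approximated as a visibility test of each initial region against its designated scene block in the height map. This is a coarse stand-in: the record layout, the single-height-per-block model, and treating the grown block size as an added occluder height are all assumptions, whereas the patent samples the height map texture itself.

```python
def clear_occluded(initial_regions, height_map, grow_to):
    """Keep only element regions not occluded by their designated scene
    block once that block has been grown for sampling; occluded regions
    have their target scene elements cleared (dropped here)."""
    kept = []
    for region in initial_regions:
        block_height = height_map[region["block"]]
        # Growing the designated block raises the sampled occluder height
        # in this simplified model.
        if region["height"] >= block_height + grow_to:
            kept.append(region)  # visible: keep as an element area
    return kept
```

Given a block of height 3 and a growth of 1, a region at height 5 survives while a region at height 2 is cleared.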
5. The method of claim 1, wherein determining the element effect parameters for the plurality of element regions comprises:
for each element area in the plurality of element areas, determining a target scene model covered by the element area, and querying the angle difference between the surface normal of the target scene model and the vertical axis of the scene space;
calculating the area top parameter, the area bottom parameter and the angle difference of the element area, and outputting the influence degree parameter of the element area;
determining an area range corresponding to the angle difference, calculating the angle difference and the area range, and outputting an element coverage range of the element area;
calculating the influence degree parameter, the element coverage range and the element color data of the target scene element to obtain an element thickness parameter;
performing product calculation on the influence degree parameter, the element coverage range and the element thickness parameter to obtain a product result, and calculating a product between the product result and a preset coefficient to serve as an element effect parameter of the element area;
and calculating element effect parameters for each element area in the element areas respectively to obtain the element effect parameters of the element areas.
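The parameter chain of claim 5 ends in a straightforward product. A minimal sketch under stated assumptions: the linear blend of the area-top and area-bottom parameters by normalized angle difference is my reading of the "influence degree" step, and only the final multiply-by-a-preset-coefficient step is taken directly from the claim.

```python
import math

def influence_degree(top_param, bottom_param, surface_normal):
    """Blend an area-top and area-bottom parameter by the angle difference
    between the surface normal and the vertical axis (assumed linear)."""
    up = (0.0, 0.0, 1.0)
    dot = sum(a * b for a, b in zip(surface_normal, up))
    angle = math.acos(max(-1.0, min(1.0, dot)))  # angle difference
    t = min(angle / (math.pi / 2), 1.0)          # 0 = flat top, 1 = vertical
    return top_param * (1.0 - t) + bottom_param * t

def element_effect_parameter(influence, coverage, thickness, coefficient):
    """Final step of claim 5: product of the influence degree parameter,
    element coverage range and element thickness, scaled by a preset
    coefficient."""
    return influence * coverage * thickness * coefficient
```

An upward-facing surface takes the full area-top parameter, and factors (2, 3, 4) with a coefficient of 0.5 yield an effect parameter of 12.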
6. The method of claim 1, further comprising:
splitting indirect light in the target game scene to obtain a reflection coefficient and an illumination coefficient, determining weather color data of scene weather related to the target scene element, calculating a first product of the weather color data and the reflection coefficient, calculating a second product of the element color data of the target scene element and the illumination coefficient, calculating a sum of the first product and the second product, and rendering scene illumination for the target game scene according to the sum; or
determining the influence degree parameters of the element areas, fitting the indirect light according to the influence degree parameters to obtain appointed color brightness, and performing interpolation rendering on the appointed color brightness to the target game scene.
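The first branch of claim 6 is a per-channel weighted sum of two colors. A self-contained sketch (the RGB-tuple representation is an assumption; the two-coefficient split and the sum come from the claim):

```python
def indirect_light(weather_color, element_color, reflection, illumination):
    """Claim 6, first branch: weather color weighted by the reflection
    coefficient plus element color weighted by the illumination
    coefficient, summed per channel."""
    return tuple(w * reflection + e * illumination
                 for w, e in zip(weather_color, element_color))
```

For instance, a red weather tint with reflection 0.5 and a green element color with illumination 0.25 combine into (0.5, 0.25, 0.0).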
7. A game scene generation apparatus, comprising:
a first generation module, configured to generate a scene height map of an original game scene, wherein the scene height map describes the height relation of a plurality of scene models in the original game scene;
a creating module, configured to create at least one model local area surface for each scene model in the original game scene;
an adding module, configured to add a target scene element to the original game scene based on the scene height map and the at least one model local area surface of each scene model to form at least one element region;
a second generation module, configured to determine element effect parameters of a plurality of element areas, interpolate the element effect parameters, and write them into the plurality of element areas to generate a target game scene baked with the target scene elements;
wherein the first generation module is specifically configured to: divide the original game scene into a plurality of scene blocks according to a preset division size; for each scene block in the plurality of scene blocks, perform collision detection on the scene block, determine a plurality of target collision volumes of the scene block in the original game scene, and query a plurality of target scene blocks in which the plurality of target collision volumes are located; determine a first target scene block with the largest height and a second target scene block with the smallest height among the plurality of target scene blocks; divide the first target scene block into a preset number of block units, and determine the target block unit with the maximum height among the preset number of block units; compare the height of the target block unit with a preset height, and generate a height parameter of the scene block according to the comparison result; and bake the plurality of scene models based on the height parameters corresponding to the plurality of scene blocks to generate the scene height map of the original game scene.
8. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 6 when executing the computer program.
9. A readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
CN202110735353.6A 2021-06-30 2021-06-30 Game scene generation method and device, computer equipment and readable storage medium Active CN113470169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110735353.6A CN113470169B (en) 2021-06-30 2021-06-30 Game scene generation method and device, computer equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN113470169A CN113470169A (en) 2021-10-01
CN113470169B true CN113470169B (en) 2022-04-29

Family

ID=77874394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110735353.6A Active CN113470169B (en) 2021-06-30 2021-06-30 Game scene generation method and device, computer equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113470169B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114285888B (en) * 2021-12-22 2023-08-18 福建天晴数码有限公司 Method and system for realizing seamless world

Citations (6)

Publication number Priority date Publication date Assignee Title
CN110262865A (en) * 2019-06-14 2019-09-20 网易(杭州)网络有限公司 Construct method and device, the computer storage medium, electronic equipment of scene of game
CN110706324A (en) * 2019-10-18 2020-01-17 网易(杭州)网络有限公司 Method and device for rendering weather particles
CN111870952A (en) * 2020-07-24 2020-11-03 上海米哈游天命科技有限公司 Height map generation method, device, equipment and storage medium
CN112023400A (en) * 2020-07-24 2020-12-04 上海米哈游天命科技有限公司 Height map generation method, device, equipment and storage medium
CN112102492A (en) * 2020-08-21 2020-12-18 完美世界(北京)软件科技发展有限公司 Game resource manufacturing method and device, storage medium and terminal
CN112802165A (en) * 2020-12-31 2021-05-14 珠海剑心互动娱乐有限公司 Game scene snow accumulation rendering method, device and medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN112891946B (en) * 2021-03-15 2024-05-28 网易(杭州)网络有限公司 Game scene generation method and device, readable storage medium and electronic equipment



Similar Documents

Publication Publication Date Title
CN111105491B (en) Scene rendering method and device, computer readable storage medium and computer equipment
US8223148B1 (en) Method and apparatus for computing indirect lighting for global illumination rendering in 3-D computer graphics
CN109771951B (en) Game map generation method, device, storage medium and electronic equipment
CN111968216B (en) Volume cloud shadow rendering method and device, electronic equipment and storage medium
US8207968B1 (en) Method and apparatus for irradiance caching in computing indirect lighting in 3-D computer graphics
US9619920B2 (en) Method and system for efficient modeling of specular reflection
CN110738721A (en) Three-dimensional scene rendering acceleration method and system based on video geometric analysis
US20130120385A1 (en) Methods and Apparatus for Diffuse Indirect Illumination Computation using Progressive Interleaved Irradiance Sampling
US20050041024A1 (en) Method and apparatus for real-time global illumination incorporating stream processor based hybrid ray tracing
US20100136507A1 (en) Driving simulation apparatus, wide-angle camera video simulation apparatus, and image deforming/compositing apparatus
CN111968215A (en) Volume light rendering method and device, electronic equipment and storage medium
CN102446365B (en) Estimate virtual environment a bit on the method for light quantity that receives
CN113457137B (en) Game scene generation method and device, computer equipment and readable storage medium
CN112102492B (en) Game resource manufacturing method and device, storage medium and terminal
CN113470169B (en) Game scene generation method and device, computer equipment and readable storage medium
CN111870953A (en) Height map generation method, device, equipment and storage medium
CN112973121B (en) Reflection effect generation method and device, storage medium and computer equipment
CN115487495A (en) Data rendering method and device
US9311747B2 (en) Three-dimensional image display device and three-dimensional image display program
CN116958457A (en) OSGEarth-based war misting effect drawing method
CN115035231A (en) Shadow baking method, shadow baking device, electronic apparatus, and storage medium
CN115803782A (en) Augmented reality effect of perception geometry with real-time depth map
CN112473135A (en) Real-time illumination simulation method, device, equipment and storage medium for mobile game
CN116824082B (en) Virtual terrain rendering method, device, equipment, storage medium and program product
CN116266374A (en) Image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant