CN114119836A - Rendering method, rendering device, electronic equipment and readable storage medium - Google Patents
- Publication number: CN114119836A (application number CN202111469649.4A)
- Authority: CN (China)
- Prior art keywords: terrain, prosthesis, scene, map, rendered
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
Abstract
The application discloses a rendering method, a rendering apparatus, an electronic device, and a readable storage medium. The method comprises: receiving a terrain prosthesis generated from real terrain and generating the material and the map of the terrain prosthesis, the terrain prosthesis being a static mesh; dividing the scene to be rendered into a near-view terrain and a far-view terrain according to the position information of a target virtual character in the scene to be rendered; generating a checkpoint simplified model from the real terrain corresponding to the near-view terrain; and fusing the checkpoint simplified model with the terrain prosthesis to render the scene to be rendered. While preserving both the far-view effect and the terrain effect, the method uses less memory and reduces IO frequency, striking a balance between visual quality and performance overhead. For players, across mobile devices of differing capability, memory and performance costs are reduced as far as possible while a reasonably ideal effect is maintained, greatly improving the gaming experience.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular to a rendering method, a rendering apparatus, an electronic device, and a readable storage medium.
Background
As open-world games grow in popularity, the terrain in game scenes keeps getting larger, and memory and rendering costs rise noticeably. This is especially true for mobile games: the terrain in the scene is large, and although the far-view portion contributes little visible detail, it still incurs a substantial performance cost.
In the prior art, large scenes are usually rendered with the UE4 engine: near-view terrain uses Landscape (a rendering feature of UE4; the name is not an industry-wide term), while far-view terrain can be implemented with LOD (Level of Detail, a rendering technique in UE4) — simply put, a small piece of terrain is baked into a simple static mesh and stored in the checkpoint (level).
However, the UE4 engine has several drawbacks. For example, UE4's default terrain LOD splits the terrain into multiple models that are then stitched together; topological continuity cannot be guaranteed, so the seams between land parcels are visible. For another example, the UE4 engine is ill-suited to large scenes of 4K and above: although LOD reduces model complexity, the large increase in the number of parcels makes Input/Output (IO) very frequent, increasing IO consumption and imposing a large performance overhead.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a rendering method, a rendering apparatus, an electronic device, and a readable storage medium, so as to overcome, at least in part, the deficiencies of the prior art.
In a first aspect, a rendering method is provided, including:
receiving a terrain prosthesis generated according to real terrain, and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static grid;
according to the position information of a target virtual character in a scene to be rendered, dividing the scene to be rendered into a near view terrain and a far view terrain;
generating a checkpoint simplified model according to the real terrain corresponding to the close-range terrain;
and fusing the checkpoint simplified model and the terrain prosthesis to realize the rendering of the scene to be rendered.
In a second aspect, there is provided a rendering apparatus, the apparatus comprising:
the first generation unit is used for receiving a terrain prosthesis generated according to real terrain and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static mesh;
the segmentation unit is used for segmenting the real terrain of the scene to be rendered into a near-view terrain and a far-view terrain according to the position information of the target virtual character in the scene to be rendered;
the second generation unit is used for generating a checkpoint simplified model according to a real terrain corresponding to a close-range terrain;
and the fusion unit is used for fusing the checkpoint simplified model and the terrain prosthesis so as to realize the rendering of the scene to be rendered.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor; and a memory arranged to store computer executable instructions that, when executed, cause the processor to perform any of the methods described above.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform any of the methods described above.
The embodiment of the application adopts at least one technical scheme which can achieve the following beneficial effects:
according to the method and the device, the game scene is divided into the near-view terrain and the far-view terrain according to the position of the target virtual character in the game scene, the far-view terrain is replaced by the terrain prosthesis, and the near-view terrain is replaced by the terrain simple model according to the real terrain, so that the whole game scene is rendered. On the premise of ensuring the overall effect, the memory demand and the frequency of IO Input/Output (Input/Output) are greatly reduced, and the balance between the effect and the performance overhead is achieved; for the player, aiming at different mobile device performances of the player, under the condition of ensuring a relatively ideal effect, the memory and performance overhead is reduced as much as possible, and the game experience is greatly improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 shows a flow diagram of a rendering method according to an embodiment of the application;
FIG. 2 shows a flow diagram of a rendering method according to another embodiment of the present application;
FIG. 3 shows a schematic structural diagram of a rendering apparatus according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
In the prior art, the UE4 engine is usually adopted to render game scenes. UE4's LOD divides the terrain into multiple parcels, each parcel generating a new static mesh according to its LOD parameters, and the resulting models are stitched into a far-view terrain; topological continuity cannot be guaranteed, the player can see gaps between the models, and the game experience suffers. In an open world, the terrain is expected to reach 4K (a resolution whose horizontal pixel count per row reaches or approaches 4096, regardless of aspect ratio) or even larger, so a great many parcels exist; when a player moves at high speed, loading and unloading are triggered frequently, generating heavy IO consumption, occupying much of the device's performance, and even causing stutter.
The core idea of the present application, in view of the above, is as follows: first, according to the real terrain submitted by the art team, a lower-precision terrain prosthesis is generated to replace the far-view terrain; it is then fused with the near-view terrain rendered by the simplified-terrain-model technique, so that the entire scene is rendered. This preserves the display effect of the distant terrain without having to load the LODs of distant parcels, significantly reducing IO, and is particularly suitable for large scenes of 4K and above.
Fig. 1 shows a schematic flowchart of a rendering method according to an embodiment of the present application, and as can be seen from fig. 1, the present application at least includes steps S110 to S140:
step S110: receiving a terrain prosthesis generated according to real terrain, and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static grid.
The real terrain of a game scene — the visual experience game developers want to give players — is typically sculpted by art personnel. In practice, however, scenes are very large and resolution requirements keep rising; rendering an entire large scene with a uniformly high-precision method would require enormous computation and place very high demands on hardware performance, which is not a viable path.
In the prior art, to avoid this situation, the UE4 engine is used to render the scene: the terrain is divided into multiple parcels, each parcel generates a new static mesh according to its LOD parameters, and the models are stitched into a far-view terrain. Topological continuity cannot be guaranteed, the player can see gaps between the models, and the game experience suffers; nor is this approach suitable for large, high-resolution scenes.
The rendering method of the present application addresses these problems well. First, a terrain prosthesis generated from the real terrain is received; the terrain prosthesis is a static mesh, and a static mesh body serves as the model of the far-view terrain prosthesis. In Unreal Engine, a static mesh is the basic unit used to build checkpoint (level) world geometry, usually taking the form of a "Static Mesh Actor". A static mesh can achieve a fairly ideal result, guaranteeing an acceptable visual effect while saving hardware performance overhead.
The terrain prosthesis is generated for the entire scene to be rendered and can remain resident in the scene at all times, so the frequent triggering of parcel-level checkpoint loading and unloading seen in the prior art is avoided, and IO frequency is significantly reduced.
In a terrain prosthesis, a simple shape may be used to represent an "object"; for example, a mountain can be represented by a cone. Of course, because the terrain prosthesis is a static mesh, its material and map must be generated during rendering — that is, the surface of the cone must be covered with vegetation, rocks, and similar features, which can be expressed through materials and maps.
For generating the material and the map, any existing technique may be used, such as the material and map generation methods of the UE4 engine.
Because the terrain prosthesis is a single static mesh generated for the entire scene, the player does not see gaps between models — unlike the prior-art approach using UE4's LOD.
Step S120: dividing the scene to be rendered into a near-view terrain and a far-view terrain according to the position information of the target virtual character in the scene to be rendered.
Dividing the scene to be rendered into a near-view terrain and a far-view terrain considers the distance between the virtual character and parts of the scene. For example, distance may be used as the criterion: a distance threshold is preset, terrain whose distance to the target virtual character is less than or equal to the threshold is classified as near-view terrain, and terrain farther than the threshold is classified as far-view terrain. This division is only an exemplary description; other schemes may be adopted, chosen according to the rendering effect to be achieved.
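The threshold-based split described above can be sketched as follows (a minimal illustration; the tile names, coordinates, and function are hypothetical, not taken from the patent):

```python
import math

def split_terrain(tiles, character_pos, distance_threshold):
    """Classify terrain tiles as near-view or far-view by their distance
    to the target virtual character. `tiles` maps a tile id to its
    (x, y) centre; all names here are illustrative stand-ins."""
    near, far = [], []
    for tile_id, (x, y) in tiles.items():
        d = math.hypot(x - character_pos[0], y - character_pos[1])
        (near if d <= distance_threshold else far).append(tile_id)
    return near, far

tiles = {"hill": (10, 0), "peak": (500, 300), "lake": (30, 40)}
near, far = split_terrain(tiles, character_pos=(0, 0), distance_threshold=100)
# "hill" and "lake" fall inside the 100-unit threshold; "peak" does not
```

In a real engine the distance would be measured per parcel or per chunk rather than per named tile, and the split would be re-evaluated as the character moves.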
Step S130: and generating a checkpoint simplified model according to the real terrain corresponding to the close-range terrain.
The near-view terrain and the far-view terrain are handled in two different ways. For the near-view portion, a higher-precision, better-looking method may be required and can be chosen as needed — for example, the existing multiple Level of Detail (LOD) approach. Since the precision requirement for near-view terrain is relatively high, the LOD level can be set relatively high to meet it. The near-view terrain usually contains many models, and LOD typically renders each model separately; this yields a fairly ideal rendering result but consumes a great deal of hardware performance.
To address this, the above approach may be replaced by merging multiple models into one simplified model via a checkpoint simplified model — for example, with the Hierarchical Level of Detail (HLOD) method, model clusters are first built and the checkpoint simplified model is then generated.
Building the model clusters determines how the models in the scene are grouped, how many groups there are, whether materials are generated, and so on. Reasonable grouping takes full account of spatial layout, viewing frequency, the creator's own placement strategy, and the like, yielding clusters composed of the models in each group.
Once satisfactory model clusters are obtained, the checkpoint simplified model can be generated: a new mesh is generated for each cluster, materials are merged in the process, and a new lightmap is generated (if needed).
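The two steps above — grouping models into clusters, then merging each cluster into one proxy — can be sketched as follows (a crude stand-in for HLOD: real grouping also weighs viewing frequency and the artist's placement strategy, and real proxy generation simplifies geometry and bakes materials; every name here is hypothetical):

```python
from collections import defaultdict

def build_clusters(models, cell_size):
    """Group models into spatial clusters, one per grid cell,
    based on each model's (x, y) position."""
    clusters = defaultdict(list)
    for name, (x, y) in models.items():
        cell = (int(x // cell_size), int(y // cell_size))
        clusters[cell].append(name)
    return dict(clusters)

def generate_proxy(cluster_members, meshes):
    """Merge the member meshes of one cluster into a single combined
    mesh (here: simply concatenated vertex lists), mirroring how a
    checkpoint simplified model merges geometry."""
    proxy = []
    for name in cluster_members:
        proxy.extend(meshes[name])
    return proxy

models = {"rock": (1, 1), "tree": (2, 3), "hut": (40, 41)}
clusters = build_clusters(models, cell_size=10)
# cell (0, 0) holds "rock" and "tree"; cell (4, 4) holds "hut"
```

Grouping purely by grid cell is the simplest policy; an engine would let the maker override groups, which is why the patent mentions the creator's own placement strategy.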
Step S140: fusing the checkpoint simplified model with the terrain prosthesis to realize the rendering of the scene to be rendered.
Finally, the generated checkpoint simplified model is fused with the terrain prosthesis to finish rendering the scene to be rendered.
The fusion method is not limited in this application. For example, the generated checkpoint simplified model may be placed at the corresponding position of the terrain prosthesis, replacing it there, thereby completing the rendering of the entire scene to be rendered.
Thanks to the terrain prosthesis, the high-precision rendering method only needs to be applied to the near-view terrain at this point; the far-view terrain keeps the terrain prosthesis with its generated materials and maps, untouched, thus ensuring the visual effect while reducing performance consumption.
As can be seen from the method shown in fig. 1, the game scene is divided into a near-view terrain and a far-view terrain according to the position of the target virtual character; the far-view terrain is replaced by the terrain prosthesis, and the near-view terrain is rendered with a simplified terrain model generated from the real terrain, so that the entire game scene is rendered. While preserving the overall effect, the memory footprint and IO frequency are greatly reduced, striking a balance between effect and performance overhead. For players, across mobile devices of differing capability, memory and performance costs are reduced as far as possible while a reasonably ideal effect is maintained, greatly improving the gaming experience.
In some embodiments of the application, receiving the terrain prosthesis generated from real terrain comprises: receiving the terrain prosthesis, wherein the terrain prosthesis is generated from the real terrain using Houdini software, and the height of each vertex of the terrain prosthesis is lower than that of the real terrain.
The terrain prosthesis can be understood as one large model proxy for the entire scene — a proxy mesh — usually generated by third-party software. Any existing third-party tool can be used for its generation; the idea is to simplify the real terrain, e.g. reducing a mountain to a cone as mentioned above. The generated prosthesis has lower precision than the real terrain, which reduces the amount of computation; and because the far-view terrain is relatively distant, the reduced precision does not noticeably change its visual appearance.
In some embodiments of the present application, the third-party software is Houdini (a three-dimensional computer graphics package with, for the moment, no unified Chinese name in the industry). Because Houdini is designed around a node-based workflow and provides interfaces to third-party renderers, scenes can be exported to other rendering engines for rendering. Houdini is recommended because its programmable nodes can constrain the height of every vertex of the terrain prosthesis to lie below the real terrain, preventing the proxy from clipping through the real geometry and sparing game players the bad experience such clipping brings.
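The vertex-height constraint attributed above to Houdini's programmable nodes can be sketched as a simple clamp against the real terrain's height field (an illustrative stand-in for the actual node network; `real_height`, the margin, and the tuple-based mesh are assumptions):

```python
def clamp_below_terrain(proxy_vertices, real_height, margin=0.5):
    """Clamp each proxy vertex so it sits strictly below the real
    terrain, preventing the far-view proxy from poking (clipping)
    through the real geometry. `real_height(x, y)` samples the real
    terrain's height at a point."""
    clamped = []
    for x, y, z in proxy_vertices:
        limit = real_height(x, y) - margin
        clamped.append((x, y, min(z, limit)))
    return clamped

# Flat real terrain at height 5.0; the second proxy vertex pokes above it
flat = lambda x, y: 5.0
verts = [(0, 0, 4.0), (1, 0, 6.2)]
clamped = clamp_below_terrain(verts, flat, margin=0.5)
# → [(0, 0, 4.0), (1, 0, 4.5)]
```

The margin keeps the proxy a fixed distance under the real surface, so the near-view terrain can be drawn over it during fusion without z-fighting.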
In some embodiments of the present application, the material and the map may be generated as follows. For the material: receive first generation parameters of the material and generate the material of the terrain prosthesis accordingly, the first generation parameters including material type, material color, material texture, and so on. For the map: receive second generation parameters of the map and generate the map of the terrain prosthesis accordingly, the second generation parameters including the source of the map, the size of the map, the pose of the map, and so on.
For example, the generation parameters of maps and materials may be set manually. For the trunk of a tree model, when choosing the material one can pick the base material and the material texture: if wood is chosen as the base material and pine bark as the texture, the tree model will take on the appearance of a pine. Naturally, for a mountain in the far view there is no need to generate a map and material for each individual tree; the mountain's material is simply set to trees, or to a specific kind of tree. If other objects are present on the mountain — flying hawks, say — they cannot be represented by a material and can instead be displayed via a map, by specifying parameters such as map source, map size, and map pose. Note that maps and materials are typically backed by a substantial database: in use they can be selected and called directly, or, if absent, created and stored in the database for later use.
In other embodiments of the present application, for the generation of the texture and the map, reference may be made to the following methods: and receiving generation parameters of the material and the map, generating a material example corresponding to the terrain prosthesis according to the generation parameters, and generating the material and the map of the terrain prosthesis according to the material example.
Taking materials as an example: generating materials directly may require hundreds of them, inflating the package size. For this reason, in some embodiments of the present application, materials are not used directly; material instances are used instead. A material instance is an instance of a material: it inherits the material and possesses its shader logic, but exposes only adjustable parameters (i.e., Params) and hides the ability to edit the shader. Thus, when some model needs modified parameters, other models using the same material are unaffected; and when the material's logic is updated, all of its instances update accordingly.
A material instance can also integrate multiple materials into one. In this application, because the terrain prosthesis resides in the scene permanently, the material instance can be a static material instance: dragging the generated static material instance onto the terrain prosthesis produces its material and map.
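The parent/instance relationship described above can be sketched in a few lines (a schematic model of the concept, not Unreal's actual API; class and parameter names are invented for illustration):

```python
class Material:
    """Base material: owns the shader logic and the default parameters."""
    def __init__(self, shader, params):
        self.shader = shader
        self.params = dict(params)

class MaterialInstance:
    """Instance of a material: inherits the parent's shader, exposes
    only parameter overrides, and cannot edit the shader itself — so
    hundreds of variants need not duplicate the material."""
    def __init__(self, parent, overrides=None):
        self.parent = parent
        self.overrides = dict(overrides or {})

    def resolved_params(self):
        # Parent defaults, shadowed only by this instance's overrides
        merged = dict(self.parent.params)
        merged.update(self.overrides)
        return merged

rock = Material("terrain_shader", {"color": "grey", "roughness": 0.8})
mossy_rock = MaterialInstance(rock, {"color": "green"})
# mossy_rock resolves to color "green" with roughness 0.8; rock is untouched
```

Because the override is resolved at lookup time, editing the parent's defaults later would propagate to every instance — the update behaviour the text describes.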
In some embodiments of the present application, in the above method, fusing the checkpoint simplified model with the terrain prosthesis comprises: placing the terrain prosthesis at the corresponding position of the scene to be rendered; and, in response to a scene loading instruction, executing the step of generating the checkpoint simplified model from the real terrain corresponding to the near-view terrain.
The terrain prosthesis can be understood as a permanent fixture of the scene: in a persistent scene, it only needs to be placed at its corresponding position. Note that the prosthesis used here is the one whose material and map have already been generated.
While the game runs, when the virtual character reaches a certain position a scene loading instruction is generated, and the near-view terrain can be rendered in response — specifically, by executing the step of generating the checkpoint simplified model from the real terrain corresponding to the near-view terrain. Note that part of the near-view terrain may overlap the terrain prosthesis; for that part, the near-view checkpoint simplified model directly replaces the terrain prosthesis.
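The overlap replacement on scene load can be sketched as follows (the region-keyed dictionary is an illustrative simplification of the scene, not the patent's actual data structure):

```python
def fuse_scene(proxy_regions, simplified_models):
    """Fuse the checkpoint simplified models with the terrain prosthesis:
    wherever a near-view simplified model covers a region it replaces the
    prosthesis there; everywhere else the prosthesis remains visible."""
    fused = dict(proxy_regions)            # far-view prosthesis everywhere
    for region, model in simplified_models.items():
        fused[region] = model              # near-view overrides the overlap
    return fused

proxy = {"north": "proxy_mesh", "south": "proxy_mesh", "east": "proxy_mesh"}
near = {"south": "hlod_model"}             # produced on the scene-load event
scene = fuse_scene(proxy, near)
# → {"north": "proxy_mesh", "south": "hlod_model", "east": "proxy_mesh"}
```

When the character later leaves the south region, dropping its entry restores the prosthesis there — only the near-view models ever need loading and unloading, which is why IO stays low.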
In some embodiments of the present application, fusing the checkpoint simplified model with the terrain prosthesis further comprises: adjusting the material of the terrain prosthesis and the highlight and roughness of the map, so that the terrain prosthesis approaches the real terrain.
Because the near-view terrain is rendered from the real terrain, lighting and shadow are taken into account in that rendering; during fusion with the terrain prosthesis, it is therefore preferable to adjust the prosthesis surface's material and the map's highlight position, roughness, and similar effects, in pursuit of a result closer to the real terrain.
In some embodiments of the present application, in the above method, the adjusting the highlight and the roughness of the material and the map of the terrain prosthesis comprises: receiving the effect parameters of the material and the map, and adjusting the display effect of the generated map and the material according to the effect parameters, wherein the effect parameters comprise highlight position and roughness.
Parameters such as the light-source position and roughness also strongly influence the visual appearance of the model. After the map and material are generated, their display effect can be adjusted according to the obtained effect parameters, to improve the realism of the far-view terrain prosthesis.
Taking the light-source position as an example: at 8:00 a.m. and at 12:00 noon the sun is in different positions, so the shadows, shading, and the like it produces also differ. Effect parameters are therefore preferably set, and the display effect of the map and material adjusted according to them, to achieve an effect very close to the real terrain.
The effect parameters can be set or adjusted manually: staff can tune them according to the light-source position in the near-view terrain, the roughness of the materials, and so on, so that the display of the near-view and far-view terrain is better coordinated and closer to the real terrain.
In some embodiments of the present application, the method further comprises: integrating any of the above methods into an interface-callable script, so that the UE4 engine can call the script through the interface.
The UE4 engine is currently the most common rendering engine, and many parts of this application draw on existing techniques. For convenience of use, the code implementing any method of this application can be assembled into a script callable through an interface, for the UE4 engine to invoke. That is, during rendering the UE4 engine can still serve as the rendering engine, and the scene can be rendered with the method of the present application by calling the script.
Fig. 2 is a flowchart illustrating a rendering method according to another embodiment of the present application, and as can be seen from fig. 2, the present embodiment includes:
Receiving a terrain prosthesis generated by Houdini software according to the real terrain, and generating the material and the map of the terrain prosthesis, wherein each vertex of the terrain prosthesis is lower than the real terrain.
Setting a distance threshold, and judging whether the distance between a model and the target virtual character is greater than the distance threshold; if so, the model is classified as distant-view terrain, and if not, as close-range terrain.
In response to a scene loading instruction, placing the terrain prosthesis at the corresponding position of the scene to be rendered, and rendering the close-range terrain at high precision according to the corresponding real terrain using the existing method in UE4, thereby completing the rendering of the scene to be rendered.
Fig. 3 shows a schematic structural diagram of a rendering apparatus according to an embodiment of the present application, and as can be seen from fig. 3, the apparatus 300 includes:
a first generating unit 310, configured to receive a terrain prosthesis generated according to real terrain, and generate a material and a map of the terrain prosthesis, where the terrain prosthesis is a static mesh;
a segmenting unit 320, configured to segment a real terrain of a scene to be rendered into a near-view terrain and a far-view terrain according to position information of a target virtual character in the scene to be rendered;
the second generating unit 330 is configured to generate a checkpoint simplified model according to the real terrain corresponding to the close-range terrain;
and a fusion unit 340, configured to fuse the checkpoint simplified model with the terrain prosthesis, so as to implement rendering of the scene to be rendered.
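The four units of the apparatus in Fig. 3 might be sketched as one pipeline object. The patent describes the units only functionally, so every method body below is an illustrative assumption, not the claimed implementation:

```python
class RenderingApparatus:
    def __init__(self, distance_threshold):
        self.distance_threshold = distance_threshold

    # first generating unit 310: attach material and map to the terrain prosthesis
    def generate_prosthesis_assets(self, prosthesis):
        prosthesis["material"] = {"type": "rock", "color": "#7a7a7a", "texture": "cliff"}
        prosthesis["map"] = {"source": "baked", "size": (2048, 2048)}
        return prosthesis

    # segmenting unit 320: split models into close-range and distant-view terrain
    def segment(self, models):
        near = [m for m in models if m["distance"] <= self.distance_threshold]
        far = [m for m in models if m["distance"] > self.distance_threshold]
        return near, far

    # second generating unit 330: build the checkpoint simplified model
    def simplify(self, near_terrain):
        return {"checkpoint_model": near_terrain}

    # fusion unit 340: combine the simplified model with the terrain prosthesis
    def fuse(self, simplified, prosthesis):
        return {"scene": {"near": simplified, "far": prosthesis}}
```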
In some embodiments of the present application, in the above apparatus, the first generating unit 310 is configured to receive the terrain prosthesis, wherein the terrain prosthesis is generated according to real terrain based on Houdini software, and the height of each vertex of the terrain prosthesis is lower than the real terrain.
In some embodiments of the present application, in the above apparatus, the first generating unit 310 is configured to receive a first generating parameter of the material, and generate the material of the terrain prosthesis according to the first generating parameter, where the first generating parameter includes: material type, material color and material texture; receiving a second generation parameter of the map, and generating the map of the terrain prosthesis according to the second generation parameter, wherein the second generation parameter comprises: map source, map size, and map pose.
In some embodiments of the present application, in the above apparatus, the fusion unit 340 is configured to place the terrain prosthesis at a corresponding position of the scene to be rendered; and responding to the scene loading instruction, executing the step of generating the checkpoint simplified model according to the real terrain corresponding to the close-range terrain.
In some embodiments of the present application, in the above apparatus, the fusion unit 340 is further configured to adjust the highlight and roughness of the material and the map of the terrain prosthesis, so that the terrain prosthesis is close to the real terrain.
In some embodiments of the present application, in the above apparatus, the fusion unit 340 is further configured to receive effect parameters of the material and the map, and adjust the generated display effect of the map and the material according to the effect parameters, where the effect parameters include highlight position and roughness.
In some embodiments of the present application, the apparatus further comprises: an integration unit, configured to integrate any one of the above methods into an interface-callable script; the callable script is integrated into the UE4 engine through an interface.
It should be noted that the rendering apparatus can implement the steps of the foregoing rendering method in a one-to-one correspondence; details are not repeated here.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 4, at the hardware level, the electronic device includes a processor and optionally further includes an internal bus, a network interface, and a memory. The memory may include volatile memory, such as random-access memory (RAM), and may further include non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 4, but that does not indicate only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads a corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the rendering device on a logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
receiving a terrain prosthesis generated according to real terrain, and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static grid;
according to the position information of a target virtual character in a scene to be rendered, dividing the scene to be rendered into a near view terrain and a far view terrain;
generating a checkpoint simplified model according to the real terrain corresponding to the close-range terrain;
and fusing the checkpoint simplified model and the terrain prosthesis to realize the rendering of the scene to be rendered.
The method performed by the rendering apparatus according to the embodiment shown in fig. 3 of the present application may be applied to, or implemented by, a processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may further execute the method executed by the rendering apparatus in fig. 3, and implement the function of the rendering apparatus in the embodiment shown in fig. 3, which is not described herein again in this embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which, when executed by an electronic device including a plurality of application programs, enable the electronic device to perform the method performed by the rendering apparatus in the embodiment shown in fig. 3, and are specifically configured to perform:
receiving a terrain prosthesis generated according to real terrain, and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static grid;
according to the position information of a target virtual character in a scene to be rendered, dividing the scene to be rendered into a near view terrain and a far view terrain;
generating a checkpoint simplified model according to the real terrain corresponding to the close-range terrain;
and fusing the checkpoint simplified model and the terrain prosthesis to realize the rendering of the scene to be rendered.

As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, such as random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
Claims (10)
1. A rendering method, comprising:
receiving a terrain prosthesis generated according to real terrain, and generating the material and the map of the terrain prosthesis, wherein the terrain prosthesis is a static grid;
according to the position information of a target virtual character in a scene to be rendered, dividing the scene to be rendered into a near view terrain and a far view terrain;
generating a checkpoint simplified model according to the real terrain corresponding to the close-range terrain;
and fusing the checkpoint simplified model and the terrain prosthesis to realize the rendering of the scene to be rendered.
2. The method of claim 1, wherein receiving a terrain prosthesis generated from real terrain comprises:
receiving the terrain prosthesis, wherein the terrain prosthesis is generated according to real terrain based on Houdini software, and the height of each vertex of the terrain prosthesis is lower than the real terrain.
3. The method of claim 1, wherein the generating a texture and map of the topographical prosthesis comprises:
receiving first generation parameters of the material, and generating the material of the terrain prosthesis according to the first generation parameters, wherein the first generation parameters comprise: material type, material color and material texture;
receiving a second generation parameter of the map, and generating the map of the terrain prosthesis according to the second generation parameter, wherein the second generation parameter comprises: map source, map size, and map pose.
4. The method of claim 1, wherein said fusing the checkpoint simplified model with the terrain prosthesis comprises:
placing the terrain prosthesis at a corresponding position of the scene to be rendered;
and responding to the scene loading instruction, executing the step of generating the checkpoint simplified model according to the real terrain corresponding to the close-range terrain.
5. The method of claim 4, wherein said fusing the checkpoint simplified model with the terrain prosthesis further comprises:
and adjusting the material of the terrain prosthesis and the highlight and roughness of the map so as to enable the terrain prosthesis to be close to the real terrain.
6. The method of claim 5, wherein said adjusting the highlight and roughness of the material and the map of the terrain prosthesis comprises:
receiving the effect parameters of the material and the map, and adjusting the display effect of the generated map and the material according to the effect parameters, wherein the effect parameters comprise highlight position and roughness.
7. The method according to any one of claims 1 to 6, further comprising:
integrating the method of any one of claims 1-6 into an interface callable script;
the callable script is integrated into the UE4 engine through an interface.
8. A rendering apparatus, characterized in that the apparatus comprises:
the system comprises a first generation unit, a second generation unit and a third generation unit, wherein the first generation unit is used for receiving a terrain prosthesis generated according to real terrain and generating the material and the mapping of the terrain prosthesis, and the terrain prosthesis is a static grid;
the segmentation unit is used for segmenting the real terrain of the scene to be rendered into a near-view terrain and a far-view terrain according to the position information of the target virtual character in the scene to be rendered;
the second generation unit is used for generating a checkpoint simplified model according to a real terrain corresponding to a close-range terrain;
and the fusion unit is used for fusing the checkpoint simplified model and the terrain prosthesis so as to realize the rendering of the scene to be rendered.
9. An electronic device, comprising:
a processor; and
a memory arranged to store computer-executable instructions that, when executed, cause the processor to perform the method of any one of claims 1 to 7.
10. A computer-readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111469649.4A CN114119836A (en) | 2021-12-03 | 2021-12-03 | Rendering method, rendering device, electronic equipment and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111469649.4A CN114119836A (en) | 2021-12-03 | 2021-12-03 | Rendering method, rendering device, electronic equipment and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114119836A true CN114119836A (en) | 2022-03-01 |
Family
ID=80366673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111469649.4A Pending CN114119836A (en) | 2021-12-03 | 2021-12-03 | Rendering method, rendering device, electronic equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114119836A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118096985A (en) * | 2023-07-11 | 2024-05-28 | 北京艾尔飞康航空技术有限公司 | Real-time rendering method and device for virtual forest scene |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102722901A (en) * | 2011-03-29 | 2012-10-10 | 腾讯科技(深圳)有限公司 | Method and apparatus for processing images |
US20180322725A1 (en) * | 2017-05-04 | 2018-11-08 | Inspired Gaming (Uk) Limited | Generation of variations in computer graphics from intermediate file formats of limited variability, including generation of different game outcomes |
CN111105491A (en) * | 2019-11-25 | 2020-05-05 | 腾讯科技(深圳)有限公司 | Scene rendering method and device, computer readable storage medium and computer equipment |
CN111415400A (en) * | 2020-03-25 | 2020-07-14 | 网易(杭州)网络有限公司 | Model rendering method and device, electronic equipment and storage medium |
CN111882631A (en) * | 2020-07-24 | 2020-11-03 | 上海米哈游天命科技有限公司 | Model rendering method, device, equipment and storage medium |
- 2021-12-03: CN CN202111469649.4A patent/CN114119836A/en, status: active (pending)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |