CN113599818B - Vegetation rendering method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN113599818B
CN113599818B (application CN202110938397.9A)
Authority
CN
China
Prior art keywords
vegetation
target
original
polygonal
patches
Prior art date
Legal status
Active
Application number
CN202110938397.9A
Other languages
Chinese (zh)
Other versions
CN113599818A (en)
Inventor
闻亚洲 (Wen Yazhou)
Current Assignee
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202110938397.9A
Publication of CN113599818A
Application granted
Publication of CN113599818B


Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 - Changing parameters of virtual cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/66 - Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 - Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The present disclosure provides a vegetation rendering method and apparatus, an electronic device, and a storage medium. The vegetation rendering method includes: acquiring an original vegetation model corresponding to a vegetation object, where the original vegetation model includes a plurality of original polygonal patches that form the crown outline of the vegetation; enlarging and rotating the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches; obtaining a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect; and rendering the target vegetation model to generate the vegetation object. According to the embodiments of the present disclosure, the rendering quality of the vegetation is preserved while the rendering efficiency is improved.

Description

Vegetation rendering method and device, electronic equipment and readable storage medium
Technical Field
The present disclosure relates to the technical field of graphics rendering, and in particular to a vegetation rendering method and apparatus, an electronic device, and a storage medium.
Background
Vegetation is an important element of a game scene. Rendering vegetation in the game scene makes the simulated natural environment more realistic, which strengthens players' sense of immersion and improves their game experience.
During game scene development, the performance limits of mobile phones preclude modeling every leaf. A small cluster of branches and leaves is therefore approximated by a flat patch: several patches are distributed over the trunk and a leaf-cluster texture map is applied to each, and together they serve as the leaf clusters of a tree in the game.
Although this method reduces the system overhead, the rendering quality suffers: the leaf clusters look thin and lack volume. Balancing performance cost against vegetation rendering realism is therefore a problem in urgent need of a solution.
Disclosure of Invention
Embodiments of the present disclosure provide at least a vegetation rendering method and apparatus, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a vegetation rendering method, including:
acquiring an original vegetation model corresponding to a vegetation object; the original vegetation model includes a plurality of original polygonal patches that form a crown outline of the vegetation;
enlarging and rotating the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches;
obtaining a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect;
and rendering the target vegetation model to generate the vegetation object.
In the embodiments of the present disclosure, the plurality of original polygonal patches are each enlarged and rotated so that every resulting target polygonal patch faces the virtual camera and at least some adjacent target polygonal patches intersect and crisscross. This raises the utilization of the model's vertices: with the same vertex computation budget the final vegetation appears denser, and for vegetation of the same density the vertex overhead is smaller. In other words, the rendering quality of the vegetation is improved together with the rendering efficiency.
In a possible implementation according to the first aspect, the original polygonal patch is a quadrilateral patch.
In the embodiments of the present disclosure, using quadrilateral patches as the original polygonal patches makes the rendered vegetation object look better and thus improves the rendering quality.
In a possible implementation according to the first aspect, before the enlarging and rotating of the plurality of original polygonal patches, the method further includes:
performing coordinate conversion on the original UV coordinates of the plurality of original polygonal patches;
and the enlarging and rotating of the plurality of original polygonal patches respectively includes:
enlarging and rotating the coordinate-converted original polygonal patches respectively.
In the embodiments of the present disclosure, converting the original UV coordinates of the original polygonal patches makes the scaling pivot on the center of each patch, so that the intersecting target polygonal patches blend more evenly and the rendering quality is further improved.
In a possible implementation, before the rendering of the target vegetation model to generate the vegetation object, the method further includes:
determining a target view direction of the virtual camera lens when a trigger event for adjusting the virtual camera lens is detected;
and rotating the plurality of target polygonal patches respectively according to the target view direction, so that each target polygonal patch faces the virtual camera.
In the embodiments of the present disclosure, after the view direction of the virtual camera lens changes, the plurality of target polygonal patches rotate along with it, so that the rendered appearance of the vegetation object does not change, which improves the player's visual experience.
In one possible implementation, the enlarging and rotating of the plurality of original polygonal patches to obtain a plurality of target polygonal patches includes:
enlarging the original polygonal patch;
and rotating the enlarged original polygonal patch based on the original orientation angle and an adjustment value of the original polygonal patch to obtain the target polygonal patch, the plurality of target polygonal patches intersecting one another.
In the embodiments of the present disclosure, each enlarged original polygonal patch is rotated based on its original orientation angle and an adjustment value, so that although the rotated target polygonal patches all face the virtual camera, their orientation angles are not exactly the same; this produces the effect of staggered, interwoven leaves and improves the rendering realism.
In a possible implementation, before the rendering of the target vegetation model, the method further includes:
obtaining wind attribute information in a target scene;
and dynamically processing the original UV coordinates of the plurality of original polygonal patches based on the wind attribute information.
In the embodiments of the present disclosure, when wind blows in the game scene, the vegetation object exhibits a corresponding wind-blown effect; the rendered vegetation object thus matches the natural weather, which improves the rendering realism.
In a possible implementation, the rendering of the target vegetation model to generate the vegetation object includes:
culling pixels in the process of rendering the target vegetation model to generate the vegetation object.
In the embodiments of the present disclosure, pixels are culled during rendering, which gives the vegetation object a hollowed-out effect and makes the rendered vegetation object more realistic.
In a second aspect, an embodiment of the present disclosure provides a vegetation rendering apparatus, including:
an acquisition module, configured to acquire an original vegetation model corresponding to a vegetation object; the original vegetation model includes a plurality of original polygonal patches that form a crown outline of the vegetation;
a processing module, configured to enlarge and rotate the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches;
a determining module, configured to obtain a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect;
and a rendering module, configured to render the target vegetation model to generate the vegetation object.
According to the second aspect, in a possible implementation, the original polygonal patch is a quadrilateral patch.
According to the second aspect, in a possible implementation, the processing module is further configured to:
perform coordinate conversion on the original UV coordinates of the plurality of original polygonal patches; and
enlarge and rotate the coordinate-converted original polygonal patches respectively.
According to the second aspect, in a possible implementation, the determining module is further configured to:
determine a target view direction of the virtual camera lens when a trigger event for adjusting the virtual camera lens is detected;
and the processing module is further configured to:
rotate the plurality of target polygonal patches respectively according to the target view direction, so that each target polygonal patch faces the virtual camera.
According to the second aspect, in a possible implementation, the processing module is specifically configured to:
enlarge the original polygonal patch;
and rotate the enlarged original polygonal patch based on the original orientation angle and an adjustment value of the original polygonal patch to obtain the target polygonal patch, the plurality of target polygonal patches intersecting one another.
According to the second aspect, in a possible implementation, the obtaining module is further configured to:
obtain wind attribute information in a target scene;
and the processing module is further configured to:
dynamically process the original UV coordinates of the plurality of original polygonal patches based on the wind attribute information.
According to the second aspect, in a possible implementation, the rendering module is specifically configured to:
cull pixels in the process of rendering the target vegetation model to generate the vegetation object.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, perform the vegetation rendering method described in the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium having a computer program stored thereon; when executed by a processor, the computer program performs the vegetation rendering method described in the first aspect.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
To illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings required by the embodiments are briefly described below. The drawings, which are incorporated in and constitute a part of the specification, show embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It is to be understood that the following drawings show only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may obtain other related drawings from them without inventive effort.
FIG. 1 shows a schematic diagram of an execution entity of the vegetation rendering method provided by an embodiment of the present disclosure;
FIG. 2 shows a flowchart of a vegetation rendering method provided by an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of an original vegetation model provided by an embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of a target vegetation model provided by an embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of a generated vegetation object provided by an embodiment of the present disclosure;
FIG. 6 shows a flowchart of a method for enlarging and rotating a plurality of original polygonal patches respectively, provided by an embodiment of the present disclosure;
FIG. 7 shows a flowchart of another vegetation rendering method provided by an embodiment of the present disclosure;
FIG. 8 shows a flowchart of a method for dynamically processing original polygonal patches provided by an embodiment of the present disclosure;
FIG. 9 shows a schematic structural diagram of a vegetation rendering device provided by an embodiment of the present disclosure;
FIG. 10 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" is used herein to describe only one relationship, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist together, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Vegetation is an important element of a game scene. Rendering vegetation in the game scene makes the simulated natural environment more realistic, which strengthens players' sense of immersion and improves their game experience.
With the development of the game industry, more and more games are developed for mobile terminals because of their flexibility. Currently, to present broad vegetation effects in a game (such as seas of grass, seas of flowers, or forests), a large number of patch geometries with texture maps are used, and textures are assigned to the patch geometries to simulate the vegetation effect.
In the development of game scenes, the device performance of the game client limits the polygon-vertex computation budget, making it difficult to model and render every leaf polygonally. In general, a small cluster of branches and leaves is approximated by a patch: several patches are distributed over the trunk and a leaf-cluster texture map is applied to each, and together they serve as the leaf clusters of a tree in the game.
However, it has been found that although this reduces the system overhead, the rendering quality is poor: the leaf clusters look thin, lack volume, and so on. Balancing performance cost against vegetation rendering realism is therefore a problem in urgent need of a solution.
The present disclosure provides a vegetation rendering method, including: acquiring an original vegetation model corresponding to a vegetation object, the original vegetation model including a plurality of original polygonal patches that form a crown outline of the vegetation; enlarging and rotating the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches; obtaining a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect; and rendering the target vegetation model to generate the vegetation object.
In the embodiments of the present disclosure, the plurality of original polygonal patches are each enlarged and rotated so that every resulting target polygonal patch faces the virtual camera and at least some adjacent target polygonal patches intersect and crisscross. The rendered vegetation object can then appear real and full even with a reduced number of polygonal patches; that is, the rendering quality of the vegetation is improved together with the rendering efficiency.
Referring to FIG. 1, which shows a schematic diagram of an execution entity of the vegetation rendering method according to an embodiment of the present disclosure, the method is executed by an electronic device 100, which may include a terminal or a server. For example, the method may be applied to a terminal, which may be, but is not limited to, the smart phone 10, desktop computer 20, or notebook computer 30 shown in FIG. 1, or a smart watch, tablet computer, or the like not shown in FIG. 1. The method may also be applied to the server 40, or to an implementation environment consisting of the terminal and the server 40. The server 40 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud storage, big data, and artificial intelligence platforms.
In other embodiments, the electronic device 100 may also include an AR (Augmented Reality) device, a VR (Virtual Reality) device, an MR (Mixed Reality) device, and the like. For example, the AR device may be a mobile phone or a tablet computer with AR function, or may be AR glasses, which is not limited herein.
In some embodiments, the server 40 may communicate with the smart phone 10, the desktop computer 20, and the notebook computer 30 via the network 50. The network 50 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
In addition, the vegetation rendering method may also be implemented as software running in a terminal or a server, for example, an application program with a vegetation rendering function. In some possible implementations, the vegetation rendering method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to FIG. 2, a flowchart of a vegetation rendering method according to an embodiment of the present disclosure includes the following steps S101 to S104:
S101, acquiring an original vegetation model corresponding to a vegetation object; the original vegetation model includes a plurality of original polygonal patches that form the crown outline of the vegetation.
Referring to FIG. 3, which is a schematic diagram of an original vegetation model according to an embodiment of the present disclosure. In the embodiments of the present disclosure, the vegetation objects required by a game scene can be created, designed, and edited in three-dimensional modeling software; the created original vegetation model is then imported into the game for rendering, generating the corresponding game scene that is displayed to the user. Specifically, an artist may create the original vegetation model of a vegetation object in the three-dimensional modeling software, and the game engine may then obtain the original vegetation model from that software.
Illustratively, the original vegetation model may be created in three-dimensional modeling software such as 3ds Max or Maya, and the created original vegetation model may include at least one virtual vegetation A. In the embodiments of the present disclosure, the virtual vegetation A is illustrated by taking a crowned tree as an example. Of course, the present application is not limited to a specific engine type or software type, which can be chosen flexibly according to the actual application scenario.
Illustratively, when modeling the original vegetation model, the plurality of polygonal patches A1 need to be separated from one another. Specifically, the UV islands of the individual polygonal patches must be split apart, and the UV coordinates of each polygonal patch must be spread over the (0,1) UV space. In addition, the orientations of the patches' UV coordinates should vary irregularly, i.e., the UV orientations of the patches should be shuffled as much as possible.
In the embodiments of the present disclosure, to achieve a better rendering effect, the original polygonal patch A1 is a quadrilateral patch. It can be understood that in other embodiments the original polygonal patch may also be a triangular patch or another polygonal patch, which is not limited herein.
S102, enlarging and rotating the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches.
Illustratively, each original polygonal patch may be enlarged and rotated individually by a shader, an editable program used for image rendering in place of the fixed-function rendering pipeline. Shaders mainly include vertex shaders and pixel shaders: the vertex shader is chiefly responsible for computations such as the geometric relationships of vertices, while the pixel shader is chiefly responsible for computations such as the source color of each fragment.
In some embodiments, when the plurality of original polygonal patches are enlarged, each original polygonal patch is enlarged about its own center. The enlargement ratio may be preset, or it may be generated from attribute information produced under player control, which is not limited herein. In addition, when each of the original polygonal patches is rotated, the rotation is performed based on the position of the virtual camera.
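As a concrete illustration of the center-based enlargement just described, the following is a minimal Python/NumPy sketch; the function name, the vertex layout (one row per vertex), and the default scale factor are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def enlarge_patch(vertices: np.ndarray, scale: float = 1.5) -> np.ndarray:
    """Enlarge one polygonal patch about its own center.

    vertices: (N, 3) array, one row per patch vertex in model space.
    scale:    preset enlargement ratio (could instead come from gameplay).
    """
    center = vertices.mean(axis=0)             # per-patch pivot
    return center + (vertices - center) * scale
```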
S103, obtaining a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces the virtual camera and at least some adjacent target polygonal patches intersect.
Referring to FIG. 4, which is a schematic diagram of a target vegetation model according to an embodiment of the present disclosure. It can be understood that the target polygonal patches B1 are obtained after the plurality of original polygonal patches are enlarged and rotated respectively, and the target vegetation model B is obtained based on the plurality of target polygonal patches B1. A target polygonal patch faces the virtual camera when the angle between the lens direction of the virtual camera and the normal of the patch is within a preset angle; when this angle is 0 degrees, the lens direction of the virtual camera is perpendicular to the target polygonal patch.
It should be noted that, in the embodiments of the present disclosure, the lens direction of the virtual camera refers to the direction of the center line of the virtual camera's field of view, and the preset angle may be 30 degrees, 40 degrees, or another angle determined by specific requirements, which is not limited herein.
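Reading "faces the virtual camera" as the angle test above, a minimal sketch might look like the following; all names and the default threshold are hypothetical.

```python
import numpy as np

def faces_camera(patch_normal: np.ndarray, lens_dir: np.ndarray,
                 preset_deg: float = 30.0) -> bool:
    """True when the angle between the camera lens direction and the
    patch normal is within the preset angle (e.g. 30 or 40 degrees).
    abs() accepts a normal pointing out of either side of the patch."""
    n = patch_normal / np.linalg.norm(patch_normal)
    d = lens_dir / np.linalg.norm(lens_dir)
    angle = np.degrees(np.arccos(np.clip(abs(float(np.dot(n, d))), 0.0, 1.0)))
    return angle <= preset_deg
```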
In some specific implementations, the original UV coordinates of the polygonal patches may be obtained first; a matrix transformation is then applied to the original UV coordinates to obtain vertex offset information; and the vertex offset information is added to the original vertex coordinates of the original vegetation model in the model's local space, yielding the target vegetation model.
Specifically, the process of obtaining the vertex offset information may include the following (1) to (4):
(1) converting the original UV coordinates (two-dimensional data) into three-dimensional data to obtain first target data;
(2) taking the first target data as coordinate information in world space and converting it into the observation (view) space to obtain second target data;
(3) taking the second target data as coordinate information in the local space and converting it into world space to obtain third target data;
(4) normalizing the third target data to obtain the vertex offset information.
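Read literally, steps (1) to (4) lift the remapped UV into 3D, push it through the view and world transforms, and normalize. A minimal NumPy sketch under that reading follows; restricting the matrices to their 3x3 rotation blocks (so that translation does not leak into what is effectively a direction) and the trailing patch_half_size scale are my assumptions, as are all names.

```python
import numpy as np

def vertex_offset(uv: np.ndarray, world_to_view: np.ndarray,
                  local_to_world: np.ndarray) -> np.ndarray:
    """Steps (1)-(4): derive a camera-facing vertex offset from a
    patch's remapped UV coordinates."""
    p = np.array([uv[0], uv[1], 0.0])          # (1) lift the 2D UV to 3D
    p = world_to_view[:3, :3] @ p              # (2) world -> view, rotation only
    p = local_to_world[:3, :3] @ p             # (3) local -> world, rotation only
    return p / np.linalg.norm(p)               # (4) normalize to a unit offset

# The offset is then added to the original vertex position in the model's
# local space, as described above (patch_half_size is illustrative):
# target_vertex = original_vertex + vertex_offset(uv, V, M) * patch_half_size
```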
S104, rendering the target vegetation model to generate the vegetation object.
Referring to FIG. 5, which is a schematic diagram of a generated vegetation object according to an embodiment of the present disclosure: after the target vegetation model is obtained, it can be rendered, thereby generating the vegetation object C.
In some embodiments, pixels can be culled while rendering the target vegetation model; for example, pixels may be culled according to the texture's alpha information during rendering, giving the rendered vegetation object C a hollowed-out effect and further improving the realism of the vegetation rendering.
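The alpha-based culling can be sketched as a simple cutout test over the leaf texture; this is a CPU-side stand-in for what a pixel shader's clip would do, and the cutoff value is an assumption.

```python
import numpy as np

def alpha_cull(rgba: np.ndarray, cutoff: float = 0.5) -> np.ndarray:
    """Cutout sketch: zero out pixels whose texture alpha falls below
    the cutoff, producing the hollowed-out look of real leaf clusters.

    rgba: (H, W, 4) float texture with alpha in the last channel.
    """
    out = rgba.copy()
    out[rgba[..., 3] < cutoff] = 0.0           # culled pixels contribute nothing
    return out
```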
In the embodiments of the present disclosure, the plurality of original polygonal patches are each enlarged and rotated so that every resulting target polygonal patch faces the virtual camera and at least some adjacent target polygonal patches intersect and crisscross. This raises the utilization of the model's vertices: with the same vertex computation budget the final vegetation appears denser, and for vegetation of the same density the vertex overhead is smaller. In other words, the rendering quality of the vegetation is improved together with the rendering efficiency.
The above step S102 will be described in detail with reference to specific embodiments.
In some embodiments, referring to FIG. 6, step S102 of enlarging and rotating the plurality of original polygonal patches to obtain a plurality of target polygonal patches may include the following steps S1021 to S1022:
S1021, enlarging the original polygonal patch.
S1022, rotating the enlarged original polygonal patch based on the original orientation angle and an adjustment value of the original polygonal patch to obtain the target polygonal patch, the plurality of target polygonal patches intersecting one another.
For example, the original polygonal patch may first be enlarged, and the enlarged patch may then be rotated based on its original orientation angle and the adjustment value to obtain the target polygonal patch, so that the target polygonal patches intersect one another.
The original orientation angle of an original polygonal patch is the angle between the patch's normal and the lens direction of the virtual camera. The adjustment value, which may also be called a noise value, may be a randomly generated number or a random number derived from the patch's original orientation angle. As a result, although the rotated target polygonal patches all face the virtual camera, their orientation angles are not exactly the same, which produces the effect of staggered, interwoven leaves and improves the rendering realism.
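A sketch of this jittered rotation follows (all names hypothetical): the patch is rotated by its original orientation angle so that it faces the camera, then nudged by a small per-patch adjustment (noise) value.

```python
import numpy as np

def target_rotation_angle(original_angle: float, seed: int,
                          max_jitter: float = 0.15) -> float:
    """original_angle: angle (radians) between the patch normal and the
    camera lens direction; rotating by it aligns the patch with the
    camera. The seeded jitter keeps the leaves from looking uniform."""
    rng = np.random.default_rng(seed)          # per-patch deterministic noise
    adjustment = rng.uniform(-max_jitter, max_jitter)
    return original_angle + adjustment
```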
Referring to FIG. 7, which is a flowchart of another vegetation rendering method according to an embodiment of the present disclosure. The vegetation rendering method includes the following steps S201 to S207:
S201, acquiring an original vegetation model corresponding to a vegetation object; the original vegetation model includes a plurality of original polygonal patches that form the crown outline of the vegetation.
This step is similar to the aforementioned step S101, and will not be described again here.
S202, carrying out coordinate conversion on the original UV coordinates of the plurality of original polygonal patches.
Illustratively, since the original UV coordinate range is (0,1), scaling each patch about its own center (so that the intersecting target polygonal patches blend more evenly) requires converting the original UV coordinates of the original polygonal patches from the range (0,1) to (-1,1); the UV coordinate at the center of each polygonal patch then becomes (0,0).
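This conversion is the usual affine remap; a one-line sketch (function name assumed):

```python
import numpy as np

def recenter_uv(uv: np.ndarray) -> np.ndarray:
    """Remap UVs from the (0,1) range to (-1,1) so the center of each
    patch's UV island lands at (0,0) and scaling pivots on the center."""
    return uv * 2.0 - 1.0
```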
S203, enlarging and rotating the coordinate-converted original polygonal patches respectively.
This step is similar to the aforementioned step S102, and will not be described again here.
S204, obtaining a target vegetation model based on the target polygonal patches, where each target polygonal patch of the target vegetation model faces the virtual camera and at least some adjacent target polygonal patches intersect.
This step is similar to the aforementioned step S103 and will not be described again here.
S205, determining a target view direction of the virtual camera lens when a trigger event for adjusting the virtual camera lens is detected.
It can be understood that, in a game scene, the view direction of the virtual camera changes under the player's control, and as it changes, the content presented also differs. Therefore, when a trigger event for adjusting the virtual camera lens is detected, the target view direction of the virtual camera lens needs to be determined.
S206, rotating the plurality of target polygonal patches respectively according to the target view direction, so that each target polygonal patch faces the virtual camera.
When the view direction of the virtual camera changes, the rendered vegetation object would appear distorted if the orientations of the target polygonal patches were not adjusted. The target polygonal patches are therefore rotated respectively according to the target view direction so that each patch faces the virtual camera; the rendering of the vegetation object then remains unchanged after the lens direction changes, which improves the player's visual experience.
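One way to realize this re-facing rotation is Rodrigues' rotation formula, turning each patch normal toward the camera; a sketch follows (names assumed; the degenerate anti-parallel case is omitted).

```python
import numpy as np

def face_camera_rotation(normal: np.ndarray, patch_center: np.ndarray,
                         camera_pos: np.ndarray) -> np.ndarray:
    """Rotation matrix that turns a patch's normal toward the camera
    after the view direction changes (Rodrigues' rotation formula)."""
    t = camera_pos - patch_center
    t = t / np.linalg.norm(t)                  # unit direction to the camera
    n = normal / np.linalg.norm(normal)
    axis = np.cross(n, t)
    s, c = np.linalg.norm(axis), float(np.dot(n, t))   # sin, cos of the angle
    if s < 1e-8:
        return np.eye(3)                       # already facing the camera
    k = axis / s
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])         # skew matrix of the unit axis
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)
```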
S207, rendering the target vegetation model to generate the vegetation object.
This step is similar to the aforementioned step S104, and will not be described again here.
It can be understood that a wind effect may exist in the game scene; if the wind blows but the vegetation object shows no wind effect, the player's experience suffers. Therefore, in some embodiments, to enhance the player's sense of realism, as shown in FIG. 8, before rendering the target vegetation model the vegetation rendering method further includes the following steps S208 to S209:
S208, obtaining wind attribute information in the target scene.
S209, dynamically processing the UV coordinates of the plurality of original polygonal patches based on the wind attribute information.
In the embodiments of the present disclosure, the UV coordinates of the plurality of original polygonal patches are dynamically processed first, and the aforementioned enlargement and rotation are performed afterwards. It can be understood that in other embodiments the original polygonal patches may instead be enlarged and rotated first and dynamically processed afterwards; any order that ultimately achieves the wind-blowing effect is acceptable, and the specific implementation is not limited herein.
The wind attribute information includes, but is not limited to, wind strength information and wind direction information.
In some embodiments, dynamically processing the original UV coordinates of the plurality of original polygonal patches may include the following (a) to (d):
(a) rotating the original UV coordinates to obtain rotated UV coordinates;
(b) converting the original vegetation model from its local space into world space to obtain the target vertex coordinates of the model's vertices in world space;
(c) generating a noise texture based on a time-varying engine built-in function and the target vertex coordinates;
(d) blending (for example, lerp mixing) the original UV coordinates with the rotated UV coordinates according to the noise texture to obtain target UV coordinates.
In this embodiment, rotating the original UV coordinates means randomly rotating each polygonal patch's original UVs by a different angle such as 90, 180, or 270 degrees. Once obtained, the target UV coordinates go through the subsequent steps described earlier, such as coordinate conversion, enlargement, and rotation.
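A compact sketch of (a) to (d) follows; hashing the 90-degree multiple from the vertex position and using a sine wave as a cheap stand-in for the engine's time-varying noise texture are both illustrative assumptions, as are all names.

```python
import numpy as np

def wind_uv(uv: np.ndarray, world_pos: np.ndarray, time: float,
            wind_strength: float = 0.5) -> np.ndarray:
    """Blend the original UVs toward a rotated copy with a time-varying
    noise weight, so patch offsets derived from them sway in the wind."""
    # (a) rotate the UV about the island center (0.5, 0.5) by a random
    #     multiple of 90 degrees, hashed from the vertex's position
    k = int(abs(hash(world_pos.tobytes()))) % 4
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    p = uv - 0.5
    rotated = np.array([c * p[0] - s * p[1], s * p[0] + c * p[1]]) + 0.5
    # (b)+(c) stand-in noise driven by time plus the world-space position
    noise = 0.5 + 0.5 * np.sin(time + world_pos[0] + world_pos[2])
    # (d) lerp between original and rotated UVs by the wind-scaled weight
    w = wind_strength * noise
    return (1.0 - w) * uv + w * rotated
```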
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the written order of the steps does not imply a strict order of execution; the actual execution order should be determined by each step's function and possible inherent logic.
Based on the same technical concept, an embodiment of the present disclosure further provides a vegetation rendering device corresponding to the vegetation rendering method above. Since the principle by which this device solves the problem is similar to that of the vegetation rendering method, its implementation can refer to the implementation of the method, and repeated descriptions are omitted.
Referring to FIG. 9, a schematic diagram of a vegetation rendering device 500 according to an embodiment of the present disclosure is provided; the device includes:
an obtaining module 501, configured to obtain an original vegetation model corresponding to a vegetation object; the original vegetation model includes a plurality of original polygonal patches that form the crown outline of the vegetation;
a processing module 502, configured to enlarge and rotate the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches;
a determining module 503, configured to obtain a target vegetation model based on the plurality of target polygonal patches, where each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect;
and a rendering module 504, configured to render the target vegetation model to generate the vegetation object.
In one possible embodiment, the original polygonal patch is a quadrilateral patch.
In one possible implementation, the processing module 502 is further configured to:
perform coordinate conversion on the original UV coordinates of the plurality of original polygonal patches; and
enlarge and rotate the coordinate-converted original polygonal patches respectively.
In a possible implementation, the determining module 503 is further configured to:
determine a target view direction of the virtual camera lens when a trigger event for adjusting the virtual camera lens is detected;
and the processing module 502 is further configured to:
rotate the plurality of target polygonal patches respectively according to the target view direction, so that each target polygonal patch faces the virtual camera.
In one possible implementation, the processing module 502 is specifically configured to:
enlarge the original polygonal patch;
and rotate the enlarged original polygonal patch based on the original orientation angle and an adjustment value of the original polygonal patch to obtain the target polygonal patch, the plurality of target polygonal patches intersecting one another.
In a possible implementation, the obtaining module 501 is further configured to:
obtain wind attribute information in a target scene;
and the processing module 502 is further configured to:
dynamically process the original UV coordinates of the plurality of original polygonal patches based on the wind attribute information.
In one possible implementation, the rendering module 504 is specifically configured to:
cull pixels in the process of rendering the target vegetation model to generate the vegetation object.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
Based on the same technical concept, an embodiment of the present disclosure further provides an electronic device. Referring to FIG. 10, a schematic structural diagram of an electronic device 700 according to an embodiment of the present disclosure includes a processor 701, a memory 702, and a bus 703. The memory 702 is configured to store execution instructions and includes a memory 7021 and an external memory 7022; the memory 7021, also called internal memory, temporarily stores operation data for the processor 701 as well as data exchanged with the external memory 7022 (such as a hard disk), and the processor 701 exchanges data with the external memory 7022 through the memory 7021.
In the embodiments of the present disclosure, the memory 702 is specifically configured to store the application program code for executing the solution of the present disclosure, and execution is controlled by the processor 701. That is, when the electronic device 700 runs, the processor 701 and the memory 702 communicate through the bus 703, causing the processor 701 to execute the application program code stored in the memory 702 and thereby perform the method described in any of the preceding embodiments.
The memory 702 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), or electrically erasable programmable read-only memory (EEPROM).
The processor 701 may be an integrated circuit chip with signal processing capability. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, capable of implementing or executing the methods, steps, and logic blocks disclosed in the embodiments of the present disclosure. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It can be understood that the structure illustrated in the embodiments of the present disclosure does not constitute a specific limitation on the electronic device 700. In other embodiments, the electronic device 700 may include more or fewer components than shown, may combine certain components, may split certain components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the vegetation rendering method in the method embodiments described above. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
Embodiments of the present disclosure further provide a computer program product, where the computer program product carries program code, and instructions included in the program code may be used to perform steps of the vegetation rendering method in the foregoing method embodiments, and specifically reference may be made to the foregoing method embodiments, which are not described herein.
Wherein the above-mentioned computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in essence or a part contributing to the prior art or a part of the technical solution, or in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure used to illustrate, not limit, its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, any person skilled in the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A vegetation rendering method, comprising:
acquiring an original vegetation model corresponding to a vegetation object; the original vegetation model comprises a plurality of original polygonal patches that form a crown outline of the vegetation;
enlarging and rotating the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches;
obtaining a target vegetation model based on the plurality of target polygonal patches, wherein each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect;
rendering the target vegetation model to generate the vegetation object.
2. The method of claim 1, wherein the original polygonal patch is a quadrilateral patch.
3. The method of claim 1, wherein before the enlarging and rotating of the plurality of original polygonal patches respectively, the method further comprises:
performing coordinate conversion on the original UV coordinates of the plurality of original polygonal patches;
wherein the enlarging and rotating of the plurality of original polygonal patches respectively comprises:
enlarging and rotating the coordinate-converted original polygonal patches respectively.
4. The method of any one of claims 1-3, wherein before the rendering of the target vegetation model to generate the vegetation object, the method further comprises:
determining a target view direction of the virtual camera lens when a trigger event for adjusting the virtual camera lens is detected;
and rotating the plurality of target polygonal patches respectively according to the target view direction, so that each target polygonal patch faces the virtual camera.
5. The method of claim 1, wherein the enlarging and rotating of the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches comprises:
enlarging the original polygonal patch;
and rotating the enlarged original polygonal patch based on an original orientation angle and an adjustment value of the original polygonal patch to obtain the target polygonal patch, the plurality of target polygonal patches intersecting one another.
6. The method of claim 1, wherein before the rendering of the target vegetation model, the method further comprises:
obtaining wind attribute information in a target scene;
and dynamically processing the original UV coordinates of the plurality of original polygonal patches based on the wind attribute information.
7. The method of claim 1, wherein the rendering of the target vegetation model to generate the vegetation object comprises:
culling pixels in the process of rendering the target vegetation model to generate the vegetation object.
8. A vegetation rendering device, comprising:
an obtaining module, configured to obtain an original vegetation model corresponding to a vegetation object; the original vegetation model comprises a plurality of original polygonal patches that form a crown outline of the vegetation;
a processing module, configured to enlarge and rotate the plurality of original polygonal patches respectively to obtain a plurality of target polygonal patches;
a determining module, configured to obtain a target vegetation model based on the plurality of target polygonal patches, wherein each target polygonal patch of the target vegetation model faces a virtual camera and at least some adjacent target polygonal patches intersect;
and a rendering module, configured to render the target vegetation model to generate the vegetation object.
9. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the vegetation rendering method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the vegetation rendering method of any of claims 1-7.
CN202110938397.9A 2021-08-16 2021-08-16 Vegetation rendering method and device, electronic equipment and readable storage medium Active CN113599818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110938397.9A CN113599818B (en) 2021-08-16 2021-08-16 Vegetation rendering method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110938397.9A CN113599818B (en) 2021-08-16 2021-08-16 Vegetation rendering method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN113599818A CN113599818A (en) 2021-11-05
CN113599818B (en) 2023-07-21

Family

Family ID: 78308696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110938397.9A Active CN113599818B (en) 2021-08-16 2021-08-16 Vegetation rendering method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113599818B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7643025B2 (en) * 2003-09-30 2010-01-05 Eric Belk Lange Method and apparatus for applying stereoscopic imagery to three-dimensionally defined substrates
US9299185B2 (en) * 2013-02-12 2016-03-29 Disney Enterprises, Inc. Enhanced system and method for rendering visual surface

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003228725A (en) * 2002-02-04 2003-08-15 Japan Science & Technology Corp 3d image processing system
CN105701313A (en) * 2016-02-24 2016-06-22 福州大学 Virtual plant canopy photosynthesis effective radiation distribution simulating method of multi-layer data structure
CN106204735A (en) * 2016-07-18 2016-12-07 中国人民解放军理工大学 Unity3D terrain data using method in Direct3D 11 environment
CN106408637A (en) * 2016-08-29 2017-02-15 北京像素软件科技股份有限公司 Vegetation scene rendering method
WO2018175625A1 (en) * 2017-03-22 2018-09-27 Magic Leap, Inc. Depth based foveated rendering for display systems
CN107452066A (en) * 2017-08-08 2017-12-08 中国林业科学研究院资源信息研究所 A kind of tree crown three-dimensional configuration analogy method based on B-spline curves
CN110084894A (en) * 2019-04-30 2019-08-02 贝壳技术有限公司 Partial enlargement methods of exhibiting, device and the electronic equipment of threedimensional model
CN110124318A (en) * 2019-06-12 2019-08-16 网易(杭州)网络有限公司 The method and device of virtual vegetation production, electronic equipment, storage medium
CN110880204A (en) * 2019-11-21 2020-03-13 腾讯科技(深圳)有限公司 Virtual vegetation display method and device, computer equipment and storage medium
CN112950753A (en) * 2019-12-11 2021-06-11 腾讯科技(深圳)有限公司 Virtual plant display method, device, equipment and storage medium
CN111476877A (en) * 2020-04-16 2020-07-31 网易(杭州)网络有限公司 Shadow rendering method and device, electronic equipment and storage medium
CN112700517A (en) * 2020-12-28 2021-04-23 北京字跳网络技术有限公司 Method for generating visual effect of fireworks, electronic equipment and storage medium
CN112884873A (en) * 2021-03-12 2021-06-01 腾讯科技(深圳)有限公司 Rendering method, device, equipment and medium for virtual object in virtual environment
CN112862968A (en) * 2021-03-15 2021-05-28 网易(杭州)网络有限公司 Rendering display method, device and equipment of target vegetation model and storage medium
CN113034350A (en) * 2021-03-24 2021-06-25 网易(杭州)网络有限公司 Vegetation model processing method and device
CN114283230A (en) * 2021-12-13 2022-04-05 网易(杭州)网络有限公司 Vegetation model rendering method and device, readable storage medium and electronic device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Simulating volume rendering effects in games via normal mapping; Liu Yun; Electronic Technology & Software Engineering (No. 10); 48-49 *
Smooth tree model reconstruction based on subdivision surface control mesh optimization; Huang Zhengyu; Zhang Zhiyi; Geng Nan; Hu Shaojun; Computer Simulation (No. 12); 179-184 *
Research on high-fidelity bamboo forest simulation in landscape representation; He Qiuhai; Peng Yuecheng; Huang Xinyuan; Computer Engineering and Applications (No. 03); 179-184 *
Research on modeling and real-time rendering of virtual trees in a network environment; Zhang Qianqian; Huai Yongjian; Computer Simulation (No. 04); 269-272, 310 *

Also Published As

Publication number Publication date
CN113599818A (en) 2021-11-05

Similar Documents

Publication Publication Date Title
CN109427088B (en) Rendering method for simulating illumination and terminal
CN107358649B (en) Processing method and device of terrain file
CN104966312B (en) A kind of rendering intent, device and the terminal device of 3D models
CN108154548A (en) Image rendering method and device
CN109544658B (en) Map rendering method and device, storage medium and electronic device
CN112206535B (en) Rendering display method and device of virtual object, terminal and storage medium
CN111199573B (en) Virtual-real interaction reflection method, device, medium and equipment based on augmented reality
EP1866870B1 (en) Rendering 3d computer graphics using 2d computer graphics capabilities
CN108182723B (en) Starry sky simulation method and starry sky simulation device
CN108074285B (en) Volume cloud simulation method and volume cloud simulation device
CN115375822A (en) Cloud model rendering method and device, storage medium and electronic device
CN113599818B (en) Vegetation rendering method and device, electronic equipment and readable storage medium
CN108230430B (en) Cloud layer mask image processing method and device
CN113648655A (en) Rendering method and device of virtual model, storage medium and electronic equipment
Nowrouzezahrai et al. Visibility silhouettes for semi‐analytic spherical integration
CN115965735A (en) Texture map generation method and device
CN112132938B (en) Model element deformation processing and picture rendering method, device, equipment and medium
CN115761105A (en) Illumination rendering method and device, electronic equipment and storage medium
CN109064539A (en) A kind of method and computer readable storage medium being embedded in special efficacy in UGUI
CN105894560B (en) A kind of method of image procossing, user equipment and system
CN111462343B (en) Data processing method and device, electronic equipment and storage medium
CN111652807B (en) Eye adjusting and live broadcasting method and device, electronic equipment and storage medium
CN113240787A (en) Shadow rendering method and device and electronic equipment
CN111882637B (en) Picture rendering method, device, equipment and medium
CN112132935A (en) Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant