CN110838162B - Vegetation rendering method and device, storage medium and electronic equipment

Info

Publication number: CN110838162B
Application number: CN201911176102.8A
Authority: CN (China)
Prior art keywords: coordinates, real-time coordinates, virtual character, target vegetation
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN110838162A
Inventor: 石皓 (Shi Hao)
Current assignee: Netease Hangzhou Network Co Ltd
Original assignee: Netease Hangzhou Network Co Ltd
Filing / priority date: 2019-11-26
Publication of CN110838162A: 2020-02-25
Application granted; publication of CN110838162B: 2023-11-28

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/005 - General purpose rendering architectures
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • G06T 2200/04 - Indexing scheme for image data processing or generation involving 3D image data
    • G06T 2210/44 - Morphing (indexing scheme for image generation or computer graphics)

Abstract

The disclosure provides a vegetation rendering method and device, an electronic device, and a computer-readable storage medium, relating to the technical field of computer graphics. The vegetation rendering method comprises the following steps: acquiring real-time coordinates of each vertex of a target vegetation model and real-time coordinates of the center point of each face; acquiring real-time coordinates of the virtual character; obtaining, from the real-time coordinates of the center points and the real-time coordinates of the virtual character, an offset value produced in the target vegetation model by the action of the virtual character; adding the offset value to the real-time coordinates of the vertices of the target vegetation model to obtain the offset coordinates of the target vegetation model; and rendering the target vegetation model according to the offset coordinates. The method and device can alleviate the severe deformation of vegetation during interaction with a virtual character and achieve a more natural and realistic vegetation rendering effect.

Description

Vegetation rendering method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the technical field of computer graphics, and in particular, to a vegetation rendering method, a vegetation rendering device, an electronic device, and a computer-readable storage medium.
Background
Vegetation interactive animation in games has a very important impact on the game user experience. The current common way to realize vegetation interactive animation is to pass the coordinates of the virtual character into the vegetation shader and to apply the same formula to the coordinates of each vegetation vertex and the coordinates of the virtual character, obtaining the degree of interaction between each vertex and the character, and rendering the interaction effect accordingly. However, because the interaction degree computed at each vertex differs, the vertices belonging to the same leaf are pulled in several different directions, which deforms the vegetation and ruins the interaction effect.
Another approach is to bind nodes to the vegetation on the computer side and drive the interactive animation through those nodes, but this not only greatly increases the labor cost, it also increases the load on the central processing unit. Writing the animation directly as vertex animation or skeletal animation makes it difficult to produce any interaction with the player.
Therefore, a vegetation rendering method is needed that can alleviate the severe deformation of vegetation during interaction with virtual characters and achieve a more natural and realistic interaction effect.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide a vegetation rendering method, a vegetation rendering device, an electronic device, and a computer-readable storage medium, so as to alleviate the severe deformation of vegetation during interaction with a virtual character and achieve a more natural and realistic vegetation rendering effect.
According to a first aspect of the present disclosure, there is provided a vegetation rendering method, comprising:
acquiring real-time coordinates of each vertex of the target vegetation model and real-time coordinates of a central point of each surface;
acquiring real-time coordinates of the virtual character;
obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the center point and the real-time coordinates of the virtual character;
adding the offset value to the real-time coordinates of the vertexes of the target vegetation model to obtain the offset coordinates of the target vegetation model;
and rendering the target vegetation model according to the offset coordinates.
In an exemplary embodiment of the present disclosure, the acquiring real-time coordinates of each vertex of the target vegetation model and real-time coordinates of a center point of each face includes:
acquiring real-time coordinates of each vertex of the target vegetation model;
and obtaining the real-time coordinates of the center point of each face of the target vegetation model according to the real-time coordinates of each vertex and the difference coordinates, wherein the difference coordinates are differences between the original coordinates of the vertex of each face of the target vegetation model and the original coordinates of the center point of the corresponding face.
In an exemplary embodiment of the present disclosure, the obtaining the offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the center point and the real-time coordinates of the virtual character includes:
vector operation is carried out on the real-time coordinates of the center point and the real-time coordinates of the virtual character, and the size and the direction of the offset value are determined according to the operation result.
In an exemplary embodiment of the present disclosure, the method further comprises:
acquiring a speed vector of the virtual character and an influence coefficient corresponding to the target vegetation model;
obtaining deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character according to the speed vector of the virtual character, the influence coefficient and the vertex color of the target vegetation model;
Rendering the target vegetation model according to the offset coordinates comprises:
and rendering the target vegetation model according to the offset coordinates and the deformation coordinates.
In an exemplary embodiment of the present disclosure, the obtaining, according to the velocity vector of the virtual character, the influence coefficient, and the vertex color of the target vegetation model, deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character includes:
acquiring a vegetation height coefficient of the target vegetation model according to the vertex color of the target vegetation model;
and multiplying the speed vector, the influence coefficient and the vegetation height coefficient to obtain the deformation coordinate.
In an exemplary embodiment of the present disclosure, before the obtaining, according to the velocity vector of the virtual character, the influence coefficient, and the vertex color of the target vegetation model, deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character, the method further includes:
and determining that the modulus of the velocity vector of the virtual character is smaller than a preset threshold value.
In an exemplary embodiment of the present disclosure, the method further comprises:
determining the range of the target vegetation model affected by the virtual character according to the distance between the real-time coordinates of the center point and the real-time coordinates of the virtual character.
According to a second aspect of the present disclosure, there is provided a vegetation rendering apparatus comprising:
the vegetation coordinate acquisition module is used for acquiring the real-time coordinates of each vertex of the target vegetation model and the real-time coordinates of the center point of each surface;
the character coordinate acquisition module is used for acquiring real-time coordinates of the virtual character;
the offset value calculation module is used for obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the central point and the real-time coordinates of the virtual character;
the offset coordinate calculation module is used for adding the offset value to the real-time coordinates of the vertexes of the target vegetation model to obtain the offset coordinates of the target vegetation model;
and the rendering module is used for rendering the target vegetation model according to the offset coordinates.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following advantages:
In the vegetation rendering method provided by this example embodiment of the disclosure, first, the real-time coordinates of each vertex of the target vegetation model and the real-time coordinates of the center point of each face are obtained; the real-time coordinates of the virtual character are obtained; an offset value produced in the target vegetation model by the action of the virtual character is obtained from the real-time coordinates of the center points and the real-time coordinates of the virtual character; the offset value is added to the real-time coordinates of the vertices of the target vegetation model to obtain the offset coordinates of the target vegetation model; and the target vegetation model is rendered according to the offset coordinates. On the one hand, because the offset value is derived from the center-point coordinates and the character coordinates, and the offset coordinates of all vertices of a face are computed from that same offset value, the vertices belonging to the same leaf are not pulled in several different directions, which prevents the target vegetation from deforming severely during interaction. On the other hand, rendering the target vegetation according to its offset coordinates yields a realistic and natural interaction effect between the vegetation and the virtual character. Moreover, since the coordinate computations involved in this method can be performed on the graphics processor, the load on the central processing unit is reduced, and the interaction can also be realized on mobile terminals.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which the methods and apparatus of vegetation rendering of embodiments of the present disclosure may be applied;
FIG. 2 illustrates a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure;
FIG. 3 schematically illustrates an effect diagram of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 4 schematically illustrates a flow chart of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 5 schematically illustrates a schematic view of a target vegetation model of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 6 schematically illustrates a schematic diagram of storing differences in a vegetation rendering method according to one embodiment of the present disclosure;
FIG. 7 schematically illustrates a flowchart of calculating deformation coordinates for a vegetation rendering method according to one embodiment of the disclosure;
FIG. 8 schematically illustrates an effect diagram of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 9 schematically illustrates an effect diagram of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 10 schematically illustrates an effect diagram of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 11 schematically illustrates an effect diagram of a vegetation rendering method according to one embodiment of the disclosure;
FIG. 12 schematically illustrates a schematic diagram of a target vegetation model applied in accordance with one embodiment of the present disclosure;
FIG. 13 schematically illustrates an operational interface diagram of interaction level adjustment of a vegetation rendering method according to one embodiment of the disclosure;
fig. 14 schematically illustrates a block diagram of a vegetation rendering device according to one embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 shows a schematic diagram of a system architecture of an exemplary application environment in which a vegetation rendering method and apparatus of embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of the terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The terminal devices 101, 102, 103 may be various electronic devices with display screens including, but not limited to, desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 105 may be a server cluster formed by a plurality of servers.
The vegetation rendering method provided by the embodiments of the present disclosure may be performed by the terminal devices 101, 102, 103, and correspondingly, the vegetation rendering apparatus may also be disposed in the terminal devices 101, 102, 103. In addition, the vegetation rendering method provided in the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the vegetation rendering device may be disposed in the server 105, which is not limited in particular in the present exemplary embodiment.
For example, in this exemplary embodiment, the real-time coordinates of each vertex of the target vegetation model, the real-time coordinates of the center point of each surface, and the real-time coordinates of the virtual character may be obtained in the server 105, the offset value generated by the target vegetation model under the action of the virtual character is obtained according to the obtained real-time coordinates of the center point and the real-time coordinates of the virtual character, the offset value is added to the real-time coordinates of the vertices of the target vegetation model, so as to obtain the offset coordinates of the target vegetation model, and finally the target vegetation model is rendered according to the offset coordinates. Further, in the present exemplary embodiment, the above-described process may also be performed by the terminal device 101, 102, or 103.
Fig. 2 shows a schematic diagram of a computer system suitable for use in implementing embodiments of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU) 201, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data required for the system operation are also stored. The CPU 201, ROM 202, and RAM 203 are connected to each other through a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
The following components are connected to the I/O interface 205: an input section 206 including a keyboard, a mouse, and the like; an output portion 207 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker, and the like; a storage section 208 including a hard disk or the like; and a communication section 209 including a network interface card such as a LAN card, a modem, and the like. The communication section 209 performs communication processing via a network such as the internet. The drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed on the drive 210 as needed, so that a computer program read out therefrom is installed into the storage section 208 as needed.
Regarding the rendering of vegetation-character interactions in games, the inventors found that three methods are mainly used:
The first method is to pass the coordinates of the virtual character into the target vegetation shader and compute the degree of squeezing for each vertex of the target vegetation with the following formula:
OffsetPos = (pos_grass - pos_player) * factor
where pos_player is the coordinates of the virtual character, pos_grass is the coordinates of the target vegetation vertex, and factor is the squeezing strength.
When the virtual character approaches the target vegetation, the vegetation is pushed away from the character according to the computed squeezing degree, and the closer the vegetation is to the character, the stronger the push. This computation is performed once for each vertex of the target vegetation, producing the interaction effect shown in fig. 3.
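As a minimal sketch of this first method (illustrative Python, not the patent's own shader code; the vector representation and the numeric values are assumptions for the example):

```python
# Minimal sketch of the first (per-vertex) method described above.
# All names and values are illustrative assumptions, not the patent's code.

def offset_vertex(pos_grass, pos_player, factor):
    """OffsetPos = (pos_grass - pos_player) * factor, per component."""
    return tuple((g - p) * factor for g, p in zip(pos_grass, pos_player))

# Each vertex is processed independently, so the vertices of one leaf
# receive different offsets and the leaf is pulled apart -- the
# deformation problem discussed next.
leaf = [(1.0, 0.0, 1.0), (1.2, 0.0, 1.1), (1.1, 0.5, 1.05)]
player = (0.5, 0.0, 0.5)
print([offset_vertex(v, player, 0.3) for v in leaf])
```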
In practice, it is found that although this method can achieve an interaction effect between the target vegetation and the virtual character, it has the following problem: because the squeezing degree computed for each vertex of the target vegetation differs, the vertices belonging to the same leaf are pulled in several different directions, which deforms the vegetation and ruins the interaction effect.
The second method is to bind nodes on the target vegetation and drive the interaction with the virtual character through the bound nodes. However, this approach greatly increases the labor cost of implementing the interactive rendering on the one hand, and increases the load on the central processing unit on the other, making the interaction difficult to realize on mobile terminals. In addition, controlling the nodes bound to the target vegetation is itself challenging, which further increases the workload.
The third method is to bake the animation directly as vertex animation or skeletal animation, but such animation can hardly react to the virtual character, which greatly degrades the user's game experience.
To solve the problems of these three methods, the inventor proposes a new technical solution in this exemplary embodiment, which is set forth in detail below:
the present exemplary embodiment first provides a vegetation rendering method. The vegetation rendering method may be applied to one or more of the terminal devices 101, 102, 103, or may be applied to the server 105. Referring to fig. 4, the vegetation rendering method specifically includes the following steps:
Step S410: acquiring real-time coordinates of each vertex of the target vegetation model and real-time coordinates of the center point of each face;
Step S420: acquiring real-time coordinates of the virtual character;
Step S430: obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the center point and the real-time coordinates of the virtual character;
Step S440: adding the offset value to the real-time coordinates of the vertices of the target vegetation model to obtain the offset coordinates of the target vegetation model;
Step S450: rendering the target vegetation model according to the offset coordinates.
In the vegetation rendering method provided by this exemplary embodiment, on the one hand, the offset value of the target vegetation model under the action of the virtual character is obtained from the real-time coordinates of the center point and the real-time coordinates of the virtual character, and the offset coordinates of the vertices are derived from that offset value; since all vertex coordinates of a face are computed from the same offset value, the vertices belonging to the same leaf are not pulled in several different directions, which prevents the target vegetation from deforming severely during interaction. On the other hand, rendering the target vegetation according to the offset coordinates yields a realistic and natural interaction effect with the virtual character. Moreover, since the coordinate computations involved can be performed on the graphics processor, the load on the central processing unit is reduced and the interaction can also be realized on mobile terminals.
The above steps of this exemplary embodiment are described in more detail below.
In step S410, the real-time coordinates of each vertex of the target vegetation model and the real-time coordinates of the center point of each face are acquired.
Step S410 may further include: acquiring real-time coordinates of each vertex of a target vegetation model; and obtaining the real-time coordinates of the center point of each face of the target vegetation model according to the real-time coordinates of each vertex and the difference coordinates, wherein the difference coordinates are the difference between the original coordinates of the vertex of each face of the target vegetation model and the original coordinates of the center point of the corresponding face.
In this exemplary embodiment, the target vegetation model is the vegetation to be interacted with in the game, and the original coordinates of each vertex and of the center point of each face can be obtained from the corresponding model, which can be produced with art software, as shown in fig. 5. For example, when the target vegetation model is exported, a plug-in can be used to obtain the original coordinates of each vertex and of each face's center point in the three-dimensional animation and rendering software; the original coordinates may also be obtained in other ways. The software may be MAX or other similar software, all of which fall within the protection scope of this exemplary embodiment.
In this exemplary embodiment, the real-time coordinates are the coordinates of each vertex and each face center of the target vegetation at each moment in the game; specifically, they are the current coordinates of each vertex and each face center when the target vegetation model is rendered on the GPU. The original coordinates obtained when the model is exported can be converted into in-game real-time coordinates through a specific transformation, for example a matrix transformation. The conversion may also be accomplished by other technical means, which is not particularly limited in this exemplary embodiment.
In this exemplary embodiment, one concrete way to acquire the real-time coordinates of each vertex of the target vegetation model in the game is as follows: obtain the original coordinates of each vertex through the game engine, then apply a matrix transformation to them to obtain the real-time in-game coordinates. It should be noted that this scenario is only an exemplary illustration; other ways of obtaining the real-time vertex coordinates also fall within the protection scope of this exemplary embodiment.
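A minimal sketch of such an original-to-real-time conversion, assuming a 4x4 world matrix in row-major order; the matrix values are illustrative, not values used by the patent:

```python
# Convert an original (model-space) vertex into real-time (world-space)
# coordinates with a 4x4 matrix. The translation values are assumed.

def transform(matrix, point):
    """Apply a 4x4 row-major transform to a 3D point (w = 1)."""
    x, y, z = point
    hom = (x, y, z, 1.0)
    return tuple(sum(matrix[r][c] * hom[c] for c in range(4)) for r in range(3))

world = [  # place the vegetation model at (10, 0, 5) in the game world
    [1.0, 0.0, 0.0, 10.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 5.0],
    [0.0, 0.0, 0.0, 1.0],
]
print(transform(world, (1.0, 2.0, 3.0)))  # -> (11.0, 2.0, 8.0)
```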
Since the original or real-time coordinates of the center point of each face of the target vegetation model cannot be obtained directly in the game, this exemplary embodiment provides a method for deriving the in-game real-time coordinates of each face center from the original coordinates of the vertices and of the face centers.
Concretely, this can be implemented as follows: when the target vegetation model is exported, the plug-in obtains the original coordinates of each vertex and of each face's center point, computes the difference between the original coordinates of each face's vertices and the original coordinates of that face's center point, and stores the differences in predefined texture coordinates, such as a third set of vertex texture coordinates, as shown in fig. 6. When the real-time center-point coordinates are needed in the game, the game engine first obtains the real-time coordinates of each vertex and the differences stored in the texture coordinates, and then computes the real-time coordinates of each corresponding face center from the differences and the real-time vertex coordinates.
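A minimal sketch of this export/runtime split, assuming triangular faces and emulating the third set of texture coordinates with a plain per-vertex list:

```python
# Export time: store (vertex - face_center) per vertex.
# Runtime: recover the face center as (real-time vertex - stored difference).
# Triangular faces and plain Python lists are assumptions for illustration.

def face_center(verts):
    n = len(verts)
    return tuple(sum(v[i] for v in verts) / n for i in range(3))

def export_differences(faces):
    """faces: list of vertex triples -> per-vertex difference coordinates."""
    diffs = []
    for verts in faces:
        c = face_center(verts)
        diffs.append([tuple(v[i] - c[i] for i in range(3)) for v in verts])
    return diffs

def runtime_center(vertex_realtime, diff):
    return tuple(vertex_realtime[i] - diff[i] for i in range(3))

faces = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]]
diffs = export_differences(faces)
print(runtime_center(faces[0][0], diffs[0][0]))  # recovers the face center
```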
It should be noted that the above scenario is only an exemplary illustration, and other ways of obtaining real-time coordinates of the center points of the respective surfaces of the target vegetation in the game also belong to the protection scope of the present exemplary embodiment.
In step S420, real-time coordinates of the virtual character are acquired.
In this example embodiment, the virtual character is the game user's character in the game. The velocity vector of the virtual character can be computed on the CPU (central processing unit), and the computed velocity vector together with the character's real-time coordinates is passed to the GPU (graphics processor) once per frame for subsequent processing. Performing the subsequent processing on the graphics processor reduces the load on the central processing unit, so the vegetation rendering method of this example embodiment can also be implemented on mobile terminals. It should be noted that this scenario is only an exemplary illustration and does not limit the protection scope of this exemplary embodiment.
In step S430, an offset value generated by the target vegetation model under the action of the virtual character is obtained according to the real-time coordinates of the center point and the real-time coordinates of the virtual character.
In this example embodiment, the offset value is a coordinate offset value of each vertex and a center point of each face in a game coordinate system during interaction with the virtual character by the target vegetation model. The offset value can be calculated according to the real-time coordinates of the center point on each face of the target vegetation model and the real-time coordinates of the virtual characters.
Step S430 may include: vector operation is carried out on the real-time coordinates of the center point and the real-time coordinates of the virtual character, and the size and the direction of the offset value are determined according to the operation result.
The foregoing computation of the offset value from the real-time coordinates of each face's center point and the real-time coordinates of the virtual character may specifically be implemented as follows: perform a vector operation on the real-time coordinates of the center point of each face acted on by the virtual character and the real-time coordinates of the virtual character to obtain the offset value shared by the vertices of that face; the direction of the offset value is the direction of the resulting vector, and its magnitude can be determined from the absolute value of the operation result.
It should be noted that the above scenario is only an exemplary illustration, and other ways of obtaining the offset value generated by the target vegetation model under the action of the virtual character also fall into the protection scope of the present exemplary embodiment.
In this exemplary embodiment, determining the range of the target vegetation model acted on by the virtual character may further include determining it from the distance between the real-time coordinates of the model's center points and the real-time coordinates of the virtual character. Concretely, the affected range can be computed as follows: compute the real-time coordinates of each face center in the game from the real-time coordinates of each vertex and the differences stored in the texture coordinates; compute the distance between each center point and the virtual character; divide this distance by a constant preset for the game scene and subtract the quotient from 1 to obtain a distance influence coefficient, which characterizes how strongly the target vegetation model is affected at a given distance. The affected range of the model can then be determined from this coefficient, and the calculation shows that the closer the target vegetation model is to the virtual character, the more strongly it is affected.
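A minimal numeric sketch of the distance influence coefficient; the falloff constant is an assumed scene-tuned value, and the clamp to [0, 1] is an added assumption so that distant vegetation is simply unaffected:

```python
# Distance influence coefficient: 1 - (distance / constant), so nearer
# vegetation is affected more. The constant and the clamp are assumptions.
import math

def distance_coefficient(center, player, falloff=2.0):
    d = math.dist(center, player)
    return max(0.0, min(1.0, 1.0 - d / falloff))

print(distance_coefficient((1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # 0.5
print(distance_coefficient((5.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # 0.0
```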
It should be noted that the above scenario is only an exemplary illustration; other ways of determining the range affected by the virtual character also fall within the protection scope of this exemplary embodiment.
In step S440, the offset value is added to the real-time coordinates of the vertices of the target vegetation model to obtain offset coordinates of the target vegetation model.
In this exemplary embodiment, to prevent the vertices on the same face of the target vegetation model from being affected to different degrees, which would severely deform the model during interaction with the virtual character, the offset coordinates of the target vegetation model are obtained from the offset value computed in step S430, so that all vertices belonging to the same face are offset by the same amount.
Concretely, the offset coordinates of the target vegetation model can be computed as follows: perform a vector operation on the real-time coordinates of the center point of each face acted on by the virtual character and the real-time coordinates of the virtual character to obtain the offset value for that face, then add this offset value to the real-time coordinates of each vertex of the corresponding face to obtain the offset coordinates of the target vegetation model.
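A minimal sketch of this step: one offset is computed per face from its center point and added unchanged to every vertex of that face, so a leaf moves rigidly instead of tearing. The normalized-direction, inverse-distance rule below is an assumed example of the vector operation, not the patent's exact formula:

```python
# One offset per face: derived from the face center and the character,
# then added to all vertices of the face. The direction/magnitude rule
# below is an illustrative assumption for the "vector operation".
import math

def face_offset(center, player, strength=0.3):
    d = [c - p for c, p in zip(center, player)]      # away from character
    length = math.sqrt(sum(x * x for x in d)) or 1.0
    direction = [x / length for x in d]              # offset direction
    magnitude = strength / length                    # closer => stronger
    return [x * magnitude for x in direction]

def offset_face(verts, center, player):
    off = face_offset(center, player)
    return [tuple(v[i] + off[i] for i in range(3)) for v in verts]

verts = [(1.0, 0.0, 1.0), (1.2, 0.0, 1.1), (1.1, 0.5, 1.05)]
center = tuple(sum(v[i] for v in verts) / 3 for i in range(3))
print(offset_face(verts, center, (0.5, 0.0, 0.5)))  # leaf shifts rigidly
```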
In a specific implementation, the extent to which vegetation is affected by the virtual character interaction may be written into the vertex color data of the model file, where the extent of influence includes an offset value.
It should be noted that the above scenario is only an exemplary illustration, and other methods for obtaining the offset coordinates of the target vegetation model also fall within the protection scope of the present exemplary embodiment.
In step S450, the target vegetation model is rendered according to the offset coordinates.
In this exemplary embodiment, the target vegetation model is rendered using the offset coordinates obtained in step S440, achieving the effect of the target vegetation being pushed aside by the virtual character.
In the embodiments of the present disclosure, the virtual character may refer to any virtual object: it is not limited to a virtual person and may also be another moving object.
In addition, to complete the interaction effect between the target vegetation model and the virtual character, in another embodiment deformation coordinates are also calculated after the offset coordinates, achieving the effect of the target vegetation model being deformed by the virtual character. As shown in fig. 7, the deformation coordinates are calculated as follows:
step S710: acquiring a speed vector of the virtual character and an influence coefficient corresponding to the target vegetation model;
Step S720: and obtaining deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character according to the speed vector of the virtual character, the influence coefficient and the vertex color of the target vegetation model.
Thus, rendering the target vegetation model according to the offset coordinates includes: and rendering the target vegetation model according to the offset coordinates and the deformation coordinates.
The following describes the calculation process of the deformation coordinates in more detail in conjunction with the steps in fig. 7:
in step S710, a speed vector of the virtual character and an influence coefficient corresponding to the target vegetation model are obtained.
In this exemplary embodiment, the velocity vector of the virtual character describes the character's movement from frame to frame: the distance moved is the magnitude (modulus) of the vector, and the direction of movement is the direction of the vector. The velocity vector may be computed in the central processing unit or obtained in other ways, which is not particularly limited in this exemplary embodiment.
In this exemplary embodiment, the influence coefficient corresponding to the target vegetation model is a custom coefficient reflecting how strongly that model is affected by the virtual character, and different types of vegetation may have different influence coefficients. Preferably, the influence coefficient is adjustable. For example, several custom influence coefficients may be preset in the game and matched automatically according to the degree of interaction when the target vegetation model interacts with the virtual character; they may also be adjusted manually. All of these fall within the protection scope of this exemplary embodiment.
In step S720, deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character are obtained according to the velocity vector of the virtual character, the influence coefficient and the vertex color of the target vegetation model.
Step S720 may specifically include: acquiring a vegetation height coefficient of the target vegetation model according to the vertex color of the target vegetation model; and multiplying the speed vector, the influence coefficient and the vegetation height coefficient to obtain a deformation coordinate.
In this exemplary embodiment, the height influence coefficient of the target vegetation model itself is stored in the model's vertex color. The deformation coordinates of the model in the direction of the virtual character's movement can then be computed from this height influence coefficient, the character's velocity vector, and the custom influence coefficient.
The deformation coordinates can be computed as follows: read the height influence coefficient stored in the target vegetation model, then multiply it by the velocity vector of the virtual character and the custom influence coefficient to obtain the deformation coordinates of the model in the direction of the character's movement. It should be noted that this scenario is only an exemplary illustration; other methods of obtaining the deformation coordinates of the target vegetation model also fall within the protection scope of this exemplary embodiment.
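A minimal sketch of the three-factor product; which vertex-color channel carries the height coefficient, and both coefficient values, are assumptions for illustration:

```python
# Deformation coordinates = velocity vector * influence coefficient *
# vegetation height coefficient (read from a vertex-color channel).
# The channel choice and the coefficient values are assumptions.

def deformation_coords(velocity, influence, vertex_color, channel=0):
    height = vertex_color[channel]       # height coefficient in [0, 1]
    return tuple(v * influence * height for v in velocity)

velocity = (0.4, 0.0, 0.1)               # character movement this frame
vertex_color = (0.8, 0.2, 0.0, 1.0)      # RGBA; R assumed to hold height
print(deformation_coords(velocity, 0.5, vertex_color))
```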
In the present exemplary embodiment, before proceeding to step S720, further including: the modulus of the velocity vector of the virtual character is determined to be less than a preset threshold.
The preset threshold determines whether the movement of the virtual character can still affect the target vegetation model. When the velocity vector exceeds the threshold, the model does not keep deforming further; that is, velocity vectors larger than the preset threshold no longer participate in the calculation of the deformation coordinates, and only velocity vectors smaller than or equal to the threshold are passed into the GPU. While the character's velocity is below the threshold, the influence of its movement on the vegetation grows as the velocity grows; once the velocity reaches the threshold, the value used is gradually reduced to zero instead of the vegetation being further affected by the movement.
In the above process, the point of gradually reducing the velocity vector to zero once it reaches the preset threshold is to achieve a more realistic visual transition. If the velocity vector were simply set to zero, the vegetation animation would also stop abruptly, which does not match a real scene. Therefore, to make the animation transition of the target vegetation model natural, the velocity vector must be decreased gradually. It should be noted that this scenario is only an exemplary illustration and does not limit the protection scope of this exemplary embodiment.
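A minimal sketch of this threshold-and-decay behavior: velocities under the threshold pass straight through, while faster movement makes the value fed to the GPU ease back toward zero frame by frame. The threshold and decay rate are assumed tuning values:

```python
# Velocity gating as described above: below the threshold the velocity
# drives the deformation directly; at or above it, the last accepted
# value is decayed toward zero each frame instead of being cut to zero,
# so the vegetation animation eases out naturally.
import math

THRESHOLD = 1.0   # assumed modulus threshold
DECAY = 0.8       # assumed per-frame decay multiplier

def gate_velocity(velocity, state):
    speed = math.sqrt(sum(v * v for v in velocity))
    if speed < THRESHOLD:
        state["v"] = velocity                              # pass through
    else:
        state["v"] = tuple(v * DECAY for v in state["v"])  # ease to zero
    return state["v"]

state = {"v": (0.0, 0.0, 0.0)}
for frame_v in [(0.5, 0.0, 0.0), (0.9, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 0.0, 0.0)]:
    print(gate_velocity(frame_v, state))
```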
In the vegetation rendering method provided by this exemplary embodiment, after the deformation coordinates are computed, the target vegetation model is rendered by combining them with the offset coordinates computed in steps S420 to S440, which achieves a more natural and realistic interaction between the target vegetation model and the virtual character. Concretely, the rendering adds the computed offset coordinates and deformation coordinates to the real-time coordinates of each vertex of the target vegetation model, as shown in figs. 8 to 11. Fig. 8 shows the target vegetation before the virtual character acts on it, and figs. 9 to 11 show the target vegetation while it is being squeezed by the virtual character. It should be noted that this scenario is only an exemplary illustration and does not limit the protection scope of this exemplary embodiment.
Fig. 12 shows a schematic view of a target vegetation model exported from the art software, illustrating the vertices it contains and the faces those vertices form.
In addition, the degree and range of influence on the target vegetation model under the vegetation rendering method of this exemplary embodiment may be adjusted by the method above, or set through other operation interfaces: for example, the target vegetation model file can be adjusted on a material ball as shown in fig. 13, whose settings interface configures the wind force, affected range, inertia, squeezing degree, and so on of the target vegetation model.
It should be noted that although the steps of the methods in the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform, etc.
Further, in this example embodiment, a vegetation rendering apparatus is also provided. The vegetation rendering device can be applied to terminal equipment and also can be applied to a server. Referring to fig. 14, the vegetation rendering device 1400 may include a vegetation coordinate acquiring module 1410, a character coordinate acquiring module 1420, an offset value calculating module 1430, an offset coordinate calculating module 1440, and a rendering module 1450, wherein:
the vegetation coordinate acquisition module can be used for acquiring the real-time coordinates of each vertex of the target vegetation model and the real-time coordinates of the center point of each surface;
the character coordinate acquisition module can be used for acquiring real-time coordinates of the virtual character;
the offset value calculation module can be used for obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the central point and the real-time coordinates of the virtual character;
The offset coordinate calculation module may be configured to add the offset value to a real-time coordinate of a vertex of the target vegetation model to obtain an offset coordinate of the target vegetation model;
the rendering module may be configured to render the target vegetation model according to the offset coordinates.
The specific details of each module or unit in the vegetation rendering device are described in detail in the corresponding vegetation rendering method, so that the details are not repeated here.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments. For example, the electronic device may implement the steps shown in figs. 3 to 13.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. A vegetation rendering method, comprising:
acquiring real-time coordinates of each vertex of the target vegetation model and real-time coordinates of a central point of each surface;
acquiring real-time coordinates of the virtual character;
obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the center point and the real-time coordinates of the virtual character;
adding the offset value to the real-time coordinates of the vertexes of the target vegetation model to obtain the offset coordinates of the target vegetation model;
rendering the target vegetation model according to the offset coordinates;
the obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the center point and the real-time coordinates of the virtual character comprises the following steps:
vector operation is carried out on the real-time coordinates of the center point and the real-time coordinates of the virtual character, and the size and the direction of the offset value are determined according to the operation result.
2. The vegetation rendering method of claim 1, wherein the acquiring real-time coordinates of each vertex of the target vegetation model and real-time coordinates of the center point of each face comprises:
acquiring real-time coordinates of each vertex of the target vegetation model;
and obtaining the real-time coordinates of the center point of each face of the target vegetation model according to the real-time coordinates of each vertex and the difference coordinates, wherein the difference coordinates are differences between the original coordinates of the vertex of each face of the target vegetation model and the original coordinates of the center point of the corresponding face.
3. A vegetation rendering method according to claim 1, wherein the method further comprises:
acquiring a speed vector of the virtual character and an influence coefficient corresponding to the target vegetation model;
obtaining deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character according to the speed vector of the virtual character, the influence coefficient and the vertex color of the target vegetation model;
rendering the target vegetation model according to the offset coordinates comprises:
and rendering the target vegetation model according to the offset coordinates and the deformation coordinates.
4. A vegetation rendering method according to claim 3, wherein the obtaining deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character according to the velocity vector of the virtual character, the influence coefficient, and the vertex color of the target vegetation model comprises:
acquiring a vegetation height coefficient of the target vegetation model according to the vertex color of the target vegetation model;
and multiplying the speed vector, the influence coefficient and the vegetation height coefficient to obtain the deformation coordinate.
5. A vegetation rendering method according to claim 3, wherein the obtaining the deformation coordinates of the target vegetation model in the movement direction of the virtual character under the action of the virtual character according to the velocity vector of the virtual character, the influence coefficient, and the vertex color of the target vegetation model further comprises:
and determining that the modulus of the velocity vector of the virtual character is smaller than a preset threshold value.
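A one-line sketch of this gate; the threshold value is purely illustrative.

```python
import numpy as np

def deformation_enabled(velocity, threshold=5.0):
    """Claim 5 sketch: apply the trailing deformation only while the
    character's speed (the modulus of the velocity vector) stays below a
    preset threshold, presumably to avoid extreme stretching at speed."""
    return np.linalg.norm(velocity) < threshold
```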
6. The vegetation rendering method of any of claims 1-5, wherein the method further comprises:
and determining the range of the target vegetation model acted on by the virtual character according to the distance between the real-time coordinates of the center point and the real-time coordinates of the virtual character.
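For example, with the face centers stacked in an (n, 3) array, the acted-on range could be selected by a simple distance test; the influence radius is an assumption.

```python
import numpy as np

def faces_in_range(face_centers, character, radius=1.0):
    """Claim 6 sketch: a face belongs to the acted-on range when its
    center lies within an influence radius of the virtual character."""
    dists = np.linalg.norm(np.asarray(face_centers, dtype=float)
                           - np.asarray(character, dtype=float), axis=1)
    return dists < radius            # boolean mask over the faces
```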
7. A vegetation rendering device, comprising:
the vegetation coordinate acquisition module is used for acquiring the real-time coordinates of each vertex of the target vegetation model and the real-time coordinates of the center point of each face;
the character coordinate acquisition module is used for acquiring the real-time coordinates of the virtual character;
the offset value calculation module is used for obtaining an offset value generated by the target vegetation model under the action of the virtual character according to the real-time coordinates of the central point and the real-time coordinates of the virtual character;
the offset coordinate calculation module is used for adding the offset value to the real-time coordinates of the vertexes of the target vegetation model to obtain the offset coordinates of the target vegetation model;
the rendering module is used for rendering the target vegetation model according to the offset coordinates;
the offset value calculation module is configured to:
performing a vector operation on the real-time coordinates of the center point and the real-time coordinates of the virtual character, and determining the magnitude and direction of the offset value according to the operation result.
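A minimal sketch of how the offset-related modules could be wired together on the CPU; the class name, vectorized layout (one row per vertex, paired with its face center), falloff, and parameters are all assumptions.

```python
import numpy as np

class VegetationRenderer:
    """Illustrative wiring of the claimed modules (not the patented code)."""

    def __init__(self, radius=1.0, strength=0.3):
        self.radius, self.strength = radius, strength

    def offset_values(self, centers, character):
        # Offset value calculation module: push each face center away from
        # the character, with magnitude fading linearly over the radius.
        to_centers = centers - character                        # (n, 3)
        dists = np.linalg.norm(to_centers, axis=1, keepdims=True)
        falloff = np.clip(1.0 - dists / self.radius, 0.0, None)
        return self.strength * falloff * to_centers / np.maximum(dists, 1e-6)

    def offset_coordinates(self, vertices, centers, character):
        # Offset coordinate calculation module: add each face's offset to
        # the real-time coordinates of its vertices (rows aligned per vertex).
        return vertices + self.offset_values(centers, character)
```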
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the method of any of claims 1-6.
9. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-6 via execution of the executable instructions.
CN201911176102.8A 2019-11-26 2019-11-26 Vegetation rendering method and device, storage medium and electronic equipment Active CN110838162B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911176102.8A CN110838162B (en) 2019-11-26 2019-11-26 Vegetation rendering method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110838162A CN110838162A (en) 2020-02-25
CN110838162B true CN110838162B (en) 2023-11-28

Family

ID=69577357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911176102.8A Active CN110838162B (en) 2019-11-26 2019-11-26 Vegetation rendering method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110838162B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583373B (en) * 2020-05-11 2023-06-27 上海米哈游天命科技有限公司 Model rendering method, device, equipment and storage medium
CN111667563B (en) * 2020-06-19 2023-04-07 抖音视界有限公司 Image processing method, device, equipment and storage medium
CN111882637B (en) * 2020-07-24 2023-03-31 上海米哈游天命科技有限公司 Picture rendering method, device, equipment and medium
CN111882677B (en) * 2020-08-04 2024-02-23 网易(杭州)网络有限公司 Method and device for editing three-dimensional plant model, electronic equipment and storage medium
CN112132938B (en) * 2020-09-22 2024-03-12 上海米哈游天命科技有限公司 Model element deformation processing and picture rendering method, device, equipment and medium
CN112132936B (en) * 2020-09-22 2024-03-29 上海米哈游天命科技有限公司 Picture rendering method and device, computer equipment and storage medium
CN112132935A (en) * 2020-09-22 2020-12-25 上海米哈游天命科技有限公司 Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium
CN112132934A (en) * 2020-09-22 2020-12-25 上海米哈游天命科技有限公司 Model element deformation processing method, model element deformation processing device, model element image rendering method, model element image rendering device and model element image rendering medium
CN112206528B (en) * 2020-10-12 2024-03-01 网易(杭州)网络有限公司 Vegetation model rendering method, device, equipment and storage medium
CN112206534A (en) * 2020-11-04 2021-01-12 网易(杭州)网络有限公司 Vegetation object processing method, device, equipment and storage medium
CN112241993B (en) * 2020-11-30 2021-03-02 成都完美时空网络技术有限公司 Game image processing method and device and electronic equipment
CN112802166B (en) * 2021-01-18 2023-08-08 网易(杭州)网络有限公司 Display method, device, storage medium and equipment for simulating wind swing of virtual plants
CN112807685A (en) * 2021-01-22 2021-05-18 珠海天燕科技有限公司 Grassland rendering method, device and equipment based on game character trajectory
CN113034350B (en) * 2021-03-24 2023-03-24 网易(杭州)网络有限公司 Vegetation model processing method and device
CN113134239A (en) * 2021-05-13 2021-07-20 网易(杭州)网络有限公司 Rendering method, rendering device, electronic equipment and computer-readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6421048B1 (en) * 1998-07-17 2002-07-16 Sensable Technologies, Inc. Systems and methods for interacting with virtual objects in a haptic virtual reality environment
CN104700413A (en) * 2015-03-20 2015-06-10 中国人民解放军装甲兵工程学院 Real-time dynamic drawing method for vegetation in three-dimensional virtual scene
CN108389245A (en) * 2018-02-13 2018-08-10 鲸彩在线科技(大连)有限公司 Rendering method, device, electronic equipment and readable storage medium for a cartoon scene
CN109101690A (en) * 2018-07-11 2018-12-28 深圳地平线机器人科技有限公司 Method and apparatus for rendering scenes in a vehicular automatic driving simulator
CN109461199A (en) * 2018-11-15 2019-03-12 腾讯科技(深圳)有限公司 Picture rendering method and device, storage medium and electronic device
CN109934897A (en) * 2019-03-06 2019-06-25 珠海金山网络游戏科技有限公司 Swing effect simulation system, method, computing device and storage medium

Similar Documents

Publication Publication Date Title
CN110838162B (en) Vegetation rendering method and device, storage medium and electronic equipment
CN111145326B (en) Processing method of three-dimensional virtual cloud model, storage medium, processor and electronic device
JP4693159B2 (en) Program, information storage medium, and image generation system
CN109448137B (en) Interaction method, interaction device, electronic equipment and storage medium
CN112241993B (en) Game image processing method and device and electronic equipment
CN113658309A (en) Three-dimensional reconstruction method, device, equipment and storage medium
CN110930492B (en) Model rendering method, device, computer readable medium and electronic equipment
CN112580213A (en) Method and apparatus for generating display image of electric field lines, and storage medium
CN113516774A (en) Rendering quality adjusting method and related equipment
CN112528707A (en) Image processing method, device, equipment and storage medium
JP4754384B2 (en) Program, information recording medium, and image generation system
CN115861510A (en) Object rendering method, device, electronic equipment, storage medium and program product
US7710419B2 (en) Program, information storage medium, and image generation system
CN114581586A (en) Method and device for generating model substrate, electronic equipment and storage medium
US7724255B2 (en) Program, information storage medium, and image generation system
US11651548B2 (en) Method and apparatus for computer model rasterization
CN116206046B (en) Rendering processing method and device, electronic equipment and storage medium
CN112734896B (en) Environment shielding rendering method and device, storage medium and electronic equipment
CN117541704A (en) Model map processing method and device, storage medium and electronic equipment
CN116310022A (en) Flame special effect manufacturing method and device, storage medium and electronic equipment
JP2007026325A (en) Program, information storage medium, and image generation system
CN114581588A (en) Rendering method, device and system
CN116664738A (en) Image generation method and device, electronic equipment and computer readable medium
CN115937389A (en) Shadow rendering method, device, storage medium and electronic equipment
CN116309966A (en) Method and device for processing deformation of virtual object, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant