CN117274445A - Object generation method and device - Google Patents

Object generation method and device

Info

Publication number
CN117274445A
CN117274445A (application CN202311264152.8A)
Authority
CN
China
Prior art keywords
growth
node
target
dimension
tree
Prior art date
Legal status
Pending
Application number
CN202311264152.8A
Other languages
Chinese (zh)
Inventor
陈志聪
潘玮东
李宁谦
陈杰
何文雅
孟岩
Current Assignee
Zhuhai Kingsoft Digital Network Technology Co Ltd
Original Assignee
Zhuhai Kingsoft Digital Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Kingsoft Digital Network Technology Co Ltd filed Critical Zhuhai Kingsoft Digital Network Technology Co Ltd
Priority to CN202311264152.8A priority Critical patent/CN117274445A/en
Publication of CN117274445A publication Critical patent/CN117274445A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00: Arrangements for software engineering
    • G06F8/30: Creation or generation of source code
    • G06F8/36: Software reuse
    • G06F8/38: Creation or generation of source code for implementing user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides an object generation method and device. The object generation method comprises: acquiring a node tree corresponding to a target object, and determining a corresponding growth influence factor according to the node attributes of the node tree; calculating a growth weight and a growth parameter for each node in the node tree according to the growth influence factor, wherein the growth weight indicates the growth probability of the corresponding node and the growth parameter indicates the growth information of the corresponding node; screening a target node to be grown based on the growth weight of each node, and updating the node tree according to the target growth parameter, i.e. the growth parameter corresponding to the target node, to obtain an updated node tree; and generating a growth animation of the target object according to the updated node tree. In this way, the growth influence factor of the node tree is determined from the environmental factors in the scene and the growth animation of the target object is generated based on that factor, so that the growth animation is produced directly from the environmental factors of the scene in which the object grows.

Description

Object generation method and device
Technical Field
The application relates to the technical field of Internet, in particular to an object generation method. The present application also relates to an object generating apparatus, a computing device, and a computer-readable storage medium.
Background
In game scene design, the existing production flow generally generates a model offline in design software and then imports the model into a game engine for use. However, a model imported into the game engine cannot be modified directly in the engine; it can only be modified in the design software and re-imported, so real-time feedback cannot be achieved.
For example, Digital Content Creation (DCC) software prefabricates object models with several fixed forms. After such an object model is imported into a game engine, the artist cannot adjust the form of the object and can only generate the growth animation corresponding to the object from the model with its fixed form.
Disclosure of Invention
In view of this, an embodiment of the present application provides an object generating method. The present application also relates to an object generating apparatus, a computing device, and a computer-readable storage medium, which solve the technical drawbacks existing in the prior art.
According to a first aspect of an embodiment of the present application, there is provided an object generating method, including:
acquiring a node tree corresponding to a target object, and determining a corresponding growth influence factor according to node attributes of the node tree;
calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node;
screening target nodes to be grown based on the growth weights of the nodes, and updating the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes;
and generating a growth animation of the target object according to the updated node tree.
According to a second aspect of embodiments of the present application, there is provided an object generating apparatus, including:
The acquisition module is configured to acquire a node tree corresponding to the target object, and determine a corresponding growth influence factor according to the node attribute of the node tree;
the calculation module is configured to calculate the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node;
the screening module is configured to screen target nodes to be grown based on the growth weights of the nodes, and update the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes;
and the generation module is configured to generate a growth animation of the target object according to the updated node tree.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the object generation method when executing the computer instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the object generation method.
According to the object generation method, a node tree corresponding to a target object is obtained, and a corresponding growth influence factor is determined according to the node attribute of the node tree; calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node; screening target nodes to be grown based on the growth weights of the nodes, and updating the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes; and generating a growth animation of the target object according to the updated node tree.
In the method and device of the application, the growth weight and growth parameters of each node in the node tree are calculated from the growth influence factors, the target node to be grown is determined from the growth weights, the updated node tree is obtained based on the growth parameters, and the growth animation of the target object is generated from the updated node tree. The growth influence factors of the node tree are thus determined automatically from the environmental factors in the scene, and the growth animation of the target object is generated based on those factors. As a result, the growth animation is generated directly from the environmental factors of the scene, avoiding situations in which the environment and the object's growth animation do not match and repeated modification is required.
Drawings
FIG. 1 is a flow chart of an object generation method according to an embodiment of the present application;
FIG. 2a is a process flow diagram of an object generation method for a game engine according to one embodiment of the present application;
FIG. 2b is a schematic diagram of a node tree in an object generating method according to an embodiment of the present application;
FIG. 3a is a schematic flow chart of node tree update in an object generation method according to an embodiment of the present application;
FIG. 3b is a schematic diagram of a ray dimension in an object generating method according to an embodiment of the present application;
FIG. 3c is a schematic diagram of spatial dimensions in an object generating method according to an embodiment of the present application;
FIG. 4a is a schematic diagram of a tree model in an object generating method according to an embodiment of the present application;
FIG. 4b is a schematic diagram illustrating a tree model vertex shift in an object generating method according to an embodiment of the present application;
FIG. 4c is a schematic diagram of tree model bending in an object generating method according to an embodiment of the present application;
FIG. 5a is a first effect diagram of tree generation in an object generation method according to an embodiment of the present application;
FIG. 5b is a second effect diagram of tree generation in an object generation method according to an embodiment of the present application;
FIG. 5c is a third effect diagram of tree generation in an object generation method according to an embodiment of the present application;
FIG. 5d is a fourth effect diagram of tree generation in an object generation method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an object generating apparatus according to an embodiment of the present application;
FIG. 7 is a block diagram of a computing device according to one embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, this application can be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
The terminology used in one or more embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of one or more embodiments of the application. As used in this application in one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the present application to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first may also be referred to as a second, and similarly, a second may also be referred to as a first, without departing from the scope of one or more embodiments of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
First, terms related to one or more embodiments of the present application will be explained.
Game engine: a software program or environment that provides developers with the tools and application programming interfaces (APIs) required to author electronic games, create graphics, and build visualizations, covering content ranging from artificial intelligence (AI) and animation to physics simulation and audio.
Digital Content Creation (DCC): software used for two-dimensional/three-dimensional content creation, audio/video editing and composition, dynamic/interactive content authoring, image editing, and the like.
In game scene design, the existing production flow generally generates a model offline in design software and then imports the model into a game engine for use. However, a model imported into the game engine cannot be modified directly in the engine; it can only be modified in the design software and re-imported, so real-time feedback cannot be achieved.
For example, Digital Content Creation (DCC) software prefabricates object models with several fixed forms. After such an object model is imported into a game engine, the artist cannot adjust the form of the object and can only generate the growth animation corresponding to the object from the model with its fixed form.
To solve these technical problems, in the embodiments of the application the growth weight and growth parameters of each node in the corresponding node tree are calculated from the growth influence factors, the target node to be grown is determined from the growth weights, the updated node tree is obtained based on the growth parameters, and the growth animation of the target object is generated from the updated node tree. The growth influence factors of the node tree are thereby determined automatically from the environmental factors in the scene, and the growth animation of the target object is generated based on those factors.
In the present application, an object generating method is provided, and the present application relates to an object generating apparatus, a computing device, and a computer-readable storage medium, which are described in detail in the following embodiments one by one.
Fig. 1 shows a flowchart of an object generating method according to an embodiment of the present application, which specifically includes the following steps:
step 102: and acquiring a node tree corresponding to the target object, and determining a corresponding growth influence factor according to the node attribute of the node tree.
The object generating method provided in the embodiments of the present application is generally executed by the server, but in other embodiments of the present application, the client may also have a similar function as the server, so as to execute the object generating method provided in the present application. In other embodiments, the object generating method provided in the embodiments of the present application may be performed by a client and a server together. In practical applications, the client and/or the server include applications with object generation functions, and the following description will take the server executing the object generation method provided in the present application as an example.
When the object generation requirement exists, the server side can acquire a node tree corresponding to the target object, and determine a corresponding growth influence factor according to the node attribute of the node tree.
Specifically, the target object refers to an object to be generated, for example, the target object may be natural vegetation, artificial vegetation, and the like, such as trees, grasses, flowers, wheat, rice, and the like, and the generation of the target object is affected by the environment in which the target object is located. The node tree is a tree composed of nodes corresponding to the generation form of the target object, and the node tree comprises at least one node and an association relation between the nodes.
The node attributes are the growth attributes of each node in the node tree and may include a node identifier, parent node, child nodes, growth length, growth position, and so on. The node attributes may be kept in several storage tables, for example one table per attribute (one table storing the node identifiers, one storing the parent nodes, one storing the child nodes, and so on), or in a single storage table holding all node attributes. The node tree can be reconstructed from the node attributes, i.e. the node attributes generally carry the same information as the node tree. The growth influence factor is a factor that affects the generation of the target object and includes influence factors of the target object itself as well as environmental influence factors.
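For illustration only, the node attributes described above can be organized in a structure such as the following minimal Python sketch; the field names (node_id, parent_id, child_ids, position, direction, length) are hypothetical and simply mirror the attributes listed in this paragraph:

    from dataclasses import dataclass, field
    from typing import Dict, List, Optional, Tuple

    @dataclass
    class TreeNode:
        # Hypothetical node-attribute fields mirroring the attributes described above.
        node_id: int
        parent_id: Optional[int]                                  # None for the root node
        child_ids: List[int] = field(default_factory=list)
        position: Tuple[float, float, float] = (0.0, 0.0, 0.0)    # growth position
        direction: Tuple[float, float, float] = (0.0, 1.0, 0.0)   # growth direction
        length: float = 0.0                                        # growth length

    # The node tree can then be kept as a single storage table keyed by node identifier.
    # An object that has not yet grown corresponds to a tree containing only the root node.
    node_tree: Dict[int, TreeNode] = {0: TreeNode(node_id=0, parent_id=None)}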
The implementation of obtaining the node tree corresponding to the target object differs depending on the current state of the target object; it is determined according to the actual situation and is not limited herein.
In one possible implementation of the method, the target object already has an existing structure. When secondary generation is needed, the node tree of the target object is obtained from the designated location where it is stored, and the node tree reflects the current growth state.
In another possible implementation of the application, if the target object has not yet grown, a node tree containing only a root node is obtained.
The implementation manner of determining the corresponding growth influence factors according to the node attributes of the node tree can be that the environment parameters of the current scene are obtained, and the growth influence factors corresponding to all the nodes are determined based on the environment parameters and the node attributes of the node tree; the method can also comprise the steps of obtaining environment parameters corresponding to all nodes in the node tree, and determining growth influence parameters corresponding to all the nodes based on the environment parameters and the node attributes corresponding to all the nodes.
Optionally, determining the corresponding growth influence factor according to the node attribute of the node tree may be obtaining the growth influence factor of each node in the node tree according to the node attribute of the node tree, or may be determining at least one designated node that needs to be grown from each node in the node tree, and determining the growth influence factor corresponding to the at least one designated node.
In an alternative embodiment of the present application, the step of determining the corresponding growth influencing factor according to the node attribute of the node tree includes the following steps:
searching at least one node to be grown from the node tree;
and determining the growth influence factors of the nodes to be grown according to the node attribute of at least one node to be grown.
Specifically, a node to be grown is a node in the node tree that satisfies a growth condition. The growth condition may be that the number of child nodes of the node has not reached a preset threshold; for example, if the preset threshold is 4 and node L has 2 child nodes, node M has 3, and node N has 4, then node L and node M can be taken as nodes to be grown for subsequent judgment. The growth condition may also be that the nutrient provided by the environment exceeds a nutrient threshold; for example, if the nutrient threshold is 80%, the nutrient of node A is 90%, and the nutrient of node B is 70%, then node A can be taken as a node to be grown for subsequent processing.
The implementation manner of searching at least one node to be grown from the node tree may be traversing each node in the node tree, determining a node meeting the growth condition, and determining the node meeting the growth condition as the node to be grown; at least one node meeting the growth condition in the node tree can be selected randomly as a node to be grown.
Illustratively, a growth time interval of 10 frames is set, and a growth influence factor is determined for the nodes to be grown in each such interval.
The implementation manner of determining the growth influence factor of each node to be grown according to the node attribute of at least one node to be grown may be to obtain the environmental parameter corresponding to at least one node to be grown, and determine the growth influence factor of each node to be grown based on the environmental parameter and the node attribute corresponding to each node to be grown.
By applying the scheme of this embodiment, at least one node to be grown is found in the node tree and its growth influence factors are determined from its node attributes. Because the nodes of the node tree are filtered first, growth influence factors are determined only for the nodes to be grown that were found, which improves the efficiency of determining the growth influence factors and reduces the computation required for the subsequent determination of the target node and its growth.
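As a minimal sketch of this filtering step, assuming the two example growth conditions mentioned above (child-node count below a preset threshold, nutrient above a nutrient threshold), the search could look as follows; the thresholds and the nutrient attribute are illustrative only:

    def find_nodes_to_grow(node_tree, max_children=4, nutrient_threshold=0.8):
        # Traverse every node and keep those that satisfy a growth condition.
        candidates = []
        for node in node_tree.values():
            has_free_slot = len(node.child_ids) < max_children           # e.g. fewer than 4 children
            enough_nutrient = getattr(node, "nutrient", 1.0) > nutrient_threshold
            if has_free_slot and enough_nutrient:    # either criterion alone may also be used
                candidates.append(node)
        return candidates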
Step 104: and calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node.
Specifically, the growth weight is used to indicate the probability of node growth, and the greater the weight, the greater the probability of node growth, the growth weight can be further understood as the priority of node growth, and the greater the weight, the higher the priority. The growth parameters are used to indicate growth information of the corresponding node, for example, the growth parameters may include a growth direction, a growth length, etc. of the node when growing.
According to the growth influence factors, the implementation mode of calculating the growth weight and the growth parameters of each node in the node tree can be aimed at the growth influence factors of any dimension, calculating the growth weight and the growth parameters of each node in the node tree under the any dimension, and fusing the growth weight and the growth parameters corresponding to each node in each dimension to obtain the growth weight and the growth parameters of each node in the node tree.
Optionally, the implementation manner of calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factor may be that at least one node to be grown is searched from the node tree, and the growth weight and the growth parameter of each node to be grown in the node tree are calculated according to the growth influence factor.
Optionally, the growth influencing factors include growth influencing factors of at least one dimension, wherein the dimension can be an illumination dimension, a space dimension, a nutrient dimension and the like, and the corresponding dimension is selected according to the actual growth environment factors of the target object to judge the growth of the target object.
In an optional embodiment of the present application, the growth influence factor corresponds to at least one dimension, and the growth influence factor of each dimension includes an environmental parameter and a node parameter of the node tree, where the node parameter is determined by the node attributes of the node tree. The step of calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors comprises the following steps:
aiming at a target growth node, calculating a growth sub-weight and a growth sub-parameter of the target growth node in each dimension according to a growth influence factor of at least one dimension, wherein the target growth node is any one of the nodes;
fusing the weights of the growers of the target growing nodes in each dimension to obtain the growing weights of the target growing nodes;
and fusing the growth sub-parameters of the target growth node in each dimension to obtain the growth parameters of the target growth node.
Specifically, the environmental parameter refers to an environment affecting the growth of the node tree in the current scene, for example, if the target object is a tree, the environmental parameter may be an obstacle for shielding illumination. The node parameters refer to parameters in the node tree that affect the growth in the node tree, for example, if the target object is a tree, the node parameters may be the number of parent nodes or child nodes connected to the node on the tree, the spatial distance between the parent nodes or child nodes and other nodes, and so on.
For the target growth node, according to the growth influence factor of at least one dimension, the implementation manner of calculating the growth sub-weight and the growth sub-parameter of the target growth node in each dimension may be to calculate the growth sub-weight and the growth sub-parameter of the target growth node in the target dimension according to the target growth node and the target dimension, wherein the target dimension is any one of the at least one dimension.
The implementation manner of fusing the growth sub-weights of the target growth node under each dimension to obtain the growth weight of the target growth node may be to perform weighted summation on the growth sub-weights of the target growth node under each dimension to obtain the growth weight of the target growth node, for example, each dimension corresponds to a corresponding growth coefficient, multiply the growth sub-weights of each dimension with the corresponding growth coefficients, and add the multiplication results to obtain the growth weight of the target growth node, where the growth coefficients are preset in a growth program to be used for adjusting the influence proportion of growth influence factors of different dimensions on the growth of the target object.
The implementation of fusing the growth sub-parameters of the target growth node in each dimension to obtain the growth parameters of the target growth node may be to classify the growth sub-parameters of each dimension and fuse the sub-parameters belonging to the same class, obtaining the growth parameters of the target growth node per class. For example, the growth sub-parameters of each dimension may be divided into a direction class and a length class; the parameters belonging to the direction class are fused to obtain the growth direction of the target growth node, and the parameters belonging to the length class are fused to obtain the growth length of the target growth node.
Alternatively, the way to fuse the parameters belonging to the direction category may be normalization, weighted summation, etc.
Illustratively, the at least one dimension includes an illumination dimension, a space dimension, and a nutrient dimension. For the target growth node, an illumination growth sub-weight and an illumination growth sub-parameter are calculated in the illumination dimension, a space growth sub-weight and a space growth sub-parameter are calculated in the space dimension, and a nutrient growth sub-weight and a nutrient growth sub-parameter are calculated in the nutrient dimension. The illumination, space, and nutrient growth sub-weights are fused to obtain the growth weight of the target growth node, and the illumination, space, and nutrient growth sub-parameters are fused to obtain the growth parameter of the target growth node.
By applying the scheme of this embodiment, the growth influence factors correspond to at least one dimension, and the growth influence factor of each dimension includes an environmental parameter and a node parameter. When the growth weights and growth parameters of the nodes in the node tree are calculated from the growth influence factors, the growth sub-weights and growth sub-parameters of each node in each dimension are fused to obtain its growth weight and growth parameters. Because growth influence factors of multiple dimensions are considered, the resulting generation of the target object is closer to reality.
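The fusion described above can be sketched as a weighted sum of the per-dimension sub-weights plus a normalization of the per-dimension direction sub-parameters; the dimension names and coefficient values below are purely illustrative assumptions:

    import math

    def fuse_growth_weight(sub_weights, coefficients):
        # Weighted sum of per-dimension sub-weights using the preset growth coefficients.
        return sum(sub_weights[dim] * coefficients[dim] for dim in sub_weights)

    def fuse_growth_direction(sub_directions):
        # Sum the per-dimension direction sub-parameters and normalize the result.
        sx = sum(d[0] for d in sub_directions.values())
        sy = sum(d[1] for d in sub_directions.values())
        sz = sum(d[2] for d in sub_directions.values())
        norm = math.sqrt(sx * sx + sy * sy + sz * sz) or 1.0
        return (sx / norm, sy / norm, sz / norm)

    # Illustrative values for the illumination, space, and nutrient dimensions.
    weights = {"light": 0.6, "space": 0.8, "nutrient": 0.5}
    coefficients = {"light": 0.5, "space": 0.3, "nutrient": 0.2}
    growth_weight = fuse_growth_weight(weights, coefficients)            # 0.64
    growth_direction = fuse_growth_direction({"light": (0.0, 1.0, 0.0),
                                              "space": (0.5, 0.5, 0.0),
                                              "nutrient": (0.0, 1.0, 0.0)})

Adjusting the coefficients changes how strongly each dimension influences the growth of the target object, matching the role of the preset growth coefficients described above.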
In an alternative embodiment of the present application, the at least one dimension includes an illumination dimension, and the step of calculating the growth sub-weight and growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises:
defining a first preset range of a target growth node, and performing stepping of a plurality of rays within the first preset range;
measuring the distance of the plurality of ray steps according to the growth influence factors of the illumination dimension;
and calculating the illumination growth sub-weight and the illumination growth sub-parameter of the target growth node in the illumination dimension based on the distance of the plurality of ray steps.
Specifically, the first preset range is a preset range used to limit the growth range of the node, so as to avoid the angle between a child node of the node and the node's growth direction exceeding 180 degrees. For example, the first preset range may be a hemisphere or a spherical sector centered on the node and oriented from the node's parent node towards the node. Ray stepping refers to emitting rays from a point taken as the starting point.
The implementation of defining the first preset range of the target growth node and stepping a plurality of rays within it may be to define a hemispherical range centered on the target growth node and oriented along the direction from the parent node of the target node to the target node, and to emit a plurality of rays from the target growth node within that hemisphere. The radius of the hemisphere may be any radius, and the rays may be stepped in random directions within the hemispherical range.
The implementation manner of measuring the distance of the plurality of light steps according to the growth influence factor of the illumination dimension may be to determine the blocked condition of the plurality of stepping light according to the growth influence factor of the illumination dimension, and determine the distance of the plurality of light steps based on each blocked condition.
The implementation of calculating the illumination growth sub-weight and illumination growth sub-parameter of the target growth node in the illumination dimension based on the distances of the plurality of ray steps may be to select, based on a preset distance threshold, at least one target stepping ray whose distance is greater than the preset distance threshold, and to determine the illumination growth sub-weight and illumination growth sub-parameter of the target growth node from the at least one target stepping ray.
The implementation of determining the illumination growth sub-weight and illumination growth sub-parameter of the target growth node in the illumination dimension from the at least one target stepping ray may be to determine the growth sub-weight from the number of target stepping rays and the total number of stepped rays, and to determine the growth sub-parameter from the parameters of a target stepping ray that satisfies a selection condition.
Optionally, the growth sub-weight may be obtained by dividing the number of target stepping rays by the total number of stepped rays.
Optionally, the growth sub-parameter is a growth direction, and the direction of the target stepping ray with the longest distance is selected as the illumination growth direction.
By applying the scheme of this embodiment, when the at least one dimension is the illumination dimension, a first preset range of the target growth node is defined, a plurality of rays are stepped within that range, the distances of the ray steps are measured, and the illumination growth sub-weight and illumination growth sub-parameter of the target growth node are determined from those distances. Since illumination affects trees along straight lines, stepping rays within the first preset range of the target growth node and deriving the growth sub-weight and growth sub-parameter in the illumination dimension from the ray-stepping results means that the influence of illumination on the growth of the target object is fully considered when determining the growth weights and growth parameters.
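A minimal sketch of this illumination-dimension calculation is given below; cast_ray is an assumed scene-query callback that returns the unblocked distance along a direction, and the ray count, maximum distance, and distance threshold are illustrative:

    import math
    import random

    def light_sub_weight_and_direction(cast_ray, num_rays=32, max_dist=10.0, dist_threshold=5.0):
        # Step several rays inside a hemisphere around the node and keep those
        # whose unblocked distance exceeds the preset distance threshold.
        passed, best_dir, best_dist = 0, None, 0.0
        for _ in range(num_rays):
            # Uniform random direction in the hemisphere oriented along the local "up" axis.
            z = random.random()
            r = math.sqrt(max(0.0, 1.0 - z * z))
            phi = random.uniform(0.0, 2.0 * math.pi)
            direction = (r * math.cos(phi), z, r * math.sin(phi))
            dist = min(cast_ray(direction), max_dist)
            if dist > dist_threshold:
                passed += 1
                if dist > best_dist:
                    best_dist, best_dir = dist, direction
        sub_weight = passed / num_rays     # share of rays that are not blocked
        return sub_weight, best_dir        # best_dir: direction of the longest unblocked ray

Returning the direction of the longest unblocked ray corresponds to selecting it as the illumination growth direction, as described above.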
In an alternative embodiment of the present application, the at least one dimension includes a space dimension, and the step of calculating the growth sub-weight and growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises:
defining a second preset range of the target growth node, and generating a plurality of space detection points in the second preset range;
counting overlapped conditions of a plurality of space detection points according to the growth influence factors of the space dimension;
based on the overlapped condition of the plurality of spatial detection points, the spatial growth sub-weight and the spatial growth sub-parameter of the target growth node in the spatial dimension are calculated.
Specifically, the second preset range is used to limit the growth direction of the node so as to avoid the angle between a child node of the node and the node's growth direction exceeding 180 degrees; the second preset range may be the same as the first preset range. The space detection points are points distributed randomly in a preset manner, for example a preset number of points randomly scattered within the second preset range. The overlapped condition describes which space detection points are eliminated. For example, if the second preset range is divided into region 1 and region 2, and region 2 overlaps the second preset range of other nodes and/or an obstacle, the space detection points in region 2 are eliminated while those in region 1 remain; this outcome is the overlapped condition.
The implementation manner of defining the second preset range of the target growth node and generating the plurality of spatial detection points in the second preset range may be to define a hemispherical range by taking the target growth node as a center of sphere and taking the direction from the father node of the target node to the target node as an orientation, and randomly distributing the plurality of spatial detection points in the hemispherical range, where the plurality of spatial detection points may be any positions in the hemispherical range, and the number of spatial detection points may be preset, or may be temporarily set according to the environmental parameters of the current scene.
According to the growth influence factors of the space dimension, the implementation mode of counting the overlapped condition of the plurality of space detection points can be to determine the position information of the obstacle in the scene and the second preset range of other nodes on the node tree according to the growth influence factors of the space dimension, determine the position information and the second preset range of other nodes as an overlapped area, eliminate the space detection points in the overlapped area according to the overlapped area and the second preset range, and obtain the overlapped condition of the plurality of space detection points, wherein the selection of the other nodes can be all nodes except the target growth node in the node tree, or can be to determine the node with the preset distance from the target growth node based on the position of the target growth node.
Based on the overlapped condition of the plurality of space detection points, the implementation manner of calculating the space growth sub-weight and the space growth sub-parameter of the target growth node in the space dimension can be based on the overlapped condition of the plurality of space detection points, counting the space detection points which are not eliminated, determining the space growth sub-weight of the target growth node in the space dimension based on the plurality of space detection points and the space detection points which are not eliminated, and determining the space growth sub-parameter of the target growth node in the space dimension based on the space detection points which are not eliminated.
Based on the plurality of space detection points and the space detection points that were not eliminated, the implementation of determining the space growth sub-weight of the target growth node in the space dimension may be to divide the number of space detection points that were not eliminated by the total number of space detection points, obtaining the space growth sub-weight of the target growth node in the space dimension.
Based on the space detection points which are not eliminated, determining the implementation mode of the space growth subparameter of the target growth node in the space dimension can be to fuse the appointed parameters of the space detection points which are not eliminated to obtain the space growth subparameter of the target growth node in the space dimension, wherein the appointed parameters can be the growth direction, the growth direction of the space detection points is the direction from the target growth node to the position of the space detection points, and the fusion can be to average the growth direction of the space detection points which are not eliminated.
By applying the scheme of this embodiment, when the at least one dimension is the space dimension, a second preset range of the target growth node is defined, a plurality of space detection points are generated within that range, the overlapped condition of the space detection points is counted, and the space growth sub-weight and space growth sub-parameter of the target growth node are determined from the overlapped condition. Since the influence of space on trees depends on how crowded the surrounding space is, generating detection points in the second preset range of the target growth node and deriving the growth sub-weight and growth sub-parameter in the space dimension from their overlapped condition means that the influence of space on the growth of the target object is fully considered when determining the growth weights and growth parameters.
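A minimal sketch of this space-dimension calculation follows; is_overlapped is an assumed scene/tree query that reports whether a detection point falls inside an obstacle or inside the preset range of another node, and the point count and radius are illustrative:

    import math
    import random

    def space_sub_weight_and_direction(is_overlapped, num_points=64, radius=1.0):
        kept = []
        for _ in range(num_points):
            # Random detection point inside a hemisphere of the given radius.
            z = random.random()
            r = math.sqrt(max(0.0, 1.0 - z * z))
            phi = random.uniform(0.0, 2.0 * math.pi)
            rad = radius * random.random() ** (1.0 / 3.0)
            point = (rad * r * math.cos(phi), rad * z, rad * r * math.sin(phi))
            if not is_overlapped(point):     # points in overlapped regions are eliminated
                kept.append(point)
        sub_weight = len(kept) / num_points  # share of points that were not eliminated
        if not kept:
            return sub_weight, None
        # Space growth direction: average of directions towards the remaining points, normalized.
        avg = [sum(p[i] for p in kept) / len(kept) for i in range(3)]
        norm = math.sqrt(sum(c * c for c in avg)) or 1.0
        return sub_weight, tuple(c / norm for c in avg)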
In an alternative embodiment of the present application, the at least one dimension includes a nutrient dimension, and the step of calculating the growth sub-weight and growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises:
calculating the distance between a target growth node and a root node in a node tree according to the growth influence factors of the nutrient dimensions;
Based on the distance, a nutrient growth sub-weight and a nutrient growth sub-parameter of the target growth node in the nutrient dimension are calculated.
Specifically, the nutrients refer to a general term for supplying nutrients to the target object, and the nutrients may include moisture, trace elements in the soil, and the like.
The implementation manner of calculating the distance between the target growth node and the root node in the node tree according to the growth influence factor of the nutrient dimension can be to determine the logical distance between the target growth node and the root node, the actual distance and the node type of the target growth node according to the growth influence factor of the nutrient dimension.
Based on the distance, the implementation manner of calculating the nutrient growth sub-weight and the nutrient growth sub-parameter of the target growth node in the nutrient dimension can be based on the logic distance and the actual distance between the target growth node and the root node, calculate the nutrient growth sub-weight of the target growth node in the nutrient dimension, and determine the nutrient growth sub-parameter of the target growth node in the nutrient dimension based on the node type of the target growth node.
Based on the logical distance and the actual distance between the target growth node and the root node, the implementation manner of calculating the nutrient growth sub-weight of the target growth node in the nutrient dimension can be that the logical distance between the target growth node and the root node is multiplied by the inverse of the actual distance to obtain the nutrient growth sub-weight of the target growth node in the nutrient dimension.
The implementation of determining the nutrient growth sub-parameter of the target growth node in the nutrient dimension based on the node type of the target growth node may be to obtain a preset reference parameter corresponding to the node type and to determine the nutrient growth sub-parameter from that preset reference parameter; the parameter may be a distance, in which case the nutrient growth sub-parameter is a growth distance.
By applying the scheme of this embodiment, when the at least one dimension is the nutrient dimension, the distance between the target growth node and the root node of the node tree is calculated according to the growth influence factor of the nutrient dimension, and the nutrient growth sub-weight and nutrient growth sub-parameter of the target growth node are calculated from that distance. Since the influence of nutrients on a tree depends on the distance between the root node and the target growth node, deriving the growth sub-weight and growth sub-parameter in the nutrient dimension from that distance means that the influence of nutrients on the growth of the target object is fully considered when determining the growth weights and growth parameters.
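A minimal sketch of this nutrient-dimension calculation is shown below; the node types and reference distances are hypothetical examples of the preset reference parameters mentioned above:

    def nutrient_sub_weight(logical_distance, actual_distance):
        # Logical distance (edge count from the root) times the reciprocal of the
        # actual spatial distance to the root, as described above.
        if actual_distance <= 0.0:
            return 0.0
        return logical_distance / actual_distance

    # Hypothetical preset reference parameters per node type, used as the
    # nutrient growth sub-parameter (a reference growth distance).
    REFERENCE_DISTANCE = {"trunk": 1.0, "branch": 0.6, "twig": 0.3}

    def nutrient_sub_parameter(node_type):
        return REFERENCE_DISTANCE.get(node_type, 0.5)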
Step 106: and screening target nodes to be grown based on the growth weights of the nodes, and updating the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes.
Specifically, the updated node tree is the node tree obtained by updating the node tree based on the target growth parameters. The update may modify the node attributes of the node tree. Modifying the node attributes may mean changing existing content: for example, after determining the growth direction and growth length with which the target node continues to grow, the growth direction and corresponding growth length of the target node are modified in the node attributes. Modifying the node attributes may also mean adding new content: for example, after determining the growth direction and growth length with which the target node continues to grow, a child node is added to the target node and the node attributes related to that child node are added.
The implementation manner of screening the target node to be grown based on the growth weight of each node may be to compare the growth weights of each node, and select the node corresponding to the target growth weight as the target node to be grown.
The implementation manner of updating the node tree according to the target growth parameters may be to update the target nodes in the node tree according to the target growth parameters to obtain an updated node tree, and record the updated content to the node attribute to obtain the updated node attribute.
Optionally, the updating of the target node may be generating a child node of the target node, or may update a parameter corresponding to the target node, for example, the parameter may be a growth length, a growth direction, or the like.
In an optional embodiment of the present application, the step of screening the target node to be grown based on the growth weight of each node includes the following steps:
comparing the growth weights of the nodes, and determining target growth weights from the growth weights;
and determining the node corresponding to the target growth weight as a target node.
The implementation manner of comparing the growth weights of the nodes and determining the target growth weight from the growth weights may be to compare the values of the growth weights of the nodes and select the growth weight with the largest value as the target growth weight.
By applying the scheme of the embodiment of the application, the growth weights of the nodes are compared, the target growth weight is determined from the growth weights, and the node corresponding to the target growth weight is determined as the target node, so that the node corresponding to the target growth weight, namely the node with the best growth condition, is grown, automatic node screening is realized, and the subsequent growth of the target object is performed based on the node.
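The screening step above amounts to taking the node with the largest fused growth weight, for example (node identifiers and weights are illustrative):

    def select_target_node(growth_weights):
        # growth_weights maps node_id -> fused growth weight; pick the maximum.
        return max(growth_weights, key=growth_weights.get)

    target_node_id = select_target_node({3: 0.42, 5: 0.77, 8: 0.61})   # -> 5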
In an alternative embodiment of the present application, the step of updating the node tree according to the target growth parameter to obtain an updated node tree includes the following steps:
determining whether the target growth parameter indicates an increased node;
if yes, constructing child nodes of the target node in the node tree according to the target growth parameters, and obtaining an updated node tree.
The implementation of determining whether the target growth parameters indicate adding a node may be to analyze the target growth parameters and determine whether they indicate an added node.
If they do, the implementation of constructing a child node of the target node in the node tree according to the target growth parameters to obtain the updated node tree may be to analyze the target growth parameters, obtain the node attributes of the child node of the target node, and update the node tree based on those node attributes.
Optionally, after determining whether the target growth parameter indicates an added node, modifying the node attribute of the target node in the node tree if the target growth parameter does not indicate an added node.
By applying the scheme of this embodiment, it is determined whether the target growth parameters indicate an added node, and if so a child node of the target node is constructed in the node tree according to the target growth parameters to obtain the updated node tree. The node tree is thus updated whenever nodes need to be added, which improves the efficiency of generating the target object from the updated node tree.
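When the target growth parameters indicate that a node should be added, the update can be sketched as creating a child of the target node and recording its attributes; the sketch below reuses the hypothetical TreeNode structure shown earlier, so the field names remain assumptions:

    def add_child_node(node_tree, target_id, growth_direction, growth_length):
        parent = node_tree[target_id]
        child_id = max(node_tree) + 1                      # new node identifier
        # Child position: parent position offset along the growth direction.
        position = tuple(parent.position[i] + growth_direction[i] * growth_length
                         for i in range(3))
        node_tree[child_id] = TreeNode(node_id=child_id, parent_id=target_id,
                                       position=position, direction=growth_direction,
                                       length=growth_length)
        parent.child_ids.append(child_id)                  # update the parent's child list
        return node_tree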
Alternatively, there are various implementations after updating the node tree according to the target growth parameter to obtain the updated node tree, and the determination is specifically made according to the actual situation, which is not limited herein.
In an optional embodiment of the present application, after the step of updating the node tree according to the target growth parameters to obtain the updated node tree, an initial target object is obtained and a growth animation of the target object is generated based on the updated node tree and the initial target object. The growth animation is the animation of the object corresponding to the target node: for example, based on the node attributes of the target node in the updated node tree, the initial target object continues to grow, and the growth animation is the animation of that continued growth.
By obtaining the initial target object and generating the growth animation directly from the updated node tree and the initial target object, a segment of growth animation is generated each time the node tree is updated based on the determined target node, so that the determination of a target node and the corresponding growth animation of the target object are updated in real time.
In another optional embodiment of the present application, after the step of updating the node tree according to the target growth parameter to obtain the updated node tree, the method further includes the following steps:
And returning to execute the step of determining the corresponding growth influence factors according to the node attributes of the node tree until the execution stop condition is reached, and obtaining the final updated target node tree.
Specifically, the execution stop condition is used to define the final growth form of the target object. For example, the execution stop condition may set a threshold on the number of nodes in the node tree: if the threshold is 10, the step of determining the corresponding growth influence factors according to the node attributes of the node tree is no longer performed once the node tree contains 10 nodes. The stop condition may also be that the nutrient of every node in the node tree does not exceed a nutrient threshold: for example, if the nutrient threshold is 80% and the nutrient of every node is 80% or less, it is determined that the step of determining the corresponding growth influence factors according to the node attributes of the node tree is no longer performed.
Returning to the step of determining the corresponding growth influence factors according to the node attributes of the node tree until the execution stop condition is reached yields the final updated target node tree. The display effect is that, after the node tree has been fully generated, the growth animation of the target object corresponding to the node tree is rendered step by step according to the node attributes of the final node tree. For example, the nodes of the node tree are layered, and the corresponding growth animation is generated layer by layer from the root node to the last layer of child nodes; for each layer, a growth duration and a growth distance are determined, and the growth animation of that layer is generated based on them.
By applying the scheme of this embodiment, after the node tree is updated according to the target growth parameters to obtain the updated node tree, the step of determining the corresponding growth influence factors according to the node attributes of the node tree is performed again until the execution stop condition is reached, and the node tree at that point is taken as the final updated target node tree. The growth animation of the target object is then generated from the complete node tree, which improves the continuity of the generated growth animation and thus the user experience.
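The overall iteration (compute growth influence factors, weights, and parameters; grow one target node; repeat until the stop condition is reached) can be sketched as follows. It builds on the earlier sketches, and compute_growth_factors and compute_weight_and_parameters are assumed placeholders for the per-dimension logic shown above; the node-count threshold is illustrative:

    def grow(node_tree, max_nodes=10):
        # Repeat the growth step until the stop condition (here a node-count threshold) is met.
        while len(node_tree) < max_nodes:
            candidates = find_nodes_to_grow(node_tree)
            if not candidates:
                break                                     # e.g. no node satisfies the growth condition
            weights, params = {}, {}
            for node in candidates:
                factors = compute_growth_factors(node)                     # growth influence factors
                weights[node.node_id], params[node.node_id] = \
                    compute_weight_and_parameters(node, factors)           # fused weight and parameters
            target_id = max(weights, key=weights.get)                      # target node to be grown
            direction, length = params[target_id]                          # target growth parameters
            add_child_node(node_tree, target_id, direction, length)
        return node_tree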
Step 108: and generating a growth animation of the target object according to the updated node tree.
Specifically, the growth animation refers to a video of dynamic growth, for example, when the target object is a tree, the growth animation is a video of dynamic growth of the tree, and specifically, the growth animation can be a process from the capping of the trunk to the growing to the thickening.
The implementation of generating the growth animation of the target object according to the updated node tree may be to generate the growth animation according to growth animation rendering parameters and the updated node tree, where the growth animation rendering parameters define the growth rhythm of the target object and include a single growth duration and a single growth distance.
In an alternative embodiment of the present application, the step of generating a growth animation of the target object according to the updated node tree includes the following steps:
obtaining a growth animation rendering parameter of a target object, wherein the animation rendering parameter comprises a single growth duration and a single growth distance;
and rendering the target node tree according to the single growth duration and the single growth distance to generate a growth animation of the target object.
The implementation manner of obtaining the growth animation rendering parameters of the target object may obtain the growth animation rendering parameters of the target object based on the target object from a storage position where the rendering parameters are stored in advance, where the growth rendering parameters may correspond to the target object or may correspond to the type of the target object.
Rendering the target node tree according to the single growth duration and the single growth distance to generate the growth animation of the target object may be implemented by acquiring an object model corresponding to the target node tree, establishing a correspondence between the object model and each node in the target node tree based on the single growth duration and the single growth distance, and generating the growth animation of the target object based on the object model, the target node tree and the correspondence.
The object model is a basic model of the target object. It is expanded, contracted and offset on a per-vertex basis, with the offset obtained by multiplying a random noise function by a sin(x)/x function that converges along the x axis. The trunk model may be a polygonal uncapped cylindrical model, such as a hexagonal uncapped cylindrical model, and the leaf model may be a square patch.
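The per-vertex offset can be sketched as follows; this is an assumption-laden illustration of multiplying a random noise value by a sin(x)/x envelope, not the engine's actual vertex processing, and the amplitude parameter is hypothetical.

```python
import math
import random

def sinc(x: float) -> float:
    """sin(x)/x, taken as 1 at x = 0 so the envelope converges along the x axis."""
    return 1.0 if abs(x) < 1e-6 else math.sin(x) / x

def bend_offset(x: float, seed: int = 0, amplitude: float = 0.3) -> float:
    """Offset applied to a trunk vertex at parameter x along the branch axis:
    random noise * sin(x)/x keeps the bend bounded as x grows."""
    rng = random.Random(seed * 10007 + int(round(x * 1000)))   # deterministic per-vertex noise
    noise = rng.uniform(-1.0, 1.0)
    return amplitude * noise * sinc(x)

# Example: offsets sampled at 11 positions along a branch
offsets = [bend_offset(i * 0.5, seed=42) for i in range(11)]
```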
By applying the scheme of this embodiment of the application, the growth animation rendering parameters of the target object are obtained, and the target node tree is rendered according to the single growth duration and the single growth distance to generate the growth animation of the target object. Because the rendering of the target node tree is driven by the growth animation rendering parameters, the growth animation is rendered stably and the different parts of the target object are generated in relative balance.
According to one or more embodiments of the invention, the logic judgment algorithm for generating the growth animation of the target object is written into the server. During actual execution, a worker only needs to adjust the related parameters in the logic judgment algorithm, such as the environment parameters and the coefficients of the different dimensions; the target object in the scene then simulates the real-world growth rules of the object according to the constraint rules of these parameters and factors such as illumination, nutrients and space, and objects of different forms are generated in the game scene, so that the objects in the scene blend with the environment more naturally.
The object generation method is further described below with reference to fig. 2, taking the application of the method to generating a tree in a game engine as an example. Fig. 2 shows a process flow chart of an object generation method applied to a game engine according to an embodiment of the present application, which specifically includes the following steps:
A region M is defined in the game scene for generating trees. The number of trees is set to y, and the number of nodes allocated to each tree is set to 10. The index of each node within the node tree it belongs to is denoted by x, and the coordinate pair (x, y) is used as the unique index of the node. Each node corresponds to a plurality of buffers, including: a parent node index buffer (stores the parent node index), a child node index buffer (stores the child node indexes; one buffer can store at most four forks), a position buffer (stores the node position), a direction buffer (stores the crotch growth direction), a depth buffer (records the depth of the node in the tree for convenient traversal), and a temporary buffer (records temporary data for the next frame's calculation).
In the following, p indicates the index recorded in the parent node index buffer, and c indicates the indexes recorded in the child node index buffer.
y points are scattered at random over the region M to determine the y trees to be generated, the number of nodes on each tree is set to be no more than 10, and the illumination intensity, moisture, space and obstacles are preset in the region M. The y trees in the region M are processed in parallel, i.e. the execution flow of the object generation method is executed in parallel to obtain the growth animations corresponding to the y trees. The flow of the object generation method in one or more embodiments of the application is described below taking any one tree as an example, with the crotch number of each tree being 2:
step 202: and obtaining a node tree corresponding to the target object.
The node tree corresponding to tree 1 is obtained from the designated location where node trees are stored. The node tree comprises a root node A, the two child nodes B and C of the root node A, and the two child nodes D and E of node B. The x indexes (the index identifiers of nodes A, B, C, D, E) corresponding to nodes A, B, C, D, E are 0, 1, 2, 3, 4, and the buffers corresponding to the node tree record the parent-child relationship of each node. Referring specifically to fig. 2b, fig. 2b shows a node tree diagram in an object generation method provided in an embodiment of the present application:
the parent node of the A node is empty, the child nodes are the B node and the C node, and the corresponding buffer record is p= -1, c= {1,2}, wherein, -1 represents empty; similarly, the buffer record corresponding to the node B is p=0, c= {3,4}, the buffer record corresponding to the node C is p=0, c= { -1, -1}, the buffer record corresponding to the node D is p=1, c= { -1, -1}, and the buffer record corresponding to the node E is p=1, c= { -1, -1}.
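The buffer records above can be pictured as parallel arrays indexed by x; the snippet below is a simplified illustration of that layout for nodes A–E (indexes 0–4), not the engine's actual storage.

```python
# One entry per node, indexed by x (0=A, 1=B, 2=C, 3=D, 4=E); -1 means "empty".
parent_buffer = [-1, 0, 0, 1, 1]                                   # p: parent node index
child_buffer = [[1, 2], [3, 4], [-1, -1], [-1, -1], [-1, -1]]      # c: up to 2 forks in this example

def has_empty_child_slot(x: int) -> bool:
    """A node can still grow a crotch if any of its child slots is unused."""
    return -1 in child_buffer[x]

assert not has_empty_child_slot(0)   # A is full (children B and C)
assert has_empty_child_slot(2)       # C can still grow
```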
Step 204: at least one node to be grown is searched from the node tree.
The growth time interval is 0.25 seconds, and 0.25 seconds corresponds to 3 frames. Accordingly, 3 nodes whose child slots are not full are randomly determined from the node tree: the node marked in the 1st frame is node D, the node marked in the 2nd frame is node E, and the node marked in the 3rd frame is node C.
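Read this way (3 frames per 0.25-second interval, one not-yet-full node picked per frame), the selection step can be sketched as below; treating the picks within one interval as distinct is an assumption made to match the D, E, C example.

```python
import random

def pick_candidates(child_buffer, frames_per_interval=3, seed=1):
    """Pick one node with an empty child slot per frame within one growth time interval."""
    rng = random.Random(seed)
    not_full = [x for x, slots in enumerate(child_buffer) if -1 in slots]
    return rng.sample(not_full, min(frames_per_interval, len(not_full)))

# With the example buffers, the candidates are some ordering of nodes D, E and C (indexes 3, 4, 2).
picked = pick_candidates([[1, 2], [3, 4], [-1, -1], [-1, -1], [-1, -1]])
```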
Step 206: and determining the growth influence factors of the nodes to be grown according to the node attribute of at least one node to be grown.
From the buffers corresponding to nodes D, E, C in fig. 2b, the growth influencing factors of nodes D, E, C in the illumination dimension, the spatial dimension and the moisture dimension are determined.
Step 208: and determining the target node to be grown in the current growth time interval and the target growth parameters of the target node based on the growth influence factors of the nodes to be grown.
The target node to be grown in the current growth time interval and the target growth parameters of the target node are determined. In this example, node C is determined as the target node in the current growth time interval, and the target growth parameters of node C indicate growing its child node F, as shown in fig. 3a.
Referring to fig. 3a, fig. 3a is a schematic flow chart of node tree updating in an object generating method according to an embodiment of the present application.
Step 302: starting;
step 304: setting the maximum weight to 0;
step 306: defining a growth node;
step 308: setting the current node as a root node;
step 310: judging whether the current node is within the current growth time interval;
if yes, go to step 312, if not, go to step 328;
step 312: judging whether an empty child node exists;
if not, go to step 314, if yes, go to step 316;
An empty child node means that the number of child nodes a node owns has not reached the preset threshold, or that the node has no child nodes at all.
Step 314: setting the current node to one of its child nodes;
summarizing steps 310 to 314 above for the node D of the 1st frame: the path from the root node A to node D is A, B, D. It is confirmed that the root node A is within the current time interval, so step 312 is executed; since the root node A has no empty child node, step 314 is executed and the current node is set to the child node B of the root node A. Node B likewise passes through the judgments of steps 310 and 312 and has no empty child node, so the current node is set to node D, the child node of node B; node D has empty child nodes, so step 316 is executed, as in the sketch below.
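A compact sketch of the descent in steps 310–314: starting from the root, keep moving to a child until a node with an empty child slot is found. The in_time_interval check is stubbed out as a placeholder for the judgment of step 310.

```python
def find_growable(start, child_buffer, in_time_interval=lambda x: True):
    """Walk down from node `start` until a node with an empty child slot is reached."""
    current = start
    while in_time_interval(current):                                 # step 310
        if -1 in child_buffer[current]:                              # step 312: empty slot found
            return current                                           # -> step 316
        current = next(c for c in child_buffer[current] if c != -1)  # step 314: descend to a child
    return None                                                      # not in the interval -> step 328

# Frame 1 of the example: starting at the root A (index 0) the walk visits A -> B -> D.
child_buffer = [[1, 2], [3, 4], [-1, -1], [-1, -1], [-1, -1]]
assert find_growable(0, child_buffer) == 3                           # node D
```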
Step 316: calculating the illumination growth sub-weight and the illumination growth sub-parameter;
according to the growth influence factors of the illumination dimension, the illumination growth sub-weight and the illumination growth sub-parameter are calculated, specifically:
The distance threshold is set to 5 meters, a hemispherical range is defined with node D as the sphere center, and 8 rays are stepped outward from node D. 5 of the rays reach a stepping distance of only 3 meters, while 3 of the rays exceed 5 meters, so the illumination growth sub-weight is determined to be 3/8. Among the 3 rays exceeding 5 meters, the 3rd ray steps infinitely far, so the direction of the 3rd ray is determined to be the illumination growth direction, and the illumination growth direction is determined to be the illumination growth sub-parameter. Referring specifically to fig. 3b, fig. 3b shows a schematic diagram of the ray stepping in the illumination dimension in an object generation method provided by an embodiment of the application, in which the 5 shorter rays are blocked by an obstacle.
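A sketch of this computation under the reading above: march each ray from the node until it hits an obstacle or runs out, count the rays whose reach exceeds the threshold, and take the direction of the farthest-reaching ray as the illumination growth direction. The is_blocked callback and the finite max_dist used to stand in for an "infinitely long" ray are assumptions.

```python
def illumination_subweight(origin, rays, is_blocked, threshold=5.0, step=0.1, max_dist=50.0):
    """rays: unit direction vectors; is_blocked(point) -> bool is a stand-in for the scene query.
    Returns (sub_weight, illumination_growth_direction)."""
    reach = []
    for d in rays:
        dist = 0.0
        while dist < max_dist:                                     # max_dist approximates "unblocked"
            p = tuple(origin[i] + d[i] * dist for i in range(3))
            if is_blocked(p):
                break
            dist += step
        reach.append(dist)
    clear = [i for i, r in enumerate(reach) if r > threshold]      # rays exceeding the 5 m threshold
    sub_weight = len(clear) / len(rays)                            # e.g. 3 of 8 rays -> 3/8
    if not clear:
        return sub_weight, None
    best = max(clear, key=lambda i: reach[i])                      # the farthest-reaching ray
    return sub_weight, rays[best]                                  # its direction is the sub-parameter
```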
Step 318: calculating space growth sub-weight and space growth sub-parameter;
according to the growth influence factors of the space dimension, calculating the space growth sub-weight and the space growth sub-parameter, specifically:
A hemispherical range is defined with node D as the sphere center, and 100 points are scattered at random within the hemispherical range. Hemispherical ranges are also defined with the other nodes in the node tree as sphere centers. The hemispherical range of node D is compared with the hemispherical ranges of the other nodes and the positions of obstacles, and points in the overlapping areas are eliminated. 60 points remain after elimination, so the space growth sub-weight is determined to be 3/5. The directions of the 60 remaining points are counted and averaged to obtain the space growth direction, which is determined to be the space growth sub-parameter. See fig. 3c for details, in which 1 point inside the hemisphere represents 10 points.
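The space sub-weight can be sketched as rejection counting: scatter points in the node's hemisphere, discard the ones that fall into other nodes' ranges or obstacles, and average the surviving directions. The sphere-overlap test below is a simplification of the hemispherical comparison described above; 60 surviving points out of 100 gives the 3/5 weight of the example.

```python
import math
import random

def spatial_subweight(center, radius, other_spheres, n_points=100, seed=7):
    """other_spheres: list of (center, radius) for the other nodes and obstacles."""
    rng = random.Random(seed)
    kept_dirs = []
    for _ in range(n_points):
        theta = rng.uniform(0.0, 2.0 * math.pi)                    # random direction in the
        z = rng.uniform(0.0, 1.0)                                  # upper hemisphere
        r = math.sqrt(1.0 - z * z)
        d = (r * math.cos(theta), r * math.sin(theta), z)
        p = tuple(center[i] + d[i] * rng.uniform(0.0, radius) for i in range(3))
        if any(math.dist(p, c) <= rr for c, rr in other_spheres):  # point overlaps another range
            continue
        kept_dirs.append(d)
    sub_weight = len(kept_dirs) / n_points                         # e.g. 60/100 -> 3/5
    if not kept_dirs:
        return sub_weight, None
    avg = tuple(sum(d[i] for d in kept_dirs) / len(kept_dirs) for i in range(3))
    norm = math.sqrt(sum(a * a for a in avg)) or 1.0
    return sub_weight, tuple(a / norm for a in avg)                # space growth direction
```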
Step 320: calculating the moisture growth sub-weight and the moisture growth sub-parameter;
according to the growth influence factor of the moisture dimension, calculating the moisture growth sub-weight and the moisture growth sub-parameter, specifically:
The logical distance from node D to the root node A is determined to be 3, and the actual distance from the root node A to node D is obtained as 150 cm, so the moisture growth sub-weight is determined to be 3 × (1/150). Since node D is a common node, its growth distance is determined to be at most 50 cm, and this growth distance is determined to be the moisture growth sub-parameter.
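A minimal sketch of this step; reading "3 × (1/150)" as the product of the logical distance and the reciprocal of the actual distance is an assumption, as is the fixed 50 cm cap for a common node.

```python
def moisture_subweight(logical_depth: int, actual_distance_cm: float, max_growth_cm: float = 50.0):
    """Returns (sub_weight, growth_distance_cm): farther from the root means less moisture."""
    sub_weight = logical_depth * (1.0 / actual_distance_cm)        # e.g. 3 * (1/150)
    return sub_weight, max_growth_cm                               # the cap is the sub-parameter

# Example values from the text: logical distance 3, 150 cm from the root, 50 cm maximum growth.
w, growth_distance = moisture_subweight(3, 150.0)
```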
Step 322: fusing the growth sub-weights and the growth sub-parameters of the three dimensions to obtain the growth weight and the growth parameter of the current node;
The growth sub-weights of the three dimensions are multiplied by the coefficients of their respective dimensions and then added to obtain the growth weight of node D. The illumination growth direction and the space growth direction are added and normalized to obtain the growth direction of node D, and the moisture growth sub-parameter is taken as the growth length of node D; the growth direction and the growth length are collectively called the growth parameters.
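Putting the three dimensions together, the fusion step can be sketched as below: a coefficient-weighted sum of the sub-weights, a normalized sum of the illumination and space directions, and the moisture sub-parameter as the growth length. The coefficient values are free parameters of the scheme.

```python
import math

def fuse(light, space, moisture, coeffs=(1.0, 1.0, 1.0)):
    """light/space: (sub_weight, direction); moisture: (sub_weight, growth_length)."""
    (lw, ldir), (sw, sdir), (mw, mlen) = light, space, moisture
    growth_weight = coeffs[0] * lw + coeffs[1] * sw + coeffs[2] * mw   # weighted sum of sub-weights
    summed = tuple(a + b for a, b in zip(ldir, sdir))                  # add the two directions
    norm = math.sqrt(sum(a * a for a in summed)) or 1.0
    growth_direction = tuple(a / norm for a in summed)                 # then normalize
    return growth_weight, {"direction": growth_direction, "length": mlen}

# Example with the sub-results computed above for node D:
weight, params = fuse((3 / 8, (0.0, 0.0, 1.0)), (3 / 5, (1.0, 0.0, 0.0)), (3 / 150, 50.0))
```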
Step 324: judging whether the growth weight is larger than the maximum weight;
if yes, go to step 326, if not, go to step 310;
the initial maximum weight is 0, and the growth weight of the node D is determined to be greater than the maximum weight.
Step 326: setting the maximum weight as a growth weight, and setting a growth node as a current node;
setting the growth weight of the node D as the maximum weight, and setting the node D as a growth node;
for the 2nd frame, execution returns to step 310, and this repeats until the current node is no longer within the current growth time interval. Through the judgments of the 1st, 2nd and 3rd frames it is determined that the growth weight of node C is the largest, and step 328 is executed;
step 328: the growth node becomes the parent node of the new node;
the growth node is set to node C, a corresponding child node F is grown from node C as the new node, and node C is set as the parent node of the new node F;
step 330: growing the new node as a child node of the growth node;
the new node F is a child node of the growth node;
step 332: the update buffer establishes a connection relationship between the new node and the parent node of the new node.
And establishing a parent-child connection relationship between the node C and the node F in the corresponding buffer of the node tree.
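Steps 328–332 amount to appending the new node and wiring up the parent and child index buffers; a sketch under the buffer layout used earlier (node C has index 2 in the example):

```python
def grow_child(parent_buffer, child_buffer, position_buffer, parent_x, new_position):
    """Create a new node under parent_x and record the parent-child link (step 332)."""
    new_x = len(parent_buffer)                     # next free index in the buffers
    parent_buffer.append(parent_x)                 # p of the new node
    child_buffer.append([-1, -1])                  # the new node starts with empty child slots
    position_buffer.append(new_position)
    slot = child_buffer[parent_x].index(-1)        # first empty child slot of the parent
    child_buffer[parent_x][slot] = new_x           # update the parent's c record
    return new_x

# Growing F under C (index 2) in the example tree:
parents = [-1, 0, 0, 1, 1]
children = [[1, 2], [3, 4], [-1, -1], [-1, -1], [-1, -1]]
positions = [None] * 5
f = grow_child(parents, children, positions, 2, (0.0, 0.0, 1.0))   # f == 5
```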
Step 210: and updating the node tree based on the target growth parameters to obtain an updated node tree.
Through multiple rounds of judgment based on the growth time intervals, a plurality of target growth parameters are determined. The node tree is updated based on the target growth parameters of the plurality of target nodes to obtain the updated node tree and its node attributes, where the node attributes are the buffers corresponding to the node tree.
Step 212: and obtaining a growth animation rendering parameter of the target object, wherein the animation rendering parameter comprises a single growth duration and a single growth distance.
The single growth duration obtained for tree 1 is 5 seconds, and the single growth distance is 1 logical distance.
Step 214: and rendering the target node tree according to the single growth duration and the single growth distance to generate a growth animation of the target object.
Referring to fig. 4a, fig. 4a shows a schematic diagram of a tree model in an object generation method according to an embodiment of the present application; fig. 4b is a schematic diagram illustrating a vertex shift of the tree model in an object generation method according to an embodiment of the present application; fig. 4c is a schematic diagram illustrating bending of the tree model in an object generation method according to an embodiment of the present application.
The hexagonal uncapped cylindrical model in fig. 4a is the basic model of each branch, i.e. the trunk model, and the square patch is the basic model of the leaves, i.e. the leaf model. The model mapping shown in fig. 4b is performed based on the trunk model: specifically, the vertices of the trunk model are shifted and remapped into the shape of a branch, and the bending of the branch is obtained by multiplying a random noise function by a sin(x)/x function that converges along the x axis.
Referring to fig. 4c, the animation is produced by first obtaining a center line (indicated by a dotted line); the model vertices are attached to the center line when they are initialized, and then spread outward as time passes, producing the animation effect of the branches growing thicker and thicker.
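A rough sketch of that thickening effect: a vertex starts on the center line and is pushed outward toward its target radius as time passes. The linear easing is an assumption, and an engine would normally do this in a vertex shader rather than in Python.

```python
def thickened_vertex(center, outward_dir, target_radius, t, duration):
    """Position of one trunk vertex at time t: on the center line at t=0, fully
    spread outward at t=duration, which reads as the branch growing thicker."""
    progress = min(max(t / duration, 0.0), 1.0)                    # clamp to [0, 1]
    radius = target_radius * progress
    return tuple(c + d * radius for c, d in zip(center, outward_dir))

# Example: a vertex halfway through a 5-second single growth duration
p = thickened_vertex((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), 0.2, 2.5, 5.0)
```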
Fig. 5a shows a first effect diagram of a tree generated by the object generation method according to an embodiment of the present application, fig. 5b shows a second effect diagram, fig. 5c shows a third effect diagram, and fig. 5d shows a fourth effect diagram. In fig. 5a, a shade is suspended in mid-air; the tree whose roots are under the shade grows out toward the unshaded area, the tree under the shade is shorter than the trees outside the shade, and the trees on both sides of the shade all grow toward the unshaded area.
By applying the scheme of this embodiment of the application, the growth weight and the growth parameters of each node in the corresponding node tree are calculated from the growth influence factors, the target node for growth is determined according to the growth weights, the updated node tree is obtained based on the growth parameters, and the growth animation of the target object is generated according to the updated node tree. In this way, the growth influence factors corresponding to the node tree are determined automatically according to the environmental factors in the scene, and the growth animation of the target object is generated based on those growth influence factors; in the generation scene of the object growth animation, the growth animation of the target object is generated directly from the environmental factors in the scene, which avoids the situation where the growth animation does not fit the environment and has to be modified repeatedly.
Corresponding to the method embodiment, the present application further provides an object generating device embodiment, and fig. 6 shows a schematic structural diagram of an object generating device provided in an embodiment of the present application. As shown in fig. 6, the apparatus includes:
the obtaining module 602 is configured to obtain a node tree corresponding to the target object, and determine a corresponding growth influence factor according to a node attribute of the node tree;
a calculating module 604 configured to calculate a growth weight and a growth parameter of each node in the node tree according to the growth influence factor, wherein the growth weight is used for indicating a probability of growth of the corresponding node, and the growth parameter is used for indicating growth information of the corresponding node;
the screening module 606 is configured to screen target nodes to be grown based on the growth weights of the nodes, and update the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes;
a generation module 608 is configured to generate a growth animation of the target object from the updated node tree.
Optionally, the growth influencing factor corresponds to at least one dimension, and the growth influencing factor of each dimension comprises an environmental parameter and a node parameter of the node tree, wherein the node parameter is determined by a node attribute of the node tree; a calculation module 604 further configured to calculate, for a target growth node, a growth sub-weight and a growth sub-parameter of the target growth node in each dimension according to a growth impact factor of at least one dimension, wherein the target growth node is any one of the nodes; fusing the weights of the growers of the target growing nodes in each dimension to obtain the growing weights of the target growing nodes; and fusing the growth sub-parameters of the target growth node in each dimension to obtain the growth parameters of the target growth node.
Optionally, the at least one dimension comprises an illumination dimension; a calculation module 604, further configured to define a first preset range of the target growth node, and perform stepping of the plurality of rays within the first preset range; measuring the distance of the plurality of ray steps according to the growth influence factors of the illumination dimension; and calculating the illumination growth sub-weight and the illumination growth sub-parameter of the target growth node in the illumination dimension based on the distance of the plurality of ray steps.
Optionally, the at least one dimension comprises a spatial dimension; a calculation module 604, further configured to define a second preset range of target growth nodes, and generate a plurality of spatial detection points within the second preset range; counting overlapped conditions of a plurality of space detection points according to the growth influence factors of the space dimension; based on the overlapped condition of the plurality of spatial detection points, the spatial growth sub-weight and the spatial growth sub-parameter of the target growth node in the spatial dimension are calculated.
Optionally, the at least one dimension comprises a nutrient dimension; a calculation module 604 further configured to calculate a distance of the target growth node from a root node in the node tree based on the growth influencing factor of the nutrient dimension; based on the distance, a nutrient growth sub-weight and a nutrient growth sub-parameter of the target growth node in the nutrient dimension are calculated.
Optionally, the obtaining module 602 is further configured to find at least one node to be grown from the node tree; and determining the growth influence factors of the nodes to be grown according to the node attribute of at least one node to be grown.
Optionally, the screening module 606 is further configured to compare the growth weights of the nodes and determine a target growth weight from the growth weights; and determining the node corresponding to the target growth weight as a target node.
Optionally, the screening module 606 is further configured to determine whether the target growth parameter indicates an added node; if yes, constructing child nodes of the target node in the node tree according to the target growth parameters, and obtaining an updated node tree.
Optionally, the object generating apparatus further includes a return execution module configured to return to execute the step of determining the corresponding growth influence factor according to the node attribute of the node tree until the execution stop condition is reached, and obtain the final updated target node tree.
Optionally, the generating module 608 is further configured to obtain a growth animation rendering parameter of the target object, where the animation rendering parameter includes a single growth duration and a single growth distance; and rendering the target node tree according to the single growth duration and the single growth distance to generate a growth animation of the target object.
By applying the scheme of this embodiment of the application, the growth weight and the growth parameters of each node in the corresponding node tree are calculated from the growth influence factors, the target node for growth is determined according to the growth weights, the updated node tree is obtained based on the growth parameters, and the growth animation of the target object is generated according to the updated node tree. In this way, the growth influence factors corresponding to the node tree are determined automatically according to the environmental factors in the scene, and the growth animation of the target object is generated based on those growth influence factors; in the generation scene of the object growth animation, the growth animation of the target object is generated directly from the environmental factors in the scene, which avoids the situation where the growth animation does not fit the environment and has to be modified repeatedly.
The above is a schematic scheme of an object generating apparatus of the present embodiment. It should be noted that, the technical solution of the object generating apparatus and the technical solution of the object generating method belong to the same concept, and details of the technical solution of the object generating apparatus, which are not described in detail, can be referred to the description of the technical solution of the object generating method.
FIG. 7 illustrates a block diagram of a computing device provided in an embodiment of the present application. The components of computing device 700 include, but are not limited to, memory 710 and processor 720. Processor 720 is coupled to memory 710 via bus 730, and database 750 is used to store data.
Computing device 700 also includes access device 740, access device 740 enabling computing device 700 to communicate via one or more networks 760. Examples of such networks include public switched telephone networks (PSTN, public Switched Telephone Network), local area networks (LAN, local Area Network), wide area networks (WAN, wide Area Network), personal area networks (PAN, personal Area Network), or combinations of communication networks such as the internet. The access device 740 may include one or more of any type of network interface, wired or wireless, such as a network interface card (NIC, network Interface Controller), such as an IEEE802.11 wireless local area network (WLAN, wireless Local Area Network) wireless interface, a worldwide interoperability for microwave access (Wi-MAX, worldwide Interoperability for Microwave Access) interface, an ethernet interface, a universal serial bus (USB, universal Serial Bus) interface, a cellular network interface, a bluetooth interface, a near field communication (NFC, near Field Communication) interface, and so forth.
In one embodiment of the present application, the above-described components of computing device 700, as well as other components not shown in FIG. 7, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device illustrated in FIG. 7 is for exemplary purposes only and is not intended to limit the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 700 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or personal computer (PC, personal Computer). Computing device 700 may also be a mobile or stationary server.
Wherein the processor 720 performs the steps of the object generation method when executing the computer instructions.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that, the technical solution of the computing device and the technical solution of the object generating method belong to the same concept, and details of the technical solution of the computing device, which are not described in detail, can be referred to the description of the technical solution of the object generating method.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the steps of the object generation method as described above.
The above is an exemplary version of a computer-readable storage medium of the present embodiment. It should be noted that, the technical solution of the storage medium and the technical solution of the object generating method belong to the same concept, and details of the technical solution of the storage medium, which are not described in detail, can be referred to the description of the technical solution of the object generating method.
The foregoing describes specific embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code that may be in source code form, object code form, executable file or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium can be increased or decreased appropriately according to the requirements of the patent practice, for example, in some areas, according to the patent practice, the computer readable medium does not include an electric carrier signal and a telecommunication signal.
It should be noted that, for the sake of simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily all necessary for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The above-disclosed preferred embodiments of the present application are provided only as an aid to the elucidation of the present application. Alternative embodiments are not intended to be exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the teaching of this application. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention. This application is to be limited only by the claims and the full scope and equivalents thereof.

Claims (13)

1. An object generation method, comprising:
acquiring a node tree corresponding to a target object, and determining a corresponding growth influence factor according to node attributes of the node tree;
calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node;
screening target nodes to be grown based on the growth weights of the nodes, and updating the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes;
and generating a growth animation of the target object according to the updated node tree.
2. The method of claim 1, wherein the growth influencing factor corresponds to at least one dimension, the growth influencing factor for each dimension comprising an environmental parameter and a node parameter of the node tree, the node parameter being determined by a node attribute of the node tree;
and calculating the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the method comprises the following steps:
Aiming at a target growth node, calculating a growth sub-weight and a growth sub-parameter of the target growth node in each dimension according to the growth influence factor of at least one dimension, wherein the target growth node is any one of the nodes;
fusing the growth sub-weights of the target growth nodes in each dimension to obtain the growth weights of the target growth nodes;
and fusing the growth sub-parameters of the target growth node in each dimension to obtain the growth parameters of the target growth node.
3. The method of claim 2, wherein the at least one dimension comprises an illumination dimension;
the calculating the growth sub-weight and the growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises the following steps:
defining a first preset range of the target growth node, and performing stepping of a plurality of rays within the first preset range;
measuring the distance of the plurality of ray steps according to the growth influence factor of the illumination dimension;
and calculating the light growth sub-weight and the light growth sub-parameter of the target growth node in the light dimension based on the distance of the plurality of light stepping.
4. The method of claim 2, wherein the at least one dimension comprises a spatial dimension;
the calculating the growth sub-weight and the growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises the following steps:
defining a second preset range of the target growth node, and generating a plurality of space detection points in the second preset range;
counting overlapped conditions of the plurality of space detection points according to the growth influence factors of the space dimension;
based on the overlapped condition of the plurality of spatial detection points, calculating a spatial growth sub-weight and a spatial growth sub-parameter of the target growth node in the spatial dimension.
5. The method of claim 2, wherein the at least one dimension comprises a nutrient dimension;
the calculating the growth sub-weight and the growth sub-parameter of the target growth node in each dimension according to the growth influence factor of the at least one dimension comprises the following steps:
calculating the distance between the target growth node and a root node in the node tree according to the growth influence factors of the nutrient dimensions;
and calculating the nutrient growth sub-weight and the nutrient growth sub-parameter of the target growth node in the nutrient dimension based on the distance.
6. The method of claim 1, wherein said determining a corresponding growth influencing factor from node attributes of said node tree comprises:
searching at least one node to be grown from the node tree;
and determining the growth influence factors of the nodes to be grown according to the node attribute of the at least one node to be grown.
7. The method of claim 1, wherein the screening the target nodes to be grown based on the growth weights of the nodes comprises:
comparing the growth weights of the nodes, and determining target growth weights from the growth weights;
and determining the node corresponding to the target growth weight as a target node.
8. The method of claim 1, wherein updating the node tree according to the target growth parameter obtains an updated node tree, comprising:
determining whether the target growth parameter indicates an increased node;
if yes, constructing child nodes of the target node in the node tree according to the target growth parameters, and obtaining the updated node tree.
9. The method of claim 1, wherein after updating the node tree according to the target growth parameter to obtain an updated node tree, further comprising:
And returning to and executing the step of determining the corresponding growth influence factor according to the node attributes of the node tree until the execution stop condition is reached, and obtaining the final updated target node tree.
10. The method of claim 9, wherein generating a growth animation of the target object from the updated node tree comprises:
obtaining a growth animation rendering parameter of the target object, wherein the animation rendering parameter comprises a single growth duration and a single growth distance;
and rendering the target node tree according to the single growth duration and the single growth distance to generate a growth animation of the target object.
11. An object generating apparatus, comprising:
the acquisition module is configured to acquire a node tree corresponding to the target object, and determine a corresponding growth influence factor according to the node attribute of the node tree;
the calculation module is configured to calculate the growth weight and the growth parameter of each node in the node tree according to the growth influence factors, wherein the growth weight is used for indicating the growth probability of the corresponding node, and the growth parameter is used for indicating the growth information of the corresponding node;
The screening module is configured to screen target nodes to be grown based on the growth weights of the nodes, and update the node tree according to target growth parameters to obtain an updated node tree, wherein the target growth parameters are growth parameters corresponding to the target nodes;
and the generation module is configured to generate a growth animation of the target object according to the updated node tree.
12. A computing device, comprising:
a memory, a processor;
the memory is configured to store computer executable instructions and the processor is configured to execute the computer executable instructions to implement the steps of the method of any one of claims 1-10.
13. A computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1-10.

