CN115761095A - Leaf model generation method and device, storage medium and electronic equipment

Publication number: CN115761095A
Application number: CN202211484780.2A
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventor: 杜念航
Applicant and current assignee: Netease Hangzhou Network Co Ltd
Prior art keywords: model, leaf, normal, vertex, leaf model
Classification: Image Generation
Abstract

The present disclosure relates to the field of image processing technologies, and in particular to a leaf model generation method, a leaf model generation apparatus, a storage medium, and an electronic device. The leaf model generation method includes: configuring a simple model of a leaf model, the simple model being composed of at least one unit model with a target shape; correcting the normal of each vertex in the unit model based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model; scattering points over the simple model to obtain a point cloud and assigning an insert sheet model to the point cloud to form a middle model of the leaf model; and transferring the first normal of the leaf model to target vertices of the insert sheet models in the middle model, to obtain a fine model of the leaf model. The leaf model generation method provided by the disclosure can alleviate the strong insert-sheet feel of the leaf model (the impression that the crown is built from flat cards) and enhance the rendered appearance of the tree crown.

Description

Leaf model generation method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a leaf model generation method, a leaf model generation apparatus, a storage medium, and an electronic device.
Background
With the popularity of stylized open-world games such as The Legend of Zelda and Genshin Impact, how to rapidly produce stylized trees for game scenes in batches has become a problem that needs to be solved.
There are many production pipelines for stylized trees on the market, most of them customized to project requirements. A hand-made art workflow can reproduce the 2D concept design to a large extent, but it consumes a great deal of time and labor; trees produced with the usual procedural-modeling approach, on the other hand, only hold up when viewed from a distance, and up close the model looks crude or the leaf model has a strong insert-sheet feel.
It is noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure and therefore may include information that does not constitute prior art that is already known to a person of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a leaf model generation method, a leaf model generation apparatus, a storage medium, and an electronic device, which are intended to alleviate the strong insert-sheet feel of leaf models and to enhance the rendered appearance of the tree crown.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the embodiments of the present disclosure, a leaf model generation method is provided, which includes: configuring a simple model of a leaf model, wherein the simple model is composed of at least one unit model with a target shape; correcting the normal of each vertex in the unit model based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model; scattering points over the simple model to obtain a point cloud, and assigning an insert sheet model to the point cloud to form a middle model of the leaf model; and transferring the first normal of the leaf model to target vertices of the insert sheet models in the middle model of the leaf model, to obtain a fine model of the leaf model.
According to a second aspect of the embodiments of the present disclosure, there is provided a leaf model generation apparatus including: a configuration module, configured to configure a simple model of a leaf model, wherein the simple model is composed of at least one unit model with a target shape; a normal correction module, configured to correct the normal of each vertex in the unit model based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model; a point scattering module, configured to scatter points over the simple model to obtain a point cloud and assign an insert sheet model to the point cloud to form a middle model of the leaf model; and a model correction module, configured to transfer the first normal of the leaf model to target vertices of the insert sheet models in the middle model of the leaf model, to obtain a fine model of the leaf model.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a leaf model generation method as in the above embodiments.
According to a fourth aspect of an embodiment of the present disclosure, there is provided an electronic apparatus, including: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the leaf model generation method as in the above embodiments.
Exemplary embodiments of the present disclosure may have some or all of the following advantages:
in the technical solutions provided by some embodiments of the present disclosure, after the simple model of the leaf model is configured, on the one hand the normals of each unit model in the simple model are corrected based on the bottom-face center point of the minimum bounding box corresponding to that unit model to obtain the first normal of the leaf model, and the first normal is then used to correct the insert sheet models; as a result, the lighting of the foliage shows a light-to-dark trend from top to bottom, the lighting of the crown looks more realistic, a strong insert-sheet feel is avoided, and the rendered appearance of the crown is enhanced. On the other hand, starting from the configured simple model, the fine model of the leaf model is finally obtained by scattering points, placing insert sheets and correcting normals; compared with manual art production, this saves manual production cost and improves model generation efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort. In the drawings:
FIG. 1 schematically illustrates a flow diagram of a method for generating a tree leaf model in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a schematic diagram of a tree leaf model in an exemplary embodiment of the present disclosure;
FIG. 3 is a diagram schematically illustrating a minimum bounding box corresponding to a unit model in an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram illustrating vertex normals of a unit model in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a conical insert sheet in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a cross-shaped insert sheet in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a schematic diagram of a voxel model in an exemplary embodiment of the disclosure;
FIG. 8 is a schematic diagram that schematically illustrates a voxel model spotting result in an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic illustrating a point cloud before randomization in an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a schematic diagram after a stochastic process of a point cloud in an exemplary embodiment of the disclosure;
FIG. 11 schematically illustrates a diagram of the insert sheet models after adjustment in an exemplary embodiment of the disclosure;
FIG. 12 schematically illustrates a schematic diagram after normal transfer in an exemplary embodiment of the disclosure;
FIG. 13 is a schematic diagram illustrating a flow chart of a leaf model generation process in an exemplary embodiment of the present disclosure;
FIG. 14 schematically illustrates a first lighting effect diagram of a model in an exemplary embodiment of the disclosure;
FIG. 15 is a diagram schematically illustrating a second lighting effect of a model in an exemplary embodiment of the present disclosure;
FIG. 16 schematically illustrates a schematic view of a transition curve in an exemplary embodiment of the disclosure;
FIG. 17 is a schematic diagram illustrating a vertical gradient rendering effect of a model in an exemplary embodiment of the present disclosure;
FIG. 18 is a schematic illustration of an interface for calculating a vertically graded material in an exemplary embodiment of the disclosure;
FIG. 19 schematically illustrates a third lighting effect diagram of a model in an exemplary embodiment of the disclosure;
FIG. 20 schematically illustrates a fourth lighting effect diagram of a model in an exemplary embodiment of the disclosure;
FIG. 21 is a schematic diagram illustrating a distance field map in an exemplary embodiment of the disclosure;
FIG. 22 is a schematic diagram illustrating a vertical gradient rendering effect of a model in an exemplary embodiment of the present disclosure;
FIG. 23 is a schematic illustration of an interface for calculating the edge light material in an exemplary embodiment of the disclosure;
FIG. 24 is a diagram schematically illustrating the rendering effect of a leaf model according to the prior art;
FIG. 25 is a diagram illustrating an effect rendered by a leaf model in an exemplary embodiment of the present disclosure;
fig. 26 schematically illustrates a composition diagram of a tree leaf model generation apparatus in an exemplary embodiment of the present disclosure;
FIG. 27 schematically illustrates a schematic diagram of a computer-readable storage medium in an exemplary embodiment of the disclosure;
fig. 28 schematically shows a structural diagram of a computer system of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
With the popularity of stylized open-world games such as The Legend of Zelda and Genshin Impact, how to rapidly produce stylized trees for game scenes in batches has become a problem that needs to be solved. There are many production pipelines for stylized trees on the market, most of them customized to project requirements, and they can roughly be divided into two approaches.
One approach pursues highly customized, stylized hand-made models, in which the leaf cards are placed, or the foliage sculpted, entirely by artists. Although such a hand-made art workflow can reproduce the 2D concept design to a large extent, it consumes a great deal of time and labor.
The other approach builds an automatic modeling pipeline that meets the project requirements using procedural modeling tools, commonly the Geometry Nodes of Blender or the procedural tools of Houdini. Trees produced with the usual procedural-modeling approach, however, only hold up from a distance; viewed up close the model looks crude or the leaf model has a strong insert-sheet feel, for example the distribution of the leaf patches lacks a clustered feel and the characteristic stylized color blocks lack a sense of volume.
Therefore, in view of the defects of the prior art, the present disclosure provides a leaf model generation method that can alleviate the strong insert-sheet feel of the leaf model, enhance the blocky color feel of the tree crown, automatically generate the crown model of a tree, and preserve the artist's modeling design as much as possible.
Implementation details of the technical solution of the embodiments of the present disclosure are set forth in detail below.
Fig. 1 schematically illustrates a flowchart of a tree leaf model generation method in an exemplary embodiment of the present disclosure. As shown in fig. 1, the leaf model generation method includes steps S101 to S104:
step S101, configuring a simple model of a leaf model; wherein the simple model is composed of at least one unit model with a target shape;
step S102, correcting the normal of each vertex in the unit model based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model;
step S103, scattering points over the simple model to obtain a point cloud, and assigning an insert sheet model to the point cloud to form a middle model of the leaf model; and
step S104, transferring the first normal of the leaf model to target vertices of the insert sheet models in the middle model of the leaf model, to obtain a fine model of the leaf model.
In the technical solutions provided by some embodiments of the disclosure, after the simple model of the leaf model is configured, on the one hand the normals of each unit model in the simple model are corrected based on the bottom-face center point of the minimum bounding box corresponding to that unit model to obtain the first normal of the leaf model, and the first normal is then used to correct the insert sheet models, so that the lighting of the foliage shows a light-to-dark trend from top to bottom, the lighting of the crown looks more realistic, a strong insert-sheet feel is avoided, and the rendered appearance of the crown is enhanced; on the other hand, starting from the configured simple model, the fine model of the leaf model is finally obtained by scattering points, placing insert sheets and correcting normals, which, compared with manual art production, saves manual production cost and improves model generation efficiency.
Before the detailed description, the concepts used in the model-making process are briefly introduced. Depending on precision, a model can be divided into a simple model, a middle model and a fine model. The simple model has low precision and only presents an approximate shape; the middle model is more precise than the simple model but less precise than the fine model; and the fine model is the model with the highest precision.
It should be noted that the precision of the simple model, the middle model and the fine model is relative, that is, in terms of precision, fine model > middle model > simple model. In the leaf model generation method provided by the present disclosure, the simple model of the leaf model is made first and presents a preliminary shape of the leaf model; the middle model of the leaf model, namely the result of placing the insert sheet models, is then made on this basis; and the fine model of the leaf model, namely the result after normal transfer, is finally made on that basis. During the making of the leaf model, the precision is thus gradually improved from the simple model to the middle model to the fine model.
It should also be noted that the leaf model generation method provided by the present application can be applied to the leaf part of a vegetation model, and a complete vegetation model can be obtained by combining it with an existing trunk model generation method. The method can also be applied to other virtual-resource models that need to present different light and shade according to the lighting.
Each step of the leaf model generation method in the present exemplary embodiment will be described in more detail with reference to the drawings and the embodiments.
In step S101, a simple model of the leaf model is configured; wherein the simple model is composed of at least one unit model having a target shape.
In one embodiment of the present disclosure, a game artist quickly sculpts the silhouette of the crown of the leaf model in modeling software to obtain the simple model. For example, software such as ZBrush or Maya may be used.
The simple model is composed of at least one unit model with a target shape and represents the preliminary appearance of the leaf model. A unit model is a constituent unit of the simple model with a certain shape, and the vertices of the model surface belong to the unit models. With reference to the shape of a real tree crown, the target shape may be an ellipse; according to actual requirements, the simple model may also be configured to be composed of unit models of other preset shapes, such as irregular polyhedrons.
FIG. 2 schematically illustrates a leaf model in an exemplary embodiment of the present disclosure. Referring to FIG. 2, the silhouette of the crown is formed by elliptical models; the elliptical models are the unit models that make up the simple model, and together the plurality of unit models form the simple model of the leaf model shown in FIG. 2.
When configuring the simple model of the leaf model, the individual unit models can also be grouped and named as needed, so that the unit models can conveniently be processed in sequence.
In step S102, the normal of each vertex in the unit model is corrected based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain the first normal of the leaf model.
Specifically, in the production of stylized trees the crown is usually given spherical normals, that is, normals perpendicular to the surface of an ellipsoid at each vertex; however, vegetation produced this way looks like a plain ball under lighting and generally appears fake. Based on observation of real scenes and of paintings, the foliage as a whole should show a light-to-dark trend from top to bottom because of the height of the sun. The present application therefore processes the normals of each unit model in the simple model in turn, so that the lighting trend of the final model matches reality.
In an embodiment of the present disclosure, the normal correction in step S102 specifically includes the following steps:
step one, for a unit model, determining the bottom-face center point of the minimum bounding box corresponding to the unit model;
step two, offsetting the bottom-face center point upward by a first preset distance along the vertical axis to obtain the offset coordinate of the bottom-face center point;
step three, calculating the corrected normal of each vertex according to the coordinates of that vertex in the unit model and the offset coordinate of the bottom-face center point; and
step four, traversing all the unit models in the simple model and repeating the above normal correction process to obtain the first normal of the leaf model.
The above steps are explained in detail below.
In step one, the minimum bounding box, i.e., the smallest cuboid that completely encloses the unit model, is determined according to the shape of the unit model.
FIG. 3 is a diagram schematically illustrating the minimum bounding box corresponding to a unit model in an exemplary embodiment of the present disclosure. Taking an elliptical unit model as an example, its minimum bounding box is determined as shown in FIG. 3.
In step two, the bottom face of the minimum bounding box is a rectangle; its center point C is taken as the bottom-face center point, and C is shifted by the first preset distance along the vertical axis, i.e., straight upward, to obtain C'. The coordinate of C' is recorded as the offset coordinate of the bottom-face center point. The first preset distance may be configured according to requirements.
FIG. 4 is a schematic diagram illustrating vertex normals of a unit model in an exemplary embodiment of the disclosure. Referring to FIG. 4, the model surface of the unit model includes a plurality of vertices, each vertex with its own normal.
In step three, the normal of each vertex of the unit model surface is corrected according to the coordinates of the vertex and the offset coordinate of the bottom-face center point. Taking one vertex P of the unit model as an example, the offset coordinate C' of the bottom-face center point is subtracted from the coordinate of P to obtain the corrected normal direction at P, as shown in FIG. 3.
the process from step one to step three can be completed by using Houdini, and the Houdini algorithm is as follows:
vector center=getbbox_center(1);
vector min=getbbox_min(1);
center.y=min.y+chf(“offset”)*100;
v@N=normalize(@P-center);
the normal of one vertex in the unit model is corrected, and the steps are repeated to correct the normal of all the vertices in the unit model.
In the fourth step, sequentially traversing each unit model in the simplified model, repeating the above-mentioned determining of the minimum bounding box and the normal correction of the vertex, and finally obtaining a new corrected normal direction, which is recorded as a first normal.
In step S103, the simple model is scattered to obtain a point cloud, and an insert model is assigned to the point cloud to form a middle model of the leaf model.
Specifically, the automatic generation of the crown and the generation of the middle model of the leaf are generally performed through two steps of scattering and inserting. Firstly, scattering points of the simple model to generate a pile of point clouds for marking the positions of single insert sheet models, and then placing the insert sheet models on the coordinates of the point clouds to obtain a crown combined by a plurality of insert sheet models.
Wherein, still need to utilize the inserted sheet mode to make the inserted sheet model. Common tab models are for example in the form of tapered tabs, cross tabs, etc. Fig. 5 schematically illustrates a schematic view of a tapered insert in an exemplary embodiment of the present disclosure, each insert model having a tapered shape as shown in fig. 5. Fig. 6 schematically illustrates a schematic diagram of a cross-shaped insert sheet in an exemplary embodiment of the disclosure, as shown in fig. 6, each insert sheet model is composed of three planes which are crossed, and the side surface is in a cross shape.
In an embodiment of the present disclosure, scattering points over the simple model to obtain a point cloud in step S103 includes: converting the simple model into a voxel model according to a preset voxel size; and scattering points within the voxel model to obtain the point cloud.
Common procedural vegetation generation usually scatters points only on the surface of a sphere. The advantage is that the number of points is kept low; the more points are scattered, the higher the face count of the model and the greater the impact on in-game performance. The disadvantage, however, is also obvious: placing insert sheet models only at the surface vertices of the simple model leads, in the actual rendering, to parts of the crown looking hollow or to an obvious insert-sheet feel.
Therefore, to overcome these defects, the leaf model generation method provided by the present application voxelizes the simple model, and scatters points after the simple model has been converted into a voxel model.
FIG. 7 schematically illustrates a voxel model in an exemplary embodiment of the present disclosure. Specifically, an appropriate voxel size is set in advance, and the simple model is converted according to that voxel size into a voxel model stored in the VDB format, which carries direction and distance information, as shown in FIG. 7. VDB is a general-purpose data structure for efficiently storing volume data; it stores a signed distance volume near the surface of the model.
FIG. 8 schematically illustrates the result of scattering points in the voxel model in an exemplary embodiment of the present disclosure. Points are scattered within the voxelized model to obtain the point cloud shown in FIG. 8, which comprises a plurality of points.
In an embodiment of the present disclosure, after the points are scattered over the simple model to obtain the point cloud, the method further includes: performing a first randomization on the size parameter, carried by each point in the point cloud, that controls the insert sheet model the point can bear; and/or performing a second randomization on the normal of each point in the point cloud, so as to update the point cloud.
Specifically, in order to ensure that the insert sheet models are placed on the point cloud in different orientations, the parameters of the point cloud can be randomly adjusted.
The randomization can proceed in two ways: randomizing the size parameter at each point in the point cloud that controls the insert sheet model it bears, and randomizing the normal of each point in the point cloud. Random numbers can be drawn within a preset interval and assigned to the points, thereby modifying the point-cloud parameters.
Each point in the point cloud will subsequently carry one insert sheet model. The insert sheet models can have different sizes; the size parameter borne by a point determines the size of the insert sheet model placed at that point, while the normal of a point determines the orientation of the insert sheet model placed there.
Of course, to increase the randomness of the generated leaf model, either of the two randomization methods may be used alone or the two may be combined as needed; the disclosure is not limited in this respect. A point-wrangle sketch of the randomization is given below.
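As an illustration only, the two randomizations described above can be implemented in a single Houdini point wrangle running over the scattered points. The channel names (scale_min, scale_max, normal_jitter), the use of the pscale attribute, and the assumption that the points already carry an N attribute are choices made for this sketch rather than details fixed by the disclosure:
// Point wrangle over the scattered points (sketch).
// First randomization: a per-point size value, written to pscale so that the later copy step scales each insert sheet model accordingly.
f@pscale = fit(rand(@ptnum * 7919), 0, 1, chf("scale_min"), chf("scale_max"));
// Second randomization: jitter the point normal within a preset interval so the insert sheet models face different directions.
vector r = set(rand(@ptnum * 3), rand(@ptnum * 3 + 1), rand(@ptnum * 3 + 2));
v@N = normalize(v@N + (r - {0.5, 0.5, 0.5}) * 2.0 * chf("normal_jitter"));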
FIG. 9 schematically illustrates the point cloud before randomization in an exemplary embodiment of the present disclosure, and FIG. 10 schematically illustrates the point cloud after randomization in an exemplary embodiment of the present disclosure, where the first randomization and the second randomization are combined. Comparing FIG. 9 and FIG. 10, it can be seen that the placement of the leaves in FIG. 10 is more random, which further improves the realism of the final leaf model.
In one embodiment of the disclosure, after the insert sheet models are assigned to the point cloud to compose the middle model of the leaf model, the method further comprises: translating each insert sheet model in the middle model by a second preset distance along the vertical axis to obtain a first position of each insert sheet model; and correcting the rotation of each insert sheet model at its first position, so as to update the middle model of the leaf model.
Specifically, when cross-shaped insert sheet models are used, assigning them directly to the point cloud can produce the situation marked 1001 in FIG. 10, where a cross face of an insert sheet model points straight outward. In the actual rendering, such an arrangement makes that part look conspicuously flat and destroys the overall clustered feel of the model.
Therefore, in order to remedy this defect, the insert sheet models are first offset once along the vertical axis, and the rotation of each insert sheet model is then corrected.
This process can again be accomplished in Houdini with the following VEX.
Offset each insert sheet model once along the vertical axis:
v@tangent = cross(@N, {1, 0, 1});   // tangent vector used below to build the rotation frame
@N.y *= 0.5;                        // damp the vertical component of the normal so the insert sheets lean away from pointing straight up
Correct the rotation of each insert sheet model:
vector y = normalize(@N);
vector x = cross(v@tangent, y);
vector z = cross(x, y);
float r = fit(rand(@ptnum * 23985), 0, 1, -1, 1);   // per-point random value in [-1, 1]
matrix3 m = set(x, y, z);                           // local frame of the insert sheet
rotate(m, radians(chf("rotate_X") * r), x);
rotate(m, radians(chf("rotate_y") * r), y);
rotate(m, radians(chf("rotate_z") * r), z);
@orient = quaternion(m);                            // orientation read by the copy step
The new positions of the insert sheet models are thus obtained. In the leaf model composed at this point, the insert sheets are almost never perpendicular to the surface of the simple model, which reduces the flat-patch feel of the leaf model and gives a better rendered appearance.
FIG. 11 schematically illustrates the insert sheet models after adjustment in an exemplary embodiment of the disclosure. Comparing FIG. 10 and FIG. 11, it can be seen that after the update the insert sheets no longer stick out perpendicular to the surface of the simple model, and the result looks much softer.
In step S104, the first normal of the leaf model is transferred to the target vertices of the insert sheet models in the middle model of the leaf model, to obtain the fine model of the leaf model.
In an embodiment of the disclosure, the first normal, which has been adjusted according to the lighting trend of vegetation, is transferred to each insert sheet model; that is, the normals of the insert sheet models are replaced with the transferred first normal and then used in subsequent model rendering, so that the rendered model shows the light-to-dark trend from top to bottom.
The specific normal-transfer process in step S104 is as follows: for the first normal at a vertex of a unit model of the leaf model, take the straight line along which that first normal lies and determine the vertex of the insert sheet model intersected by this line as the target vertex to receive the transfer; replace the normal of the target vertex with the first normal; and traverse all vertices of all unit models of the leaf model to complete the normal transfer.
Specifically, the simple model of the leaf model contains the vertices of the unit models, and the middle model of the leaf model contains the vertices of the insert sheet models. For the first normal of each vertex in a unit model, the straight line along which that first normal lies is used as a reference to find the insert sheet model vertex it intersects, and the normal of that insert sheet vertex is replaced with the first normal of the unit model vertex; all vertices of all unit models of the leaf model are traversed, and the normal transfer is performed in the same way for each of them. A wrangle sketch of an equivalent transfer follows this paragraph.
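As an illustration only, a very similar visual result is often obtained in Houdini with a nearest-point lookup instead of the explicit ray intersection described above: a point wrangle runs over the middle model (the insert sheet geometry) with the corrected simple model wired into its second input, and each insert sheet vertex copies the first normal of the closest simple-model point. This nearest-point variant is an assumption of this sketch, not the literal procedure of the disclosure:
// Point wrangle over the middle model (insert sheet geometry, first input).
// Second input: the simple model whose point normals hold the first normal.
int src = nearpoint(1, @P);             // closest simple-model point to this insert sheet vertex
vector firstN = point(1, "N", src);     // its corrected (first) normal
v@N = normalize(firstN);                // replace the insert sheet vertex normal with it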
The fine model of the leaf model obtained after the normal transfer stores the normals corrected in step S102, so the leaf model carries the lighting trend of the vegetation; rendering the fine model later therefore preserves that lighting trend and makes the illumination of the tree match reality.
FIG. 12 schematically illustrates the result after the normal transfer in an exemplary embodiment of the disclosure; referring to FIG. 12, the normal of each insert sheet model in the leaf model is consistent with the first normal corrected in step S102.
With this method, only a simple model of the crown silhouette needs to be configured; a suitable tool then processes the simple model automatically to generate the fine model of the leaf model.
FIG. 13 is a schematic diagram illustrating the leaf model generation flow in an exemplary embodiment of the present disclosure. As shown in FIG. 13, the specific process is as follows:
step S1301, importing the simple model made by the artists;
step S1302, processing the normal direction of the simple model, i.e., performing the normal correction;
step S1303, converting the simple model into a VDB;
step S1304, scattering points according to the distance field of the VDB to obtain the point cloud;
step S1305, randomizing the orientation and size of the point cloud;
step S1306, making a single cross-shaped insert sheet model;
step S1307, assigning the insert sheet model to the point-cloud data; and
step S1308, transferring the processed normals of the simple model to the insert sheet models, to finally obtain the high-precision leaf model.
In an embodiment of the present disclosure, after the fine model of the leaf model is obtained, the method further includes: calculating a stylized material, the stylized material comprising a vertical gradient material and/or an edge light material; and rendering the fine model of the leaf model based on the stylized material.
Specifically, the fine model of the leaf model can be generated automatically according to steps S101 to S104, with the computation done in Houdini; the information of the generated fine model is stored in an fbx file, then exported to and read in the UE (Unreal Engine), and the final rendered leaf model is obtained by rendering the fine model with the stylized material applied to it.
The leaf model is stylized according to the vertex normals (i.e., the normals at the insert sheet model vertices of the fine model) and AO (ambient occlusion) information, so that the SSS (subsurface scattering) and illumination of the leaf model appear in clustered form, the appearance of individual leaves is weakened, and the insert-sheet feel is reduced.
The stylized material may comprise a vertical gradient material; that is, based on the vertical gradient material, the finally rendered leaf model presents a rendering effect that gradually changes from top to bottom.
In one embodiment of the present disclosure, calculating the vertical gradient material includes: taking the dot product of the normal at each insert sheet model vertex of the fine model with an upward vector to obtain a first light-receiving direction vector; offsetting the first light-receiving direction vector to obtain a gradient result, and mapping the gradient result onto a transition curve; and obtaining the vertical gradient material according to the transition curve.
To introduce the calculation principle, a regular sphere is used below; in practical application, replacing the sphere with the corresponding leaf model achieves the same technical effect.
First, the dot product of the normal at each vertex of the model (corresponding to the vertices of the insert sheet models in the fine model) with the upward vector is taken to obtain the first light-receiving direction vector oriented upward, where the upward vector may be (0, 1, 0) or, more generally, (0, n, 0). FIG. 14 schematically illustrates a first lighting effect of the model in an exemplary embodiment of the disclosure; as shown in FIG. 14, there is a distinct light-dark split between the top and the bottom of the model.
Then the first light-receiving direction vector is offset to obtain a gradient result: during the offset it can be multiplied by a preset value and then an offset added, giving a soft top-to-bottom transition. For example, the first light-receiving direction vector may be multiplied by 0.5 and then an offset value between 0 and 1, such as 0.2, added. FIG. 15 schematically illustrates a second lighting effect of the model in an exemplary embodiment of the disclosure; as shown in FIG. 15, the model now shades gradually from top to bottom.
The gradient result is then used as a numerical reference and remapped onto a custom transition curve. FIG. 16 schematically illustrates such a transition curve in an exemplary embodiment of the disclosure; as shown in FIG. 16, the curve contains several brightness levels and transitions gradually overall.
Finally, a new top-to-bottom gradient with a three-tone shading effect is obtained from the transition curve and turned into the vertical gradient material. FIG. 17 is a schematic diagram illustrating the vertical-gradient rendering effect of the model in an exemplary embodiment of the disclosure; applying the vertical gradient material to the model gives the rendering shown in FIG. 17, in which the model shades gradually from top to bottom.
FIG. 18 is a schematic illustration of an interface for calculating the vertical gradient material in an exemplary embodiment of the disclosure. The material is computed in UE5: the dot product of VertexNormalWS, the world-space vertex normal, with the vertical upward direction gives a vertically upward light-receiving value in the range of -1 to 1 for each model vertex; this value, divided by 2 with 1 added, is remapped to the range of 0.5 to 1.5, so that the light-receiving value is 0.5 higher than that of normal PBR (0 to 1), and the material finally renders brighter and shows a top-to-bottom transition.
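For illustration only, the same vertical gradient math can be written out as VEX-style shader pseudocode; the constants 0.5 and 0.2 follow the example above, and the ramp parameter transition_curve stands in for the custom transition curve of the UE5 material graph and is an assumption of this sketch:
// Per-vertex vertical gradient term, mirroring the material math described above.
vector up = {0, 1, 0};
float ndotup = dot(normalize(v@N), up);                    // first light-receiving value, range -1 to 1
float grad = clamp(ndotup * 0.5 + 0.2, 0, 1);              // soften and offset the top-to-bottom transition
f@vertical_gradient = chramp("transition_curve", grad);    // remap through the custom transition curve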
In one embodiment of the present disclosure, calculating the edge light material includes: taking the dot product of the normal at each insert sheet model vertex of the fine model with the light-source direction vector to obtain a second light-receiving direction vector; calculating an edge light value with the Fresnel formula based on the second light-receiving direction vector; and multiplying the edge light value by the distance field map of the leaf model to obtain the edge light material.
Specifically, the edge light calculation is added on top of the base-color rendering. For the edge light, a double directional-light check is performed when two directional lights are set in the scene: the light source that is currently above the horizon is identified, and only the contribution of that light source is ever collected.
Here, the double directional-light check means that when several directional lights exist in the UE, each directional light is indexed from 0. In a dynamic-lighting scene there are typically two directional lights simulating the sun and the moon appearing in the sky in turn. To determine which light source is currently above the horizon, the direction of one of the light sources can be dotted with the upward vector to obtain a value between 0 and 1: the value is close to 1 when that light source is above the horizon and close to 0 when it is below it. The directions of the two light sources are then interpolated using this value as the blend factor, which yields the direction of the light source currently above the horizon, as sketched below.
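A minimal sketch of this light-source selection, written as VEX-style pseudocode; the two light directions are assumed to be supplied as parameters, whereas in the engine they would come from the two directional lights:
// Pick the directional light that is currently above the horizon.
vector up = {0, 1, 0};
vector sun_dir = chv("light0_dir");                    // directional light with index 0 (assumed parameter)
vector moon_dir = chv("light1_dir");                   // directional light with index 1 (assumed parameter)
float sel = clamp(dot(normalize(sun_dir), up), 0, 1);  // close to 1 when light 0 is above the horizon
vector active_light_dir = normalize(lerp(moon_dir, sun_dir, sel));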
As with the vertical gradient material, the calculation principle is introduced below with a regular sphere; replacing the sphere with the corresponding leaf model in practice achieves the same technical effect.
First, the second light-receiving direction vector is obtained by taking the dot product of the normal at each vertex of the model (corresponding to the vertices of the insert sheet models in the fine model) with the light-source direction vector. FIG. 19 schematically illustrates a third lighting effect of the model in an exemplary embodiment of the disclosure; as shown in FIG. 19, it shows the lit side of the model.
Then, based on the second light-receiving direction vector, the edge light value of the lit side is calculated with the Fresnel formula: fresnel = pow(1 - max(0, dot(N, V)), _FresnelPow), where N is the model vertex normal, V is the viewing direction, and _FresnelPow is the intensity (exponent) of the Fresnel effect. FIG. 20 is a schematic diagram illustrating a fourth lighting effect of the model in an exemplary embodiment of the disclosure; as shown in FIG. 20, it shows the edge-light effect on the lit side of the model.
Then the edge light value is multiplied by the SDF (signed distance field) map to obtain the edge light material. The SDF map can control the rendering transparency of the leaves and can be produced in Photoshop by blurring the leaf mask from the edge toward the center. FIG. 21 is a schematic diagram illustrating the distance field map in an exemplary embodiment of the disclosure; as shown in FIG. 21, it is the alpha-channel map of the texture. FIG. 22 is a schematic diagram illustrating the resulting rendering in an exemplary embodiment of the disclosure, demonstrating the gradient effect along the leaf edges of the model. A sketch of this edge-light computation is given below.
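For illustration only, the Fresnel edge light and the SDF modulation described above can be written as VEX-style shader pseudocode; the view-direction parameter and the per-point sdf value are assumed inputs of this sketch, not names defined by the disclosure:
// Fresnel-style edge light, modulated by the signed-distance-field (alpha) value.
vector N = normalize(v@N);                // transferred first normal at the insert sheet vertex
vector V = normalize(chv("view_dir"));    // viewing direction (assumed to be supplied as a parameter)
float fresnel_pow = chf("fresnel_pow");   // _FresnelPow: intensity / exponent of the Fresnel effect
float fresnel = pow(1.0 - max(0.0, dot(N, V)), fresnel_pow);
float sdf = f@sdf;                        // SDF map value sampled beforehand and stored per point (assumed)
f@edge_light = fresnel * sdf;             // edge light value that feeds the edge light material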
FIG. 23 schematically illustrates an interface for calculating this material in an exemplary embodiment of the disclosure. The calculation is done in UE5: the screen-space position (ScreenPosition) is sampled and its r component is taken; multiplying it by an offset value gives a dark-to-light gradient from left to right in screen space; multiplying this value by the SDF map adds a dark gradient to the map; and interpolating between this value and 1, using the previous result as the blend factor, yields a material effect that carries values only at the edges.
It should be noted that the two material renderings may be used alternatively or in combination.
With this method, the silhouette of the tree crown is roughly sculpted, a plausible crown is then generated automatically by the tool from that silhouette simple model, and the leaves are rendered with the computed material, which stylizes the insert sheet models according to the normals and AO information carried by the model. As a result, first, a model that reproduces the clustered look with insert sheets has a total face count of generally around thirty thousand, a clear optimization compared with hand-sculpted assets of similar quality that run to hundreds of thousands or even millions of faces; second, manually building a tree traditionally takes about three days, whereas the simple-model-plus-automatic-generation scheme shortens the manual work to about half a day, with the tool's running time being practically negligible; and third, compared with common stylized vegetation, the model of this scheme is more natural thanks to the normal correction and the optimized point scattering.
FIG. 24 schematically illustrates the rendering effect of a leaf model according to the prior art, and FIG. 25 schematically illustrates the rendering effect of a leaf model in an exemplary embodiment of the present disclosure, i.e., a leaf model produced with the first-normal correction, voxelization of the simple model, randomization of the point cloud, offsetting and rotation of the insert sheet models, and the vertical gradient and edge light materials. Comparing the rendering results in FIG. 24 and FIG. 25, it can be clearly seen that the insert-sheet feel of the leaf model of the present application is weakened and the blocky color masses read well, which improves the user experience.
FIG. 26 schematically illustrates the composition of a leaf model generation apparatus in an exemplary embodiment of the disclosure. As shown in FIG. 26, the leaf model generation apparatus 2600 may include a configuration module 2601, a normal correction module 2602, a point scattering module 2603, and a model correction module 2604. Wherein:
the configuration module 2601 is configured to configure a simple model of a leaf model, wherein the simple model is composed of at least one unit model with a target shape;
the normal correction module 2602 is configured to correct the normal of each vertex in the unit model based on the bottom-face center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model;
the point scattering module 2603 is configured to scatter points over the simple model to obtain a point cloud, and to assign an insert sheet model to the point cloud to form a middle model of the leaf model; and
the model correction module 2604 is configured to transfer the first normal of the leaf model to target vertices of the insert sheet models in the middle model of the leaf model, to obtain a fine model of the leaf model.
According to an exemplary embodiment of the disclosure, the normal correction module 2602 is further configured to: determine, for a unit model, the bottom-face center point of the minimum bounding box corresponding to the unit model; offset the bottom-face center point upward by a first preset distance along the vertical axis to obtain the offset coordinate of the bottom-face center point; calculate the corrected normal of each vertex according to the coordinates of that vertex in the unit model and the offset coordinate of the bottom-face center point; and traverse all the unit models in the simple model, repeating the normal correction process, to obtain the first normal of the leaf model.
According to an exemplary embodiment of the present disclosure, the point scattering module 2603 is further configured to convert the simple model into a voxel model according to a preset voxel size, and to scatter points within the voxel model to obtain the point cloud.
According to an exemplary embodiment of the disclosure, the point scattering module 2603 is further configured to perform a first randomization on the size parameter, carried by each point in the point cloud, that controls the insert sheet model the point can bear, and/or to perform a second randomization on the normal of each point in the point cloud, so as to update the point cloud.
According to an exemplary embodiment of the disclosure, the point scattering module 2603 is further configured to translate each insert sheet model in the middle model along the vertical axis by a second preset distance to obtain a first position of each insert sheet model, and to correct the rotation of each insert sheet model at its first position, so as to update the middle model of the leaf model.
According to an exemplary embodiment of the disclosure, the model correction module 2604 is further configured to: for the first normal at a vertex of a unit model of the leaf model, determine, according to the straight line along which the first normal lies, the vertex of the insert sheet model intersected by that line as the target vertex to receive the transfer; replace the normal of the target vertex with the first normal; and traverse all vertices of all unit models of the leaf model to complete the normal transfer.
According to an exemplary embodiment of the present disclosure, the leaf model generation apparatus 2600 further comprises a rendering module configured to calculate a stylized material, the stylized material comprising a vertical gradient material and/or an edge light material, and to render the fine model of the leaf model based on the stylized material.
According to an exemplary embodiment of the present disclosure, the rendering module is further configured to: take the dot product of the normal at each insert sheet model vertex of the fine model with the upward vector to obtain the first light-receiving direction vector; offset the first light-receiving direction vector to obtain a gradient result and map the gradient result onto a transition curve; and obtain the vertical gradient material according to the transition curve.
According to an exemplary embodiment of the present disclosure, the rendering module is further configured to: take the dot product of the normal at each insert sheet model vertex of the fine model with the light-source direction vector to obtain the second light-receiving direction vector; calculate the edge light value with the Fresnel formula based on the second light-receiving direction vector; and multiply the edge light value by the distance field map of the leaf model to obtain the edge light material.
The specific details of each module in the leaf model generation apparatus 2600 are already described in detail in the corresponding leaf model generation method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, there is also provided a storage medium capable of implementing the above-described method. Fig. 27 schematically illustrates a schematic diagram of a computer-readable storage medium in an exemplary embodiment of the disclosure, and as shown in fig. 27, a program product 2700 for implementing the above method according to an embodiment of the disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a mobile phone. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided. Fig. 28 schematically illustrates a structural diagram of a computer system of an electronic device in an exemplary embodiment of the present disclosure.
It should be noted that the computer system 2800 of the electronic device shown in fig. 28 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 28, the computer system 2800 includes a Central Processing Unit (CPU) 2801 that can perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 2802 or a program loaded from a storage section 2808 into a Random Access Memory (RAM) 2803. In the RAM 2803, various programs and data necessary for system operation are also stored. The CPU 2801, ROM 2802, and RAM 2803 are connected to each other via a bus 2804. An Input/Output (I/O) interface 2805 is also connected to the bus 2804.
The following components are connected to the I/O interface 2805: an input portion 2806 including a keyboard, a mouse, and the like; an output portion 2807 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, a speaker, and the like; a storage portion 2808 including a hard disk and the like; and a communication section 2809 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 2809 performs communication processing via a network such as the internet. A drive 2810 is also connected to the I/O interface 2805 as necessary. A removable medium 2811 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 2810 as necessary, so that a computer program read out therefrom is mounted in the storage portion 2808 as necessary.
In particular, the processes described above with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication section 2809, and/or installed from the removable medium 2811. When the computer program is executed by the Central Processing Unit (CPU) 2801, the various functions defined in the system of the present disclosure are executed.
It should be noted that the computer readable medium shown in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware, and the described units may also be disposed in a processor. The names of these units do not, in some cases, constitute a limitation on the units themselves.
As another aspect, the present disclosure also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A leaf model generation method, comprising:
configuring a simple model of a leaf model, wherein the simple model is composed of at least one unit model with a target shape;
correcting the normal of each vertex in the unit model based on the bottom surface center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model;
performing scattering processing on the simple model to obtain a point cloud, and assigning an insert piece model to the point cloud to form a middle model of the leaf model; and
transferring the first normal of the leaf model to a target vertex of the insert piece model in the middle model of the leaf model, to obtain a fine model of the leaf model.
2. The leaf model generation method according to claim 1, wherein the correcting the normal of each vertex in the unit model based on the bottom surface center point of the minimum bounding box corresponding to the unit model to obtain the first normal of the leaf model comprises:
for one unit model, determining the bottom surface center point of the minimum bounding box corresponding to the unit model;
offsetting the bottom surface center point upward by a first preset distance along the longitudinal axis direction to obtain an offset coordinate of the bottom surface center point;
calculating a corrected normal of each vertex according to the coordinate of each vertex in the unit model and the offset coordinate of the bottom surface center point; and
traversing all the unit models in the simple model and repeating the normal correction process to obtain the first normal of the leaf model.
3. The leaf model generation method according to claim 1, wherein the performing scattering processing on the simple model to obtain a point cloud comprises:
converting the simple model into a voxel model according to a preset voxel size; and
performing scattering processing on the voxel model to obtain the point cloud.
4. The leaf model generation method according to claim 1 or 3, wherein after the performing scattering processing on the simple model to obtain a point cloud, the method further comprises:
performing first random processing on a size parameter of the insert piece model to be carried at each point in the point cloud; and/or
performing second random processing on the normal of each point in the point cloud, so as to update the point cloud.
5. The leaf model generation method according to claim 1, wherein after the insert piece model is assigned to the point cloud to form the middle model of the leaf model, the method further comprises:
translating each insert piece model in the middle model by a second preset distance along the longitudinal axis direction to obtain a first position of each insert piece model; and
correcting the rotation direction of each insert piece model at the first position of the insert piece model, so as to update the middle model of the leaf model.
6. The leaf model generation method according to claim 1, wherein the transferring the first normal of the leaf model to a target vertex of the insert piece model in the middle model of the leaf model comprises:
for a first normal at a vertex in a unit model of the leaf model, determining, according to the straight line on which the first normal lies, a vertex of the insert piece model that intersects the straight line as a target vertex for the transfer;
replacing the normal of the target vertex with the first normal; and
traversing all the vertices in all the unit models of the leaf model to complete the normal transfer process.
7. The leaf model generation method according to claim 1, wherein after obtaining the fine model of the leaf model, the method further comprises:
calculating a stylized material, wherein the stylized material comprises a vertical gradient material and/or an edge light material; and
performing model rendering on the fine model of the leaf model based on the stylized material.
8. The leaf model generation method according to claim 7, wherein calculating the vertical gradient material comprises:
taking a dot product of the normal at the vertex of each insert piece model in the fine model and an upward vector to obtain a first light receiving direction vector;
offsetting the first light receiving direction vector to obtain a gradient result, and mapping the gradient result onto a transition curve; and
obtaining the vertical gradient material according to the transition curve.
9. The leaf model generation method according to claim 7, wherein calculating the edge light material comprises:
taking a dot product of the normal at the vertex of each insert piece model in the fine model and a light source direction vector to obtain a second light receiving direction vector;
calculating an edge light value by using a Fresnel effect formula based on the second light receiving direction vector; and
multiplying the edge light value by the distance field map of the leaf model to obtain the edge light material.
10. A leaf model generation apparatus, comprising:
a configuration module, configured to configure a simple model of a leaf model, wherein the simple model is composed of at least one unit model with a target shape;
a normal correction module, configured to correct the normal of each vertex in the unit model based on the bottom surface center point of the minimum bounding box corresponding to the unit model, to obtain a first normal of the leaf model;
a point scattering module, configured to perform scattering processing on the simple model to obtain a point cloud, and assign an insert piece model to the point cloud to form a middle model of the leaf model; and
a model correction module, configured to transfer the first normal of the leaf model to a target vertex of the insert piece model in the middle model of the leaf model, to obtain a fine model of the leaf model.
11. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the leaf model generation method of any one of claims 1 to 9.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the leaf model generation method of any of claims 1 to 9.
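The following is a minimal numpy-based sketch of the normal correction of claims 1 and 2. It assumes the longitudinal (vertical) axis is Y and that the corrected normal of a vertex is the normalized direction from the upward-offset bottom-face center to that vertex; the function and parameter names (correct_unit_model_normals, first_preset_distance) are illustrative and do not appear in the original text.

```python
import numpy as np

def correct_unit_model_normals(vertices, first_preset_distance=0.5):
    """Point every vertex normal of one unit model away from an upward-offset
    copy of the bottom-face center of its minimum bounding box (claims 1 and 2)."""
    mins, maxs = vertices.min(axis=0), vertices.max(axis=0)
    # Bottom-face center of the axis-aligned minimum bounding box (Y assumed vertical).
    bottom_center = np.array([(mins[0] + maxs[0]) / 2.0, mins[1], (mins[2] + maxs[2]) / 2.0])
    # Offset the center upward along the longitudinal axis by the first preset distance.
    offset_center = bottom_center + np.array([0.0, first_preset_distance, 0.0])
    # Corrected normal: normalized direction from the offset center to each vertex.
    directions = vertices - offset_center
    return directions / np.linalg.norm(directions, axis=1, keepdims=True)

# The first normal of the whole leaf model is obtained by repeating this
# for every unit model in the simple model.
```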
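A sketch of the first and second random processing of claim 4, assuming the scattered point cloud is held as numpy arrays; the scale range and jitter strength are placeholder values chosen only for illustration.

```python
import numpy as np

def randomize_point_cloud(points, normals, seed=0, scale_range=(0.8, 1.2), jitter=0.15):
    """Give each scatter point a random size parameter for the insert piece it will
    carry (first random processing) and jitter its normal (second random processing)."""
    rng = np.random.default_rng(seed)
    scales = rng.uniform(scale_range[0], scale_range[1], size=len(points))
    jittered = normals + rng.normal(scale=jitter, size=normals.shape)
    jittered /= np.linalg.norm(jittered, axis=1, keepdims=True)
    return scales, jittered
```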
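A sketch of the normal transfer of claim 6, assuming unit-length first normals and treating "intersects the straight line" as lying within a small distance tolerance of that line; the tolerance value and array layout are assumptions.

```python
import numpy as np

def transfer_first_normals(unit_vertices, first_normals, card_vertices, card_normals, tol=0.05):
    """For each unit-model vertex, replace the normal of any insert piece (card)
    vertex lying on the line through that vertex along its first normal (claim 6)."""
    out = card_normals.copy()
    for p, n in zip(unit_vertices, first_normals):
        d = card_vertices - p                               # vectors from the line origin to card vertices
        t = d @ n                                           # signed projection onto the unit first normal
        dist = np.linalg.norm(d - np.outer(t, n), axis=1)   # perpendicular distance to the line
        out[dist < tol] = n                                 # "intersecting" taken as within a small tolerance
    return out
```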
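A sketch of the vertical gradient term of claim 8 for a single normal; the 0.5 offset and the smoothstep remap are placeholders, since the claim does not fix a particular transition curve.

```python
import numpy as np

def vertical_gradient_factor(normal, offset=0.5):
    """Dot the vertex normal with the up vector, offset the result into [0, 1],
    and remap it through a transition curve (smoothstep used as a stand-in)."""
    up = np.array([0.0, 1.0, 0.0])
    first_light_dir = float(np.dot(normal, up))                  # first light receiving direction value
    graded = np.clip(first_light_dir * 0.5 + offset, 0.0, 1.0)   # shift from [-1, 1] into [0, 1]
    return graded * graded * (3.0 - 2.0 * graded)                # placeholder transition curve
```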
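A sketch of the edge light term of claim 9; the Schlick-style power form of the Fresnel factor and the use of a view direction are assumptions, since the claim only names a "Fresnel effect formula", and sdf_sample stands for a sample of the leaf distance field map.

```python
import numpy as np

def edge_light(normal, view_dir, light_dir, sdf_sample, power=3.0):
    """Fresnel-style rim term scaled by the light receiving direction and by a sample
    of the leaf distance field map (claim 9)."""
    second_light_dir = max(float(np.dot(normal, light_dir)), 0.0)           # second light receiving direction value
    fresnel = (1.0 - max(float(np.dot(normal, view_dir)), 0.0)) ** power    # Schlick-style Fresnel factor
    return fresnel * second_light_dir * sdf_sample
```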
CN202211484780.2A 2022-11-24 2022-11-24 Leaf model generation method and device, storage medium and electronic equipment Pending CN115761095A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211484780.2A CN115761095A (en) 2022-11-24 2022-11-24 Leaf model generation method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211484780.2A CN115761095A (en) 2022-11-24 2022-11-24 Leaf model generation method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115761095A true CN115761095A (en) 2023-03-07

Family

ID=85338585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211484780.2A Pending CN115761095A (en) 2022-11-24 2022-11-24 Leaf model generation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115761095A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118229932A (en) * 2024-05-23 2024-06-21 山东捷瑞数字科技股份有限公司 Method, system, device and medium for adjusting model position based on three-dimensional engine

Similar Documents

Publication Publication Date Title
US11257286B2 (en) Method for rendering of simulating illumination and terminal
CN112316420B (en) Model rendering method, device, equipment and storage medium
CN108537861B (en) Map generation method, device, equipment and storage medium
CN111508053B (en) Rendering method and device of model, electronic equipment and computer readable medium
CN110458930A (en) Rendering method, device and the storage medium of three-dimensional map
CN109035381B (en) Cartoon picture hair rendering method and storage medium based on UE4 platform
CN103218846B (en) The ink and wash analogy method of Three-dimension Tree model
CN111476877B (en) Shadow rendering method and device, electronic equipment and storage medium
CN113240783B (en) Stylized rendering method and device, readable storage medium and electronic equipment
JP7522316B2 (en) Image-based lighting effects processing method, apparatus, device and storage medium
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
CN112619154A (en) Processing method and device of virtual model and electronic device
CN113409465B (en) Hair model generation method and device, storage medium and electronic equipment
CN115761095A (en) Leaf model generation method and device, storage medium and electronic equipment
CN111508054B (en) Terrain construction method, device and equipment
CN117745915B (en) Model rendering method, device, equipment and storage medium
CN111803942A (en) Soft shadow generation method and device, electronic equipment and storage medium
CN114119847A (en) Graph processing method and device, computer equipment and storage medium
CN114119848A (en) Model rendering method and device, computer equipment and storage medium
CN113838155A (en) Method and device for generating material map and electronic equipment
Lopez-Moreno et al. Non-photorealistic, depth-based image editing
CN116797701A (en) Diffusion effect rendering method and device, storage medium and electronic equipment
CN113936080A (en) Rendering method and device of virtual model, storage medium and electronic equipment
CN115063330A (en) Hair rendering method and device, electronic equipment and storage medium
CN114288671A (en) Method, device and equipment for making map and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination