CN115512030A - Model mapping method and device, electronic equipment and storage medium - Google Patents

Model mapping method and device, electronic equipment and storage medium

Info

Publication number
CN115512030A
CN115512030A
Authority
CN
China
Prior art keywords
model
target
texture
applique
mapping
Prior art date
Legal status
Pending
Application number
CN202211124225.9A
Other languages
Chinese (zh)
Inventor
赵鸣 (Zhao Ming)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211124225.9A priority Critical patent/CN115512030A/en
Publication of CN115512030A publication Critical patent/CN115512030A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/04 Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The application provides a model mapping method and device, an electronic device, and a storage medium, relating to the technical field of image rendering. A model decal independent of the base model is created and added to the surface of the base model, so that when texture data of a target model is added to the base model's surface, the position where the texture data should go can be located quickly. Because the model decal is assigned a material identifier different from that of the base model, texture data added to the base model through the decal does not affect the base model's original texture. Moreover, since the texture data is transmitted directly into the model decal, no compression or similar processing is needed, so the rendered texture has higher precision; and replacing the rendered texture is more convenient, since no new texture map needs to be redrawn.

Description

Model mapping method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of image rendering, in particular to a model mapping method, a model mapping device, electronic equipment and a storage medium.
Background
When creating a model of a scene or character in a game, texture details often need to be added to the model to enrich its appearance.
Most existing approaches draw the texture details to be added in texture-painting software to generate a texture map, import the texture map into the game engine, and have the engine add the map to the model surface to render the texture details.
However, because the size of a drawn map is fixed, the map must be scaled to fit when it is added to the model surface, which lowers the clarity of the texture details rendered from the scaled texture map.
Disclosure of Invention
In view of the above deficiencies in the prior art, the present application provides a model mapping method and apparatus, an electronic device, and a storage medium, so as to improve the clarity and precision of rendered texture details.
The technical scheme adopted by the embodiment of the application is as follows:
in a first aspect, an embodiment of the present application provides a model mapping method, including:
creating a model decal for a base model, and binding a material identifier to the model decal, the base model comprising a hard-surface model and/or a soft-surface model;
adding the model decal to a surface of the base model, wherein the material identifier of the model decal is different from the material identifier of the base model;
creating a texture material ball corresponding to a target model, the texture material ball being used at least to represent texture data of the target model; and
adding the texture material ball of the target model to a corresponding position of the model decal, so as to add the texture data of the target model to the surface of the base model.
In a second aspect, an embodiment of the present application further provides a model mapping apparatus, including a creation module and an addition module;
the creation module is configured to create a model decal for a base model and bind a material identifier to the model decal, the base model comprising a hard-surface model and/or a soft-surface model;
the addition module is configured to add the model decal to a surface of the base model, wherein the material identifier of the model decal is different from the material identifier of the base model;
the creation module is further configured to create a texture material ball corresponding to a target model, the texture material ball being used at least to represent texture data of the target model; and
the addition module is further configured to add the texture material ball of the target model to a corresponding position of the model decal, so as to add the texture data of the target model to the surface of the base model.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor; when the electronic device runs, the processor and the storage medium communicate via the bus, and the processor executes the machine-readable instructions to perform the steps of the model mapping method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps of the model mapping method provided in the first aspect.
The beneficial effects of the present application are as follows:
the application provides a model mapping method, a device, an electronic device and a storage medium, wherein model decals independent of a basic model are manufactured, and the model decals are added to the surface of the basic model, so that when texture data of a target model are added to the surface of the basic model, the positions of the texture data to be added can be quickly positioned from the surface of the basic model; the model applique is endowed with the material identification different from that of the basic model, so that when texture data of the target model is added to the basic model through the model applique, the added texture data cannot influence the original texture of the basic model, and in addition, compared with the traditional texture mapping mode for rendering the texture, the method has the advantages that the texture data is not required to be compressed and the like, so that the rendered texture has higher precision, and the method enables the rendered texture to be more convenient and fast in replacement operation without redrawing a new texture mapping.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, and it should be understood that the following drawings only show some embodiments of the present application, and therefore should not be considered as limiting the scope, from which other related figures can be derived by those of ordinary skill in the art without inventive faculty.
Fig. 1 is a first flowchart illustrating a model mapping method according to an embodiment of the present disclosure;
FIG. 2 is a second flowchart illustrating a model mapping method according to an embodiment of the present disclosure;
FIG. 3 is a third flowchart illustrating a model mapping method according to an embodiment of the present disclosure;
FIG. 4 is a fourth flowchart illustrating a model mapping method according to an embodiment of the present disclosure;
FIG. 5 is a fifth flowchart illustrating a model mapping method according to an embodiment of the present disclosure;
fig. 6 is a sixth schematic flowchart of a model mapping method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a model mapping apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Fig. 1 is a first flowchart of a model mapping method according to an embodiment of the present disclosure; the method may be executed by a terminal device or a server. As shown in fig. 1, the method may include:
s101, creating model appliques of a basic model, and binding material identifiers for the model appliques; the base model comprises a hard surface model and/or a soft surface model;
the model applique is a model mapping technology, which means that a layer of independent model is covered on a model and an independent mapping is given to achieve the effect of mixing with the bottom model mapping. The model decal herein refers to a model to which a separate decal is assigned.
The base model is an existing model; in this embodiment it is the model to which a texture is to be added. The base model may be created in advance and imported directly into the game engine when the method is executed.
The base model may be a hard-surface model or a soft-surface model. A hard-surface model generally refers to a model of an object with rigid surfaces encountered in computer graphics (CG) production, such as a vehicle, weaponry, or a building structure, and is common in science-fiction works. A soft-surface model generally refers to a model of an object with curved, deformable surfaces, such as game clothing and fabrics.
Optionally, a material identifier may be bound to the created model decal. Specifically, a material ball may be assigned to the model decal; the material ball corresponds to the material identifier, and texture and material information can be assigned to the decal's material ball through that identifier.
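The relationship between a decal, its material ball, and the material identifier can be sketched as follows. This is a minimal Python illustration only; the class and identifier names (`MaterialBall`, `mat_base`, `mat_decal_01`) are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MaterialBall:
    material_id: str                 # material identifier bound to this ball
    texture: dict = field(default_factory=dict)

@dataclass
class Model:
    name: str
    material: MaterialBall

# The base model keeps its own material ball; the decal is given a separate
# one, so texture fed to the decal can never overwrite the base texture.
base = Model("mecha", MaterialBall(material_id="mat_base"))
decal = Model("rivet_decal", MaterialBall(material_id="mat_decal_01"))
```

The key design point mirrored here is that the two identifiers are distinct, which is what later lets texture data be routed to the decal without touching the base model's material.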
S102, adding the model decal to the surface of the base model; wherein the material identifier of the model decal is different from the material identifier of the base model.
This step may involve a UV-unwrapping operation on the model decal, so that the decal can be added at a reasonable position on the base model surface.
Here, UV refers to two-dimensional texture coordinates: per-vertex data of polygon and subdivision meshes, used to map a two-dimensional texture onto the mesh of a three-dimensional model.
UV texture space is the two-dimensional coordinate system in which UVs are defined; its axes are denoted by the letters U and V.
It should be noted that because the model decal and the base model have different material identifiers, rendering a texture onto the base model does not overwrite the base model's original texture; instead, the newly added texture is layered over it, so the new texture rendered on the base model retains higher precision.
S103, creating a texture material ball corresponding to the target model; the texture material ball is used at least to represent texture data of the target model.
The target model is the model whose texture is to be added to the base model. Take making a virtual character's mecha as an example: mecha production often requires drawing many normal-map details, such as rivets or knobs on the mecha's surface. In that case the base model is the mecha and the target model is the rivet or knob to be added.
The texture material ball contains the texture data required to make the rivet, which may include lighting information, normal information, and the like.
Different target models correspond to different texture material balls, and a target model is rendered from the texture data in its texture material ball.
In some embodiments, the target model may be a thread, a nut, a rivet, a pattern on fabric, or a window decoration on glass, and may also be snow, frost, dust, or the like in a game scene.
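The idea that each target model gets its own texture material ball carrying its texture data can be illustrated with a small Python sketch; the function name, file names, and field layout below are illustrative assumptions, not part of the patent:

```python
def make_texture_ball(target, material_id, normal_map, lighting=None):
    """Bundle the texture data needed to render one target model
    (normal information, and optionally lighting information)."""
    ball = {"target": target, "material_id": material_id, "normal": normal_map}
    if lighting is not None:
        ball["lighting"] = lighting
    return ball

# Different target models get different texture material balls.
rivet_ball = make_texture_ball("rivet", "mat_decal_01", normal_map="rivet_n.png")
snow_ball = make_texture_ball("snow", "mat_decal_02", normal_map="snow_n.png",
                              lighting="snow_light.png")
```

Because each ball carries its own material identifier, a renderer can later route a ball to the matching interface on the decal simply by looking at that field.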
S104, adding the texture material ball of the target model to the corresponding position of the model decal, so as to add the texture data of the target model to the surface of the base model.
Based on the above, the model decal can be understood as a patch attached to the mecha at the location where a rivet is to be added; by adding the model decal to the base model, the location where the target model is to be added can be located quickly on the base model. Since the texture material ball contains the target model's texture data, adding the ball to the corresponding position of the decal adds that texture data to the base model's surface, specifically to the position where the decal was added. And since the model decal carries a material identifier independent of the base model, texture data added through the decal does not affect the base model's texture.
Taking the base model as a hard-surface model such as a mecha, the target model as a rivet, and the model decal as added at position a of the base model, the final rendering in this embodiment shows a rivet at position a on the mecha's surface.
In summary, in the model mapping method provided in this embodiment, a model decal independent of the base model is created and added to the surface of the base model, so that when texture data of the target model is added to the base model's surface, the position where the texture data should go can be located quickly. Because the model decal is assigned a material identifier different from that of the base model, texture data added through the decal does not affect the base model's original texture. In addition, compared with the traditional way of rendering textures through texture maps, the texture data need not be compressed or otherwise degraded, so the rendered texture has higher precision, and replacing the rendered texture is more convenient, since no new texture map needs to be redrawn.
FIG. 2 is a second flowchart of the model mapping method according to an embodiment of the present disclosure; optionally, in step S101, binding a material identifier to the model decal may include:
and S201, creating a material interface corresponding to the model applique according to the material creating instruction.
The method may respond to an input material interface creating operation, and the material interface creating operation may be, for example, a right click operation when a cursor is located on a model decal, so that a property configuration interface corresponding to the model decal may be opened, and a material interface corresponding to the model decal is configured in the property configuration interface.
S202, binding corresponding material marks for the material interfaces.
The material interface is also the material ball, and the material interface created in the form of the material ball can facilitate the replacement and modification of the material interface.
In some embodiments, the material interface that can be created on one model applique is not limited to one, and may be multiple, where one material interface may be correspondingly bound with a material identifier, and the material identifiers bound by different material interfaces are different, and when texture data is transferred, the texture data can be transmitted into the corresponding material interface through the corresponding material identifier, so that the corresponding texture can be displayed at the corresponding position on the model applique.
Optionally, in step S102, adding the model decal to the surface of the base model may include: performing texture-map unwrapping on the model decal, so that the unwrapped model decal is added to the surface of the base model.
In some embodiments, the model decal may be texture-map unwrapped, i.e., UV unwrapped, in modeling software such as Maya (three-dimensional modeling and animation software). Note that UV is a kind of data built into the model; it is essentially a coordinate system in the model's tangent space, and it is used to attach map data to the model surface.
UV unwrapping is the operation of laying out the model's UVs so that the map can be displayed at a reasonable position on the model surface. UV unwrapping of the model decal can be understood as unfolding the decal model to obtain a coordinate mapping between each point of the model before unwrapping and the corresponding point after unwrapping.
For example, if the model decal is a three-dimensional model before unwrapping and a two-dimensional planar model after unwrapping, the result is a coordinate mapping between each point of the three-dimensional decal model and the corresponding point of the unwrapped two-dimensional planar model.
In this embodiment the model decal is already a two-dimensional planar model before UV unwrapping, so a two-dimensional planar model is likewise obtained after unwrapping.
Optionally, the model decal may be added to the surface of the base model after the UV-unwrapping operation has been performed on it.
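For the simple case described here, a flat decal, the coordinate mapping produced by UV unwrapping can be sketched directly: drop the constant axis and normalize into the unit UV square. This is a toy illustration under the assumption of a planar decal at constant Z, not a general unwrapping algorithm:

```python
def unwrap_planar(vertices):
    """UV-unwrap a flat decal lying in a plane of constant Z: drop the Z
    axis and normalise X and Y to the [0, 1] UV square, yielding the
    coordinate mapping between model points and unwrapped points."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return {v: ((v[0] - min(xs)) / w, (v[1] - min(ys)) / h) for v in vertices}

# A rectangular decal quad sitting at Z = 5 above the base model's surface.
quad = [(0.0, 0.0, 5.0), (2.0, 0.0, 5.0), (2.0, 1.0, 5.0), (0.0, 1.0, 5.0)]
uvs = unwrap_planar(quad)
```

Each 3D vertex ends up paired with a (U, V) coordinate, which is exactly the mapping relation the text describes for displaying the map at a reasonable position on the surface.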
Optionally, in step S102, adding the model decal to the surface of the base model may include: adding the model decal to the surface of the base model according to the target display style of the target model and the material type of the target model.
Once the target model is determined, its target display style and material type are determined. The target display style refers to how the target model is displayed on the base model and indicates the display position of the target model on the base model.
The material type of the target model may be determined from the target model's material information; each material type is unique, so the material type identifies which model the target model specifically is.
Each model corresponds to a different material type, so once the material type is obtained, the type of model can be determined; for example, the material type can indicate that the target model is a thread, a pattern on fabric, or snow or dust in a game scene.
Optionally, based on the target display style and the material type of the target model, the position at which the model decal is to be added to the base model can be determined, and the decal is added to the base model surface at that position.
FIG. 3 is a third flowchart of the model mapping method according to an embodiment of the present disclosure; optionally, adding the model decal to the surface of the base model according to the target display style and material type of the target model may include:
S301, determining the spatial position relationship between the model decal and the base model according to the material type of the target model.
In this embodiment, to make the target model rendered on the base model look more realistic, the spatial position relationship between the model decal and the base model may also be determined when the decal is added.
For example, when the target model is snow, the rendered snow can be given a certain thickness by adding the model decal at a certain distance below the base model; when the target model is a pattern on fabric, the pattern can be made more three-dimensional by adding the model decal at a certain distance above the base model.
As explained above, the type of the target model is determined by its material type, so the specific target model (thread, snow, pattern, and so on) determined from the material type in turn determines the spatial position relationship between the model decal and the base model.
S302, determining the target addition position of the model decal on the surface of the base model according to the target display style of the target model and the spatial position relationship.
The position where the target model is to be added on the base model is determined from its target display style; for example, if the style specifies that the target model is to be displayed in the upper right corner of the base model, that corner is the position to be used. Combining this with the determined spatial position relationship, say 2 cm above the base model, gives the target addition position: the upper right corner of the base model, 2 cm above its surface.
S303, adding the model decal to the target addition position on the surface of the base model.
Optionally, based on the target addition position determined above, the model decal is added at that position.
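Steps S301 and S302 can be combined into a small sketch: a lookup from material type to spatial relation, then composition with the display position. The offsets and type names below are invented for illustration (the patent gives no numbers):

```python
# Hypothetical mapping from material type to the spatial relation between
# decal and base surface: snow sits below the surface to gain thickness,
# a fabric pattern sits above it to appear more three-dimensional.
SPATIAL_OFFSET = {"snow": -0.5, "pattern": 0.2, "rivet": 0.0}

def target_add_position(display_position, material_type):
    """Combine the display position on the surface (from the target
    display style) with the offset derived from the material type."""
    x, y = display_position
    return (x, y, SPATIAL_OFFSET[material_type])

pos = target_add_position((10.0, 4.0), "snow")
```

The returned triple plays the role of the "target addition position" of S302: a surface location plus a signed distance from the surface.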
FIG. 4 is a fourth flowchart of the model mapping method according to an embodiment of the present disclosure; in step S103, creating a texture material ball corresponding to the target model may include:
S401, obtaining map information of an input texture material map, the map information including one or more of: a material identifier, texture data, and transparency data.
Optionally, the texture material map may be obtained directly from a database and may be a map asset for specific model details, for example a screw-detail map asset or a pattern-detail map asset.
The input texture material map contains information representing the model's texture details, called map information here; the map information differs from model to model.
The material identifier in the map information indicates the material interface into which the texture data and transparency data of the texture material map are to be transmitted.
For different target models, the map information always contains texture data, but the texture data differs between target models. For models with transparency requirements, such as patterns on fabric, the map information may also contain transparency data.
The data types contained in the map information of each specific target model can be adjusted adaptively, so that the target model can be rendered more realistically.
S402, creating the texture material ball corresponding to the target model according to the map information of the texture material map.
Optionally, a texture material ball corresponding to the target model may be created from the map information obtained from the texture material map; the created texture material ball contains the rendering data of the target model.
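The S401/S402 flow of building a texture material ball from map information, with transparency as an optional field, can be sketched as follows; field names and file names are illustrative assumptions:

```python
def create_texture_ball(map_info):
    """Build a texture material ball from the map information of a texture
    material map; transparency is carried only when the map provides it
    (e.g. for patterns on fabric)."""
    ball = {"material_id": map_info["material_id"],
            "texture": map_info["texture"]}
    if "transparency" in map_info:
        ball["transparency"] = map_info["transparency"]
    return ball

pattern_ball = create_texture_ball(
    {"material_id": "mat_b", "texture": "pattern.png", "transparency": 0.35})
rivet_ball = create_texture_ball(
    {"material_id": "mat_a", "texture": "rivet.png"})
```

A fabric pattern carries transparency data while an opaque rivet does not, matching the adaptive map-information contents described above.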
FIG. 5 is a fifth flowchart of the model mapping method according to an embodiment of the present disclosure; in step S104, adding the texture material ball of the target model to the corresponding position of the model decal may include:
S501, in response to a selection operation on a target texture material ball, determining the target material interface of the target texture material ball on the model decal according to the material identifier in the target texture material ball.
Generally, since there may be multiple texture material balls corresponding to target models, only the target texture material ball needs to be determined for a specific rendering; it is determined from the multiple balls in response to a selection operation, such as a click on the target texture material ball.
Optionally, the material identifier contained in the selected target texture material ball is obtained, and this identifier determines the material interface of the model decal to which the ball is to be added.
S502, inputting the texture data and/or transparency data in the target texture material ball into the target material interface.
Based on the determined target material interface, the texture data and/or transparency data in the target texture material ball may be transferred into it. This may be done in response to a drag operation: the target texture material ball is dragged to the position of the target material interface on the model decal, whereby its texture data and/or transparency data are input into the interface.
When the target texture material ball contains only texture data, only the texture data is transferred to the target material interface; when it contains both texture data and transparency data, both are transferred.
Fig. 6 is a sixth flowchart of the model mapping method according to an embodiment of the present application; in step S104, adding the texture material ball of the target model to the corresponding position of the model decal may include:
S601, obtaining offset map information corresponding to the model decal, the offset map information including the offset of each vertex in the model decal.
In some embodiments, rendering some target models involves offset map information in addition to texture map information. For example, when the target model is snow, the thickness of the rendered snow can be produced by the offset map information, along with a gradual variation in that thickness.
The offsets here are offsets of the vertices of the model; the offset map information records the world-space offset (along the positive Z axis of world space) of each vertex in the created independent model decal.
The offset map information may be generated according to the drawn snow effect: the offset of each vertex is determined by the snow effect actually intended to be rendered, and the target offset of each vertex is obtained by multiplying the vertex's offset from the offset map information by a preset coefficient.
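The per-vertex displacement just described (recorded offset times a preset coefficient, applied along world-space +Z) can be written out directly. The vertex data and coefficient below are made up for illustration:

```python
def apply_offset_map(vertices, offsets, coefficient):
    """Displace each vertex along the world-space +Z axis by its recorded
    offset multiplied by a preset coefficient."""
    return [(x, y, z + offsets[i] * coefficient)
            for i, (x, y, z) in enumerate(vertices)]

# A tapering offset map gives the snow a gradually thinning thickness.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
offs = [1.0, 0.5, 0.0]
moved = apply_offset_map(verts, offs, coefficient=0.2)
```

The decreasing offsets produce exactly the gradual thickness change the text attributes to the offset map.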
And S602, responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball on the model applique according to the material identifier in the target texture material ball.
In the same manner as step S501, in this scenario, which further includes offset map information, the target material interface may also be determined according to the material identifier contained in the selected target texture material ball.
And S603, inputting the texture data and/or transparency data in the target texture material ball, together with the offset of each vertex of the model applique recorded in the offset map information, into the target material interface.
In this scenario, differing from step S502, both the texture data and/or transparency data in the target texture material ball and the offset of each vertex of the model applique recorded in the offset map information may be transferred to the target material interface.
Of course, the above only lists map information that may be involved in rendering the target model; in practical applications there may be other map information. Specifically, once the target model is determined, each piece of map information corresponding to it is also determined, and the determined map information is transferred to the target material interface corresponding to the target model so as to render the target model on the base model.
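The idea of transferring only the map information a given target model actually provides can be sketched as a simple dictionary-based routine. The structure and key names below are hypothetical, chosen for illustration; they are not part of the embodiment.

```python
def fill_material_interface(material_id, map_info):
    """Transfer whichever map data the target model provides
    (texture, transparency, per-vertex offsets) to the material
    interface bound to material_id; absent maps are simply skipped."""
    interface = {"material_id": material_id}
    for key in ("texture", "transparency", "offsets"):
        if key in map_info:
            interface[key] = map_info[key]
    return interface
```

For a snow model this might carry texture and offsets but no transparency; for a fabric pattern, texture and transparency but no offsets.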
Optionally, the method may further comprise: and responding to the selection operation of the new target texture material ball, and updating the currently added texture material ball at the corresponding position of the model applique according to the new target texture material ball.
In the method, texture data of different models are created in the form of texture material balls, a model applique is added to the base model, and a material identifier is bound to the model applique. The texture to be added to the base model can therefore be replaced quickly by simply replacing the texture material ball transferred to the material interface corresponding to the material identifier.
Optionally, the new target texture material ball may be selected in response to a click operation on it and dragged onto the texture material ball to be replaced, thereby updating the texture material ball added at the corresponding position of the model applique and, in turn, the texture of the target model rendered on the base model.
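Because the material identifier stays fixed while the ball bound to it changes, the replacement step can be sketched as a keyed swap. The function and the ball names below are illustrative assumptions only.

```python
def replace_texture_ball(bindings, material_id, new_ball):
    """Replace the texture material ball currently bound to the
    material interface identified by material_id; the identifier
    itself is unchanged, so nothing else needs redrawing."""
    old_ball = bindings.get(material_id)
    bindings[material_id] = new_ball
    return old_ball
```

The base model's own textures live under different material identifiers, so the swap cannot disturb them.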
Optionally, in step S102, adding a model applique to the surface of the base model may include: and scaling the model applique according to the actual size proportion of the target model and the base model, and adding the scaled model applique on the surface of the base model.
In some embodiments, the model applique may first be appropriately scaled when it is added to the base model surface, the scaling ratio depending on the actual size of the target model, e.g., a rivet or a screw.
Optionally, a model decal is added to the surface of the base model based on the scaled model decal.
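A minimal sketch of the uniform scaling step, assuming the decal is represented as a list of 3D vertices; the sizes and geometry below are invented for illustration.

```python
def scale_decal(vertices, target_size, base_size):
    """Uniformly scale decal vertices by the ratio of the target
    model's actual size to the base model's size, so a small part
    such as a rivet occupies the right area on the surface."""
    ratio = target_size / base_size
    return [(x * ratio, y * ratio, z * ratio) for x, y, z in vertices]
```

A rivet on a large armor piece would use a small ratio, shrinking the decal before it is placed.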
Additionally, the creation of the model applique in the present method may include: in response to the creation operation of the model decal, a planar model is created as the model decal.
Alternatively, it may include: responding to the surface copying operation of the target area on the basic model, and acquiring a target surface where the target area is located; a model decal is generated from the target surface.
In this embodiment, two ways of creating a model applique are provided. For some hard-surface base models, a new planar model can be created directly as the model applique when one is to be added.
For some soft-surface base models, whose surfaces have a certain curvature, generating the model applique by copying faces allows it to fit the base model surface better, avoids gaps between the applique and the base model, and makes the texture of the finally rendered target model more realistic.
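The face-copying route can be sketched as copying the selected surface vertices and nudging them a small gap along the surface normal, so the decal follows the curvature without intersecting the base mesh. The flat-normal example below is a simplification: a real mesh has a normal per vertex.

```python
def decal_from_faces(face_vertices, normal, gap):
    """Copy the selected surface vertices and displace each one a
    small gap along the surface normal, producing a decal that hugs
    a curved surface instead of cutting through it."""
    nx, ny, nz = normal
    return [(x + nx * gap, y + ny * gap, z + nz * gap)
            for x, y, z in face_vertices]
```

With a tiny gap (a few millimetres in the examples below), the decal reads as lying on the surface while avoiding z-fighting with it.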
The method will be explained below by specific examples.
First example: take the base model being a character's mecha armor model A and the target model being the rivets on it.
Step one: a planar model is created as the model applique in the modeling software (maya is used here); the model applique is given a UV unfolding operation (splitting the UV) in the maya software, and is assigned a separate material interface bound to a material identifier.
Step two: to make the rivets on the model, the model applique made in step one is scaled down to a suitable size and placed at a suitable position on the base model body.
Step three: the rivet detail map material is imported into the game engine UE4 (Unreal Engine 4), and a rivet texture material ball is made for the model applique. The detail map material is essentially the simplest PBR (Physically Based Rendering) material; a PBR material generally comprises base color, roughness, metallicity, and a normal map, and the default material of UE4 is a PBR material.
Step four: the final model made in step two is imported into the game engine UE4, and the rivet material ball from step three is dragged into the material interface specified for the model applique in step one; the final detail effect can then be seen, i.e., the rivets are displayed at the target positions on the armor.
Second example: take the base model being a fabric garment and the target model being a pattern on the garment.
Step one: in the modeling software (maya is used here), a finished fabric garment model is opened.
Step two: enter the face selection mode of the maya editing mode, select the part of the fabric garment model from step one where the pattern is to be added, execute a face duplication command to copy out an independent model as the model applique, displace the copied faces to a position slightly above the surface of the original garment model so that a small gap remains, and assign a new material ball.
Step three: the model in step two was subjected to a UV unfolding operation in maya software.
Step four: a pattern texture map material is imported into the game engine UE4, and a plurality of texture material balls are made for the model applique to use. In addition, because the fabric is a soft-surface model, transparency needs to be handled, so the pattern texture map may also include transparency data.
Step five: the final model made in step two is imported into the game engine UE4, the texture material ball from step four is dragged into the material interface specified for the model applique copied in step two, and the final detail effect can then be seen, i.e., the pattern is displayed at the target position on the fabric garment.
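Where the pattern texture carries transparency data, the visible result at each point is standard alpha compositing of the decal over the garment texture. The sketch below is a generic illustration of that blend, not a specific UE4 material implementation.

```python
def composite(decal_rgb, base_rgb, alpha):
    """Alpha-composite the pattern decal over the fabric texture:
    alpha = 1.0 shows only the pattern, alpha = 0.0 only the fabric,
    and intermediate values let the fabric show through."""
    return tuple(d * alpha + b * (1.0 - alpha)
                 for d, b in zip(decal_rgb, base_rgb))
```

This is why the pattern can be swapped freely: the fabric's own color never changes, only what is composited above it.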
Third example: take the base model being a rock and the target model being snow on the rock.
Step one: an existing scene asset model (here a piece of rock) is opened in the modeling software (e.g., maya).
Step two: enter the face selection mode of the maya editing mode, select the part of the rock model from step one where snow is to accumulate, and execute a face duplication command to copy out an independent model as the model applique; then enter the vertex selection mode, select all vertices of the model applique, use the maya vertex displacement tool to displace them to a position about 2 mm above the surface of the original scene asset model, and assign a new material ball to the model applique.
Step three: the model in step two was subjected to a UV unfolding operation in maya software.
Step four: a snow texture material ball is made in the game engine UE4 for the model applique created in step two. In addition, in order to render the thickness of the snow and make it more realistic, offset map information is also imported into the game engine UE4.
Step five: the final model made in step two is imported into the game engine UE4, the snow texture material ball and the offset map information from step four are fed into the material interface specified for the model applique in step two, and the final snow effect can then be seen, i.e., snow of a certain thickness covers the rock.
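Putting the snow pieces together, the thickness effect amounts to displacing each decal vertex along the positive Z axis of world space by its offset-map value times the preset coefficient. The vertices, offsets, and coefficient below are hypothetical illustration values.

```python
def displace_snow(vertices, map_offsets, coefficient):
    """Displace each decal vertex along the positive Z axis of world
    space by its offset-map value times a preset coefficient, giving
    the snow layer its thickness and a gradual thickness transition."""
    return [(x, y, z + o * coefficient)
            for (x, y, z), o in zip(vertices, map_offsets)]
```

Vertices with larger recorded offsets rise higher, which is what makes deep drifts and thin edges coexist on one decal.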
In summary, in the model mapping method provided in this embodiment, a model applique independent of the base model is made and added to the surface of the base model, so that when texture data of the target model is added to that surface, the position where the texture data is to be added can be located quickly. Because the model applique is given a material identifier different from that of the base model, texture data of the target model added through the model applique does not affect the original texture of the base model. In addition, compared with rendering texture via a traditional texture map, the texture data does not need to be compressed, so the rendered texture has higher precision; and the texture can be replaced conveniently without redrawing a new texture map.
The following describes apparatuses, devices, and storage media for executing the model mapping method provided in the present application, and specific implementation processes and technical effects thereof are referred to above, and will not be described again below.
Fig. 7 is a schematic diagram of a model mapping apparatus according to an embodiment of the present application, where functions implemented by the model mapping apparatus correspond to steps executed by the foregoing method. The apparatus may be understood as the terminal device or the server or the processor of the server, or may be understood as a component which is independent of the server or the processor and implements the functions of the present application under the control of the server, as shown in fig. 7, the apparatus may include: a creating module 710 and an adding module 720;
a creating module 710, configured to create a model applique of a base model, and bind a material identifier to the model applique; the base model comprises a hard surface model and/or a soft surface model;
an adding module 720 for adding a model decal to a surface of the base model; wherein, the material identification of the model applique is different from the material identification of the basic model;
a creating module 710, configured to create a texture material ball corresponding to the target model; the texture material ball is at least used for representing texture data of the target model;
and an adding module 720, configured to add the texture material ball of the target model to a corresponding position of the model applique, so as to add the texture data of the target model to the surface of the base model.
Optionally, the creating module 710 is specifically configured to create a material interface corresponding to the model applique according to the material creating instruction;
and bind a corresponding material identifier to the material interface.
Optionally, the adding module 720 is specifically configured to perform texture mapping expansion on the model applique, so as to add the texture-mapping-expanded model applique to the surface of the base model.
Optionally, the adding module 720 is specifically configured to add the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model.
Optionally, the adding module 720 is specifically configured to determine a spatial position relationship between the model applique and the base model according to the material type of the target model;
determining a target adding position of the model applique on the surface of the basic model according to the target display style of the target model and the spatial position relation;
model decals are added to target addition locations on the surface of the base model.
Optionally, the creating module 710 is specifically configured to obtain map information of the texture material map according to the input texture material map, where the map information includes one or more of the following items: a material identifier, texture data, and transparency data;
and creating texture material balls corresponding to the target model according to the mapping information of each texture material mapping.
Optionally, the adding module 720 is specifically configured to, in response to the selection operation on the target texture material ball, determine, according to the material identifier in the target texture material ball, a target material interface of the target texture material ball corresponding to the model applique;
and input the texture data and/or transparency data in the target texture material ball into the target material interface.
Optionally, the adding module 720 is specifically configured to obtain offset map information corresponding to the model applique, where the offset map information includes: the offset of each vertex in the model applique;
responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball corresponding to the model applique according to the material identification in the target texture material ball;
and input the texture data and/or transparency data in the target texture material ball, together with the offset of each vertex of the model applique recorded in the offset map information, into the target material interface.
Optionally, the method further comprises: an update module;
and the updating module is used for responding to the selection operation of the new target texture material ball and updating the currently added texture material ball at the corresponding position of the model applique according to the new target texture material ball.
Optionally, the adding module 720 is specifically configured to scale the model applique according to the actual size ratio of the target model to the base model, and add the scaled model applique to the surface of the base model.
By the above apparatus, when the terminal device provides the graphical user interface and displays the base model, the server makes a model applique independent of the base model and adds it to the surface of the base model, so that when texture data of the target model is added to that surface, the position where the texture data is to be added can be located quickly. Because the model applique is given a material identifier different from that of the base model, texture data of the target model added through the model applique does not affect the original texture of the base model. In addition, compared with rendering texture via a traditional texture map, the texture data does not need to be compressed, so the rendered texture has higher precision; and the texture can be replaced conveniently without redrawing a new texture map.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. For another example, when one of the above modules is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
The modules may be connected or in communication with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may include a connection via a LAN, WAN, bluetooth, zigBee, NFC, or the like, or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working process of the system and the apparatus described above may refer to the corresponding process in the method embodiment, and is not described in detail in this application.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application, including: a processor 801, a storage medium 802 and a bus 803, wherein the storage medium 802 stores machine-readable instructions executable by the processor 801, when the electronic device executes a model mapping method as in the embodiment, the processor 801 communicates with the storage medium 802 via the bus 803, and the processor 801 executes the machine-readable instructions to perform the following steps:
creating a model applique of a basic model, and binding a material identifier for the model applique; the base model comprises a hard surface model and/or a soft surface model;
adding a model applique on the surface of the base model; wherein, the material identification of the model applique is different from the material identification of the basic model;
creating a texture material ball corresponding to the target model; the texture material ball is at least used for representing texture data of the target model;
and adding the texture material ball of the target model to the corresponding position of the model applique so as to add the texture data of the target model to the surface of the base model.
In one possible embodiment, the processor 801, when executing the binding of a material identifier to the model applique, is specifically configured to: create, according to the material creation instruction, a material interface corresponding to the model applique;
and bind a corresponding material identifier to the material interface.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and performing texture mapping expansion on the model applique so as to add the model applique subjected to the texture mapping expansion onto the surface of the base model.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and adding the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model.
In one possible embodiment, the processor 801, when executing the adding of the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model, is specifically configured to: determining the spatial position relation between the model applique and the basic model according to the material type of the target model;
determining a target adding position of the model applique on the surface of the basic model according to the target display style of the target model and the spatial position relation;
a model decal is added to the target addition location on the surface of the base model.
In one possible embodiment, the processor 801, when executing creating the texture material ball corresponding to the target model, is specifically configured to: obtaining mapping information of the texture material mapping according to the input texture material mapping, wherein the mapping information comprises one or more of the following items: material identification, texture data and transparency data;
and creating texture material balls corresponding to the target model according to the mapping information of each texture material mapping.
In one possible embodiment, the processor 801, when executing the adding of the texture material ball of the target model to the corresponding location of the model applique, is specifically configured to: responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball on the model applique according to the material identifier in the target texture material ball;
and inputting the texture data and/or transparency data in the target texture material ball into the target material interface.
In one possible embodiment, the processor 801, when executing the adding of the texture material balls of the target model to the corresponding positions of the model applique, is specifically configured to: obtaining offset mapping information corresponding to the model applique, wherein the offset mapping information comprises: offset of each vertex in the model applique;
responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball corresponding to the model applique according to the material identification in the target texture material ball;
and inputting the texture data and/or transparency data in the target texture material ball, together with the offset of each vertex of the model applique recorded in the offset map information, into the target material interface.
In one possible embodiment, the processor 801 is further configured to: and responding to the selection operation of the new target texture material ball, and updating the texture material ball currently added at the corresponding position of the model applique according to the new target texture material ball.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and scaling the model applique according to the actual size proportion of the target model and the base model, and adding the scaled model applique on the surface of the base model.
By the above method, when the terminal device provides the graphical user interface and displays the base model, the server makes a model applique independent of the base model and adds it to the surface of the base model, so that when texture data of the target model is added to that surface, the position where the texture data is to be added can be located quickly. Because the model applique is given a material identifier different from that of the base model, texture data of the target model added through the model applique does not affect the original texture of the base model. In addition, compared with rendering texture via a traditional texture map, the texture data does not need to be compressed, so the rendered texture has higher precision; and the texture can be replaced conveniently without redrawing a new texture map.
The storage medium 802 stores therein program code, which, when executed by the processor 801, causes the processor 801 to perform various steps in the model mapping method according to various exemplary embodiments of the present application described in the above-mentioned "exemplary method" section of the present specification.
The Processor 801 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
Storage medium 802, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The memory may include at least one type of storage medium, for example, a flash memory, a hard disk, a multimedia card, a card-type memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read-Only Memory (PROM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disk, and so on. The memory may also be any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The storage medium 802 in the embodiments of the present application may also be a circuit or any other device capable of implementing a storage function, for storing program instructions and/or data.
Optionally, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the processor performs the following steps:
creating a model applique of a basic model, and binding a material identifier for the model applique; the base model comprises a hard surface model and/or a soft surface model;
adding a model applique to the surface of the base model; wherein, the material identification of the model applique is different from the material identification of the basic model;
creating a texture material ball corresponding to the target model; the texture material ball is at least used for representing texture data of the target model;
and adding the texture material ball of the target model to the corresponding position of the model applique so as to add the texture data of the target model to the surface of the base model.
In one possible embodiment, the processor 801, when executing the binding of a material identifier to the model applique, is specifically configured to: create, according to the material creation instruction, a material interface corresponding to the model applique;
and bind a corresponding material identifier to the material interface.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and performing texture mapping expansion on the model applique so as to add the texture mapping expanded model applique to the surface of the base model.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and adding the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model.
In one possible embodiment, the processor 801, when executing the adding of the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model, is specifically configured to: determining the spatial position relation between the model applique and the basic model according to the material type of the target model;
determining a target adding position of the model applique on the surface of the basic model according to the target display style of the target model and the spatial position relation;
model decals are added to target addition locations on the surface of the base model.
In one possible embodiment, the processor 801, when executing creating the texture material ball corresponding to the target model, is specifically configured to: obtain map information of the texture material map according to the input texture material map, wherein the map information comprises one or more of the following items: a material identifier, texture data, and transparency data;
and creating texture material balls corresponding to the target model according to the mapping information of each texture material mapping.
In one possible embodiment, the processor 801, when executing the adding of the texture material ball of the target model to the corresponding location of the model applique, is specifically configured to: responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball on the model applique according to the material identifier in the target texture material ball;
and inputting the texture data and/or transparency data in the target texture material ball into the target material interface.
In one possible embodiment, the processor 801, when executing the adding of the texture material ball of the target model to the corresponding location of the model applique, is specifically configured to: obtaining offset mapping information corresponding to the model applique, wherein the offset mapping information comprises: offset of each vertex in the model applique;
responding to the selection operation of the target texture material ball, and determining a target material interface of the target texture material ball on the model applique according to the material identifier in the target texture material ball;
and inputting the texture data and/or transparency data in the target texture material ball, together with the offset of each vertex of the model applique recorded in the offset map information, into the target material interface.
In one possible embodiment, the processor 801 is further configured to: and responding to the selection operation of the new target texture material ball, and updating the currently added texture material ball at the corresponding position of the model applique according to the new target texture material ball.
In one possible embodiment, the processor 801, when performing the adding of the model decal to the surface of the base model, is specifically configured to: and scaling the model applique according to the actual size proportion of the target model and the base model, and adding the scaled model applique on the surface of the base model.
By the above method, when the terminal device provides the graphical user interface and displays the base model, the server makes a model applique independent of the base model and adds it to the surface of the base model, so that when texture data of the target model is added to that surface, the position where the texture data is to be added can be located quickly. Because the model applique is given a material identifier different from that of the base model, texture data of the target model added through the model applique does not affect the original texture of the base model. In addition, compared with rendering texture via a traditional texture map, the texture data does not need to be compressed, so the rendered texture has higher precision; and the texture can be replaced conveniently without redrawing a new texture map. In the embodiments of the present application, the computer program, when executed by a processor, may also execute other machine-readable instructions to perform other methods as described in the embodiments; for the specific method steps and principles, reference is made to the description of the embodiments, which is not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative: the division into units is only one logical division, and other divisions are possible in practice; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
An integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Claims (13)

1. A method of model mapping, comprising:
creating a model applique of a base model, and binding a material identifier to the model applique, wherein the base model comprises a hard surface model and/or a soft surface model;
adding the model applique to a surface of the base model, wherein the material identifier of the model applique is different from a material identifier of the base model;
creating a texture material ball corresponding to a target model, wherein the texture material ball is at least used for representing texture data of the target model; and
adding the texture material ball of the target model to a corresponding position of the model applique, so as to add the texture data of the target model to the surface of the base model.
2. The method of claim 1, wherein binding a material identifier to the model applique comprises:
creating, according to a material creation instruction, a material interface corresponding to the model applique; and
binding a corresponding material identifier to the material interface.
3. The method of claim 1, wherein adding the model applique to a surface of the base model comprises:
performing texture mapping unwrapping on the model applique, so as to add the unwrapped model applique to the surface of the base model.
4. The method of claim 1, wherein adding the model applique to a surface of the base model comprises:
adding the model applique to the surface of the base model according to a target display style of the target model and a material type of the target model.
5. The method of claim 4, wherein the adding the model applique to the surface of the base model according to the target display style of the target model and the material type of the target model comprises:
determining a spatial positional relationship between the model applique and the base model according to the material type of the target model;
determining a target adding position of the model applique on the surface of the base model according to the target display style of the target model and the spatial positional relationship; and
adding the model applique to the target adding position on the surface of the base model.
6. The method of claim 1, wherein the creating a texture material sphere corresponding to the target model comprises:
obtaining, according to an input texture material map, map information of the texture material map, wherein the map information comprises one or more of the following: a material identifier, texture data, and transparency data; and
creating the texture material ball corresponding to the target model according to the map information of each texture material map.
7. The method of claim 1, wherein the adding a textured material ball of the target model to a corresponding location of the model decal comprises:
in response to a selection operation on a target texture material ball, determining a target material interface of the model applique corresponding to the target texture material ball according to the material identifier in the target texture material ball; and
inputting texture data and/or transparency data in the target texture material ball into the target material interface.
8. The method of claim 1, wherein the adding a textured material ball of the target model to a corresponding location of the model decal comprises:
obtaining offset map information corresponding to the model applique, wherein the offset map information comprises an offset of each vertex in the model applique;
in response to a selection operation on a target texture material ball, determining a target material interface of the model applique corresponding to the target texture material ball according to the material identifier in the target texture material ball; and
inputting the texture data and/or transparency data in the target texture material ball, and the offset of each vertex in the model applique from the offset map information, into the target material interface.
9. The method of claim 1, further comprising:
in response to a selection operation on a new target texture material ball, updating the texture material ball currently added at the corresponding position of the model applique according to the new target texture material ball.
10. The method of claim 1, wherein the adding the model applique to the surface of the base model comprises:
scaling the model applique according to an actual size ratio between the target model and the base model, and adding the scaled model applique to the surface of the base model.
11. A model mapping apparatus, comprising: a creation module and an addition module, wherein:
the creation module is configured to create a model applique of a base model and bind a material identifier to the model applique, wherein the base model comprises a hard surface model and/or a soft surface model;
the addition module is configured to add the model applique to a surface of the base model, wherein the material identifier of the model applique is different from a material identifier of the base model;
the creation module is further configured to create a texture material ball corresponding to a target model, wherein the texture material ball is at least used for representing texture data of the target model; and
the addition module is further configured to add the texture material ball of the target model to a corresponding position of the model applique, so as to add the texture data of the target model to the surface of the base model.
12. An electronic device, comprising: a processor, a storage medium, and a bus, wherein the storage medium stores program instructions executable by the processor; when the electronic device runs, the processor and the storage medium communicate via the bus, and the processor executes the program instructions to perform the model mapping method according to any one of claims 1 to 10.
13. A computer-readable storage medium, wherein the storage medium stores a computer program which, when executed by a processor, performs the model mapping method according to any one of claims 1 to 10.
CN202211124225.9A 2022-09-15 2022-09-15 Model mapping method and device, electronic equipment and storage medium Pending CN115512030A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211124225.9A CN115512030A (en) 2022-09-15 2022-09-15 Model mapping method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211124225.9A CN115512030A (en) 2022-09-15 2022-09-15 Model mapping method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115512030A 2022-12-23

Family

ID=84503707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211124225.9A Pending CN115512030A (en) 2022-09-15 2022-09-15 Model mapping method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115512030A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117115805A (en) * 2023-10-25 2023-11-24 园测信息科技股份有限公司 Random irregular object identification method and device under Unreal Engine platform
CN117115805B (en) * 2023-10-25 2024-02-09 园测信息科技股份有限公司 Random irregular object identification method and device under Unreal Engine platform

Similar Documents

Publication Publication Date Title
Arora et al. Symbiosissketch: Combining 2d & 3d sketching for designing detailed 3d objects in situ
CN107636585B (en) Generation of three-dimensional fashion objects by drawing inside a virtual reality environment
CN104183005B (en) Graphics processing unit and rendering intent based on segment
US10108750B2 (en) Method for designing a geometrical three-dimensional modeled object
US20110267347A1 (en) Systems and methods for primitive intersection in ray tracing
CN111324837B (en) Three-dimensional chart visualization method and device based on GIS system at web front end
CN101866379B (en) Method, program and product edition system for visualizing objects displayed on a computer screen
CN115512030A (en) Model mapping method and device, electronic equipment and storage medium
Pietriga et al. Representation-independent in-place magnification with sigma lenses
CN107705363A (en) A kind of road Visualization Modeling method and device
CN108090952A (en) 3 d modeling of building method and apparatus
KR102393765B1 (en) Method to provide design information
WO2023091086A2 (en) Apparatus and method for simulating a three-dimensional object
CN106204408A (en) Drawing processes circuit and preposition depth information process method thereof
CN105046738A (en) Clothes dynamic three-dimension making method and making apparatus
US20190311424A1 (en) Product visualization system and method for using two-dimensional images to interactively display photorealistic representations of three-dimensional objects based on smart tagging
CN113870406A (en) Free-form model making and material pasting method and readable storage medium
CN114529674A (en) Three-dimensional model texture mapping method, device and medium based on two-dimensional slice model
CN108171784B (en) Rendering method and terminal
Chiricota Three‐dimensional garment modelling using attribute mapping
JP5481751B2 (en) Concealment processing program, visualization processing method and apparatus
CN113196343A (en) Three-dimensional modeling method for clothes
CN116543093B (en) Flexible object rendering method, device, computer equipment and storage medium
JPS62162173A (en) Graphic display method
TWI459323B (en) Virtual Reality Object Catch Making Method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination