CN114612641A - Material migration method and device and data processing method - Google Patents


Info

Publication number
CN114612641A
CN114612641A (application CN202011418614.3A)
Authority
CN
China
Prior art keywords
dimensional object
image
mapping
migration
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011418614.3A
Other languages
Chinese (zh)
Inventor
潘健雄 (Pan Jianxiong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202011418614.3A priority Critical patent/CN114612641A/en
Publication of CN114612641A publication Critical patent/CN114612641A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/16Cloth

Abstract

The application discloses a material migration method and device and a data processing method. The method includes: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object; in response to the mapping instruction, migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object. The invention addresses the following prior-art technical problem: when designing the three-dimensional appearance of an article, a deep learning network can generate a corresponding three-dimensional model map from the material of a two-dimensional image and render that map back onto the article's three-dimensional model, but because such material migration is limited to migration between articles of the same style, the material migration process is inefficient.

Description

Material migration method and device and data processing method
Technical Field
The invention relates to the field of image processing, in particular to a material migration method and device and a data processing method.
Background
In the field of garment design, designers often devote significant effort to garment texture design. A common approach is to collect garment-related scene pictures, street photographs, commodity pictures, and so on, and extract popular texture elements from them for the design. Traditional garment texture design usually verifies concepts on two-dimensional images and requires the original texture material. In most cases, however, obtaining the original texture material is difficult and labor-intensive for the designer. Therefore, if a designer wants to verify the texture effect of a popular garment, the designer must manually process material from sources such as the corresponding scene or street photographs to obtain the garment texture and verify the design scheme, which increases both the difficulty and the cost of design.
At present, some schemes can automatically generate a corresponding three-dimensional model UV map from the front and back textures in a garment image through a deep learning network, and then re-render the UV map onto the three-dimensional garment model. The drawback of such schemes is that texture can only be transferred between garments of the same style: for example, to transfer a texture pattern from a T-shirt onto a three-dimensional model, the target must itself be a three-dimensional T-shirt model, and the T-shirt texture cannot be transferred onto the three-dimensional model of a one-piece dress. The range of use is therefore very limited.
For the prior-art problem that, when designing the three-dimensional appearance of an article, a deep learning network can generate a corresponding three-dimensional model map from the material of a two-dimensional image and render that map back onto the article's three-dimensional model, yet the material migration process is inefficient because material migration is limited to migration between articles of the same style, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a material migration method and device and a data processing method, which at least solve the prior-art technical problem that, when designing the three-dimensional appearance of an article, a deep learning network can generate a corresponding three-dimensional model map from the material of a two-dimensional image and render that map back onto the article's three-dimensional model, yet the material migration process is inefficient because material migration is limited to migration between articles of the same style.
According to an aspect of the embodiments of the present invention, there is provided a material migration method, including: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; in response to the mapping instruction, migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a material migration method, including: displaying a first image and a second image in a display interface, wherein the object shown in the first image is a first three-dimensional object, the object shown in the second image is a second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; selecting the first image, and extracting the material of the first three-dimensional object from the first image to obtain a first material map; receiving a mapping instruction for mapping the second three-dimensional object, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object; in response to the mapping instruction, migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and displaying a second image in which the second material map is rendered onto the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a material migration apparatus, including: an extraction module configured to extract a first material map of a first three-dimensional object from a first image; a triggering module configured to receive a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object, and the styles of the first three-dimensional object and the second three-dimensional object are different; a response module configured to respond to the mapping instruction by migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and a rendering module configured to render the second material map onto the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a material migration apparatus, including: a first display module configured to display a first image and a second image in a display interface, wherein the object shown in the first image is a first three-dimensional object, the object shown in the second image is a second three-dimensional object, and the styles of the first three-dimensional object and the second three-dimensional object are different; an extraction module configured to select the first image, extract the material of the first three-dimensional object from the first image to obtain a first material map, and receive a mapping instruction for mapping the second three-dimensional object, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object; a response module configured to respond to the mapping instruction by migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and a second display module configured to display a second image in which the second material map is rendered onto the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein, when the program runs, a device in which the storage medium is located is controlled to perform the above material migration method.
According to another aspect of the embodiments of the present invention, there is also provided a processor configured to run a program, wherein the program, when running, performs the above material migration method.
According to another aspect of the embodiments of the present invention, there is also provided a material migration system, including: a processor; and a memory coupled to the processor and configured to provide the processor with instructions for the following processing steps: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; in response to the mapping instruction, migrating the first material map with a material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a data processing method, including: acquiring a first image; extracting a first material map of a first three-dimensional object from the first image; rendering the first material map onto a second three-dimensional object based on a geometric transformation relationship between the first three-dimensional object and the second three-dimensional object.
According to another aspect of the embodiments of the present invention, there is also provided a data processing method, including: displaying the first image and the second image; receiving a material extracting instruction, and extracting a first material map of a first three-dimensional object from the first image according to the material extracting instruction; receiving a material transfer instruction, and rendering the first material map to a second three-dimensional object in the second image according to the material transfer instruction based on a geometric transformation relation between the first three-dimensional object and the second three-dimensional object.
In the embodiments of the present application, a first material map of a first three-dimensional object is extracted from a first image; a mapping instruction for mapping a second three-dimensional object is received, the mapping instruction indicating that the material in the first material map needs to replace the material on the surface of the second three-dimensional object, where the style of the first three-dimensional object is different from that of the second three-dimensional object; in response to the mapping instruction, the first material map is migrated by a material migration module to obtain a second material map of the second three-dimensional object, where the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and the second material map is rendered onto the second three-dimensional object. In this scheme, the texture of an object is obtained from an image and the material map of a three-dimensional model of the same style is recovered; the material map is then fed into the material migration module to obtain material maps for objects of other styles, and rendering based on the newly generated material map completes the migration of material between objects of different styles. Texture transfer is thereby achieved between objects of different styles, which solves the prior-art technical problem that, in designing the three-dimensional appearance of an article, a deep learning network can generate a corresponding three-dimensional model map from the material of a two-dimensional image and render that map back onto the article's three-dimensional model, yet the material migration process is inefficient because material migration is limited to migration between articles of the same style.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a hardware configuration block diagram of a computing device (or mobile device) for implementing the material migration method;
fig. 2 is a flowchart of a material migration method according to embodiment 1 of the present application;
FIG. 3 is a schematic diagram of a process of extracting a material map according to embodiment 1 of the present application;
FIG. 4 is a schematic illustration of a material migration according to embodiment 1 of the present application;
fig. 5 is a flowchart of another material migration method according to embodiment 2 of the present application;
fig. 6 is a schematic view of a material transfer apparatus according to embodiment 3 of the present application;
fig. 7 is a schematic view of a material transfer apparatus according to embodiment 4 of the present application;
FIG. 8 is a block diagram of a computing device, according to an embodiment of the invention;
fig. 9 is a flowchart of a data processing method according to embodiment 7 of the present application;
FIG. 10 is a flowchart of another data processing method according to embodiment 8 of the present application;
fig. 11 is a schematic diagram of a data processing apparatus according to embodiment 9 of the present application;
fig. 12 is a schematic diagram of a data processing apparatus according to embodiment 10 of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, some terms appearing in the description of the embodiments of the present application are explained as follows:
Three-dimensional garment (3D garment): a three-dimensional garment model synthesized by schemes such as three-dimensional modeling, physical simulation, or manual modeling, carrying mesh and texture information.
UV map (UV Map): UV is short for texture-map coordinates, which define the position of every point on the texture picture. These points are associated with the three-dimensional model to determine where the surface texture map lies. A garment model is like flat cloth that is cut and sewn into a three-dimensional garment; UV maps each point on the image exactly onto the surface of the model object, and the gaps between points are filled by the software through smooth image interpolation. This is the so-called UV map.
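The UV lookup described above can be illustrated with a short sketch. This is not the patent's implementation; the function name and the bilinear interpolation choice are illustrative assumptions about how a renderer samples a texture at UV coordinates in [0, 1], with the in-between positions smoothly interpolated as the definition notes.

```python
import numpy as np

def sample_texture(texture, uv):
    """Bilinearly sample an (H, W, C) texture image at UV coordinates in [0, 1].

    V is flipped so that V = 0 maps to the bottom row, a common UV convention.
    """
    h, w = texture.shape[:2]
    # Map UV to continuous pixel coordinates.
    x = uv[..., 0] * (w - 1)
    y = (1.0 - uv[..., 1]) * (h - 1)
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.clip(x0 + 1, 0, w - 1), np.clip(y0 + 1, 0, h - 1)
    fx, fy = (x - x0)[..., None], (y - y0)[..., None]
    # Smooth interpolation between the four surrounding texels.
    top = texture[y0, x0] * (1 - fx) + texture[y0, x1] * fx
    bot = texture[y1, x0] * (1 - fx) + texture[y1, x1] * fx
    return top * (1 - fy) + bot * fy
```

In a full renderer, each mesh vertex would carry a UV pair and the rasterizer would call such a sampler per fragment.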
Texture transfer (Texture Transfer): migrating the texture of an object in a source image onto a target object, which may be a three-dimensional model or a two-dimensional image. In this scheme it refers specifically to transferring the garment texture in a single image onto a target three-dimensional garment model.
Texture completion (Texture Inpainting): completing a missing part of a texture image from the remaining information so that the completed region is indistinguishable to the human eye. In this scheme the texture refers specifically to the UV map.
Intrinsic decomposition (Intrinsic Decomposition): decomposing an image into two parts, a reflectance map and an illumination map. The reflectance map is the portion of the image that remains unchanged under varying lighting conditions, while the illumination map reflects the lighting of the original image. In this scheme it refers specifically to decomposing the garment illumination and wrinkle information in a single image.
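The reflectance/illumination split above follows the standard multiplicative image model. The toy example below illustrates that model only; real intrinsic-decomposition networks must estimate both factors from the image alone, and the variable names here are illustrative, not the patent's.

```python
import numpy as np

# Toy model: an observed image is the pixel-wise product of a reflectance map
# (material color, invariant to lighting) and a shading/illumination map.
rng = np.random.default_rng(0)
reflectance = rng.uniform(0.2, 1.0, size=(8, 8, 3))  # material albedo
shading = rng.uniform(0.5, 1.0, size=(8, 8, 1))      # grayscale illumination
image = reflectance * shading                        # observed image I = R * S

# With one factor known, the other follows by element-wise division.
recovered_shading = image / reflectance
```

For garment texture recovery, removing the estimated shading factor leaves a material map free of the wrinkle and lighting patterns baked into the photo.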
Texture super-resolution (Texture Super-Resolution): a technique for increasing the resolution of a texture image.
Example 1
There is also provided, in accordance with an embodiment of the present invention, an embodiment of a method for migrating material, it being noted that the steps illustrated in the flowchart of the accompanying drawings may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order different than that described herein.
The method provided in the first embodiment of the present application may be executed in a mobile terminal, a computing device, or a similar computing device. Fig. 1 shows a hardware configuration block diagram of a computing device (or mobile device) for implementing the material migration method. As shown in fig. 1, computing device 10 (or mobile device 10) may include one or more processors 102 (shown as 102a, 102b, ..., 102n; processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)), a memory 104 for storing data, and a transmission module 106 for communication functions. In addition, it may further include: a display, an input/output interface (I/O interface), a Universal Serial Bus (USB) port (which may be included as one of the ports of the I/O interface), a network interface, a power source, and/or a camera. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and is not intended to limit the structure of the electronic device. For example, computing device 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
It should be noted that the one or more processors 102 and/or other data processing circuitry described above may be referred to generally herein as "data processing circuitry". The data processing circuitry may be embodied in whole or in part in software, hardware, firmware, or any combination thereof. Further, the data processing circuitry may be a single, stand-alone processing module, or incorporated in whole or in part into any of the other elements in the computing device 10 (or mobile device). As referred to in the embodiments of the application, the data processing circuit acts as a processor control (e.g. selection of a variable resistance termination path connected to the interface).
The memory 104 may be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the material migration method in the embodiments of the present invention; the processor 102 executes various functional applications and data processing by running the software programs and modules stored in the memory 104, thereby implementing the material migration method described above. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to computing device 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission module 106 is used to receive or transmit data via a network. Specific examples of such networks may include wireless networks provided by a communications provider of computing device 10. In one example, the transmission module 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission module 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
The display may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the computing device 10 (or mobile device).
It should be noted here that in some alternative embodiments, the computer device (or mobile device) shown in fig. 1 may include hardware elements (including circuitry), software elements (including computer code stored on a computer-readable medium), or a combination of both. It should be noted that fig. 1 is only one specific example and is intended to illustrate the types of components that may be present in the computer device (or mobile device) described above.
Under the operating environment, the application provides a material migration method as shown in fig. 2. Fig. 2 is a flowchart of a material migration method according to embodiment 1 of the present application.
In step S21, a first material map of the first three-dimensional object is extracted from the first image.
Specifically, the first material map may be a texture map on the first three-dimensional object. The first three-dimensional object may be an object from which material is to be extracted. The first image may be an image including a first three-dimensional object.
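Step S21 can be sketched in miniature. The patent does not specify an extraction algorithm here; the mask-and-crop approach and the function name below are illustrative assumptions standing in for whatever segmentation-plus-recovery pipeline actually produces the first material map.

```python
import numpy as np

def extract_material_map(image, mask):
    """Crop the masked object region from an image as a rough material map.

    `image` is an (H, W, C) array; `mask` is a boolean (H, W) array marking
    the pixels belonging to the first three-dimensional object.
    """
    ys, xs = np.nonzero(mask)
    y0, y1 = ys.min(), ys.max() + 1
    x0, x1 = xs.min(), xs.max() + 1
    crop = image[y0:y1, x0:x1].copy()
    crop[~mask[y0:y1, x0:x1]] = 0  # zero out background pixels in the crop
    return crop
```

A production pipeline would follow this with the texture completion and intrinsic decomposition steps defined earlier, so the map is free of occlusions and baked-in lighting.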
The first image may be a two-dimensional image containing a first three-dimensional object. In an alternative embodiment, taking the clothing design as an example, the first image may be a show scene image or a street photo, and after the designer obtains the design inspiration from the show scene or the street photo, the show scene image or the street photo may be used as the first image, and the material map of the clothing is extracted from the show scene image or the street photo.
The first image may be a three-dimensional image including the first three-dimensional object. In an alternative embodiment, taking a three-dimensional dress-up game as an example, the first image may be a three-dimensional model of a garment existing in the game; when a game player wants to change outfits along with the garment's material, the material map of the garment may be extracted from the first image.
It should be noted that, at present, a designer who wants to design based on a given material map must obtain the source file of that map. With the scheme of the present application, only a first image containing the first three-dimensional object is needed, such as a scene image or a street snapshot, which greatly reduces the difficulty of obtaining material and improves the efficiency of finding it.
In step S23, a mapping instruction for mapping the second three-dimensional object is received, wherein the mapping instruction indicates that the material in the first material map needs to replace the material on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object.
Specifically, the mapping instruction may trigger the device processor to perform the step of transferring the first material map on the first three-dimensional object onto the second three-dimensional object. For a designer, the mapping instruction may be issued by extracting the first material map from the first image and dragging it onto the model of the second three-dimensional object displayed on the interface, whereupon the device processor receives the mapping instruction. For a player of a dress-up game, after selecting the first material map of the first image and the second three-dimensional object, selecting the dress-up control sends the mapping instruction to the device processor.
The first three-dimensional object and the second three-dimensional object are different in style. Taking the garment as an example, the first three-dimensional object can be a short sleeve, and the second three-dimensional object can be a long sleeve, a skirt or any other style of garment; taking a vase as an example, the first three-dimensional object can be a cylindrical vase, and the second three-dimensional object can be various vases different from the first three-dimensional object, such as a cone, a sphere and the like.
In step S25, in response to the mapping instruction, the first material map is migrated by the material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relationships among three-dimensional objects of different styles.
In this scheme, a material migration model is obtained by constructing the semantic migration relationship between the first three-dimensional object and the second three-dimensional object; migrating the first material map through this model then yields a second material map suited to the second three-dimensional object.
In an alternative embodiment, the first three-dimensional object and the second three-dimensional object may be subjected to structural analysis to obtain a rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, and then the material migration module may be obtained based on the rigid transformation matrix between the first three-dimensional object and the second three-dimensional object.
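The rigid-transformation idea in the embodiment above can be shown on UV coordinates. This is a hedged sketch: the patent does not give the transform's form, so the 2D rotation-plus-translation below, and the function name `migrate_uv`, are illustrative assumptions about how one atlas's coordinates might be remapped into another's.

```python
import numpy as np

def migrate_uv(uv_points, rotation, translation):
    """Apply a rigid transform (rotation + translation) to 2D UV coordinates,
    remapping texture coordinates of one model's atlas toward another's."""
    return uv_points @ rotation.T + translation

# Example rigid transform: rotate UV space by 90 degrees, then shift in U.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = np.array([1.0, 0.0])

src = np.array([[0.0, 0.0], [0.5, 0.0]])  # UV points on the source atlas
dst = migrate_uv(src, R, t)
```

In practice such a matrix would come from structural analysis of corresponding parts (e.g. sleeve to sleeve), with one transform per semantically matched component.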
In step S27, the second material map is rendered onto the second three-dimensional object.
In the above step, rendering the second material map onto the second three-dimensional object yields a related design with the same material map as the first three-dimensional object but a different style. Still in the garment-design scenario, rendering the second material map onto the second three-dimensional object produces new product designs in different styles. For example, a designer may take inspiration from a runway show, extract a first material map from a short-sleeve garment in the show scene, migrate it through the material migration module to obtain a second material map, and render that map onto a long-sleeve garment, obtaining a long-sleeve design in the same series as the short sleeve. Based on this scheme, the user can semi-automatically specify semantic migration relationships between object components of different styles, so that an input material map corresponding to one object is automatically migrated to the material map of another three-dimensional object model.
In the above steps, the obtained second material map is rendered with a three-dimensional rendering engine to produce the final texture migration result. Besides static model display, technologies such as physical simulation and cloth animation can be combined to render a digital garment fashion-show video, so that a designer can experience the effect of the texture design from every angle. The scheme can therefore be applied to three-dimensional clothing design and virtual try-on scenarios; in particular, in the field of interactive games, a user can transfer real clothing textures to the clothing model of a three-dimensional virtual character, generating a user-defined clothing effect and greatly improving interactivity and interest.
In the above embodiment of the present application, a first material map of a first three-dimensional object is extracted from a first image; a mapping instruction for mapping a second three-dimensional object is received, where the mapping instruction indicates that the material in the first material map needs to be applied to the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; in response to the mapping instruction, the material migration module is used to migrate the first material map to obtain a second material map of the second three-dimensional object, where the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and the second material map is rendered onto the second three-dimensional object. According to the scheme, the texture of an object is obtained from an image, the material map of a three-dimensional model of the same style is recovered, the material map is then sent to the material migration module to obtain the material map of an object of another style, and rendering based on the newly generated material map completes the transfer of the material between objects of different styles. Texture transfer is thus realized between objects of different styles, solving the technical problem in the prior art that, when designing the three-dimensional image of an object, the material of a two-dimensional image can only be used to generate a corresponding three-dimensional model map through a deep learning network and re-rendered onto the three-dimensional model of the same object, so that material transfer is limited to objects of the same style and the transfer process is inefficient.
As an alternative embodiment, semantic migration relationships are used to characterize rigid-body geometric transformations between image regions located on three-dimensional objects of different styles.
Specifically, a rigid body transformation matrix between objects of different styles can be obtained by constructing the semantic migration relationship between them, thereby obtaining the migration model; in the form of material maps, this relationship can be converted into rigid-body geometric transformations between different image regions.
For migrating material maps across styles, two clothing-style material maps with the same texture could be generated by a computer graphics rendering algorithm and fed to an image migration network for training, yielding a material texture migration network model between two clothing models and forming an end-to-end migration scheme. However, because the shape difference between different garments is large and clothing textures are complex and varied, directly training such an image migration network is very difficult and the results are unsatisfactory. The present scheme is instead based on a migration model with user-specified semantics, which achieves a better migration effect.
As an alternative embodiment, the method for extracting the first material map of the first three-dimensional object from the first image comprises the following steps: segmenting the first image, and acquiring an object image and a segmentation mask of a first three-dimensional object in the first image; processing the segmentation mask by adopting a material migration network model to generate a material pixel coordinate graph on the first three-dimensional object; the method comprises the steps of collecting a material of a first three-dimensional object based on a material pixel coordinate graph on the first three-dimensional object, and generating a first material map, wherein the shape of the material pixel coordinate graph is the same as that of the first material map.
Fig. 3 is a schematic diagram of extracting a material map according to embodiment 1 of the present application, and the following describes in detail the extraction of a first material map with reference to fig. 3:
the first image may be segmented using an image segmentation model to obtain an object image of the first three-dimensional object and a segmentation mask from the first image. In an alternative embodiment, a UNet may be trained as the segmentation model using, as training samples, a large number of garment images segmented from images containing garments together with their segmentation masks. With reference to fig. 3, the garment is segmented from the single garment image I by the image segmentation model, obtaining the segmented garment image I_mask and the segmentation mask Mask_I.
The material migration network model may be a UNet + ResNet-based texture migration network; based on this model, a material pixel coordinate map on the first three-dimensional object may be generated from the segmentation mask. In an alternative embodiment, still referring to FIG. 3, a UNet + ResNet-based texture migration network can be constructed, and Mask_coord is fed into the network to generate the corresponding UV pixel coordinate map G_UV_coord.
Since the material pixel coordinate map has the same shape as the first material map, the material of the first three-dimensional object can be sampled from the first image based on the material pixel coordinate map. Still in the example of FIG. 3, G_UV_coord has the shape of the UV map corresponding to the three-dimensional garment model, where each pixel stores the normalized coordinates of a pixel in I_mask. That is, I_mask can be sampled through G_UV_coord, thereby obtaining the corresponding UV map G_UV_pix.
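The sampling step above can be sketched as follows — a hedged, simplified illustration (nearest-neighbor instead of bilinear sampling, hypothetical function name) of reading each UV-map pixel from the garment image at the normalized coordinates stored in G_UV_coord:

```python
import numpy as np

def sample_uv_map(image, coord_map):
    """Build a UV map by sampling `image` at the normalized coordinates
    stored in `coord_map` (H_uv x W_uv x 2, values in [0, 1])."""
    h, w = image.shape[:2]
    # Convert normalized coordinates to integer pixel indices
    # (nearest-neighbor here; bilinear sampling could be used instead).
    xs = np.clip((coord_map[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    ys = np.clip((coord_map[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
    return image[ys, xs]

# Toy example: a 2x2 RGB image and a coordinate map that flips it horizontally.
img = np.array([[[1, 0, 0], [0, 1, 0]],
                [[0, 0, 1], [1, 1, 0]]], dtype=float)
coords = np.array([[[1.0, 0.0], [0.0, 0.0]],
                   [[1.0, 1.0], [0.0, 1.0]]])
uv = sample_uv_map(img, coords)
```

The same inverse-lookup idea is what lets a single coordinate map re-arrange image pixels into the fixed layout of the garment's UV atlas.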
As an alternative embodiment, before processing the segmentation mask by using the material migration network model, the method further includes: carrying out normalization processing on the coordinates of each pixel on the first image to obtain the image coordinates of the first image; the image coordinates of the first image are filled into the segmentation mask of the object image.
In an alternative embodiment, after the object image and the segmentation mask are obtained, as shown in FIG. 3, the segmentation mask Mask_I may be filled with coordinate values to obtain Mask_coord. For example, the image coordinates may be normalized before filling the masked region. The coordinate values occupy two channels; even if the third channel is pre-filled, network training is not affected.
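A minimal sketch of this coordinate-filling step, under the assumption (stated in the text) that two channels hold normalized coordinates and the third is a placeholder; the function name is hypothetical:

```python
import numpy as np

def fill_mask_coords(mask):
    """Fill a binary segmentation mask (H x W) with normalized pixel
    coordinates, producing an H x W x 3 Mask_coord-style network input.

    Channels 0/1 hold normalized x/y inside the mask; channel 2 is a
    placeholder (pre-filling it does not affect training, per the text).
    """
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coord = np.zeros((h, w, 3))
    coord[..., 0] = xs / (w - 1)   # normalized x coordinate
    coord[..., 1] = ys / (h - 1)   # normalized y coordinate
    coord[mask == 0] = 0.0         # keep the background empty
    return coord

mask = np.array([[0, 1],
                 [1, 1]])
mc = fill_mask_coords(mask)
```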
As an alternative embodiment, before processing the segmentation mask by using the material migration network model to generate the material pixel coordinate map on the first three-dimensional object, the method further includes constructing the migration network model, which comprises: acquiring training data, where the training data are three-dimensional image samples and material map samples obtained through processing based on a computer graphics rendering algorithm; and training with the training data using a neural network model to generate the material migration model.
Specifically, the training data may be obtained through a computer graphics rendering algorithm. As shown in fig. 3, from the obtained three-dimensional garment model and the corresponding original garment texture material, a photorealistic garment image and corresponding UV maps are generated through an offline rendering algorithm (such as ray tracing), including the ground-truth data Real_UV_coord and Real_UV_pix corresponding to G_UV_coord and G_UV_pix. An L2 loss function (least-squares error) can be used between G_UV_coord and Real_UV_coord, while an L1 loss function (least absolute deviation) and a perceptual loss function can be used between G_UV_pix and Real_UV_pix. Training with this data yields a mature texture migration module.
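The two per-pixel loss terms named above can be written out directly; this is a hedged sketch in plain numpy (the perceptual loss is omitted because it requires a pretrained feature network, and the function names are illustrative):

```python
import numpy as np

def l2_loss(pred, target):
    """Least-squares error, used between G_UV_coord and Real_UV_coord."""
    return np.mean((pred - target) ** 2)

def l1_loss(pred, target):
    """Least absolute deviation, used between G_UV_pix and Real_UV_pix
    (the perceptual term, which needs a pretrained network, is omitted)."""
    return np.mean(np.abs(pred - target))

coord_loss = l2_loss(np.array([0.5, 0.5]), np.array([0.0, 1.0]))
pix_loss = l1_loss(np.array([0.5, 0.5]), np.array([0.0, 1.0]))
```

L2 penalizes large coordinate errors more strongly, which suits the smooth coordinate regression target, while L1 is more robust to outliers in the textured pixel output.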
In the prior art, a scheme may directly fit the material map from a product image, but it suffers from low material resolution and long data generation time. The present scheme instead adopts a computer graphics rendering algorithm: a large amount of clothing training data, including the corresponding UV maps and synthesized product images, is generated from the three-dimensional garment model and the original texture material, and this rendering-based synthesis can quickly generate clearer training data.
As an alternative embodiment, after generating the first material map, the method further comprises: and performing material completion on the first material map.
After the first material map is acquired through the material migration model, the first material map also needs to be completed because the training data itself may have a texture missing problem.
In an alternative embodiment, still referring to FIG. 3, after the UV map G_UV_pix is generated by the texture migration module, a texture completion module can be started to complete G_UV_pix, because the training data may have missing textures. The texture completion module can be implemented with networks such as image inpainting networks; it does not need to be retrained or tuned on the garment training data, and it improves the semantic quality of the texture completion. Finally, the completed UV texture map G_UV_pix_comple is obtained.
According to the scheme, the garment texture UV map on the first three-dimensional object is extracted through a garment segmentation module, a texture migration module, and a texture completion module. The scheme differs from the pix2surf scheme in that the corresponding training data are generated through a computer graphics rendering algorithm, and a texture completion module is added as a final step to complete the missing parts of the UV map, reducing the texture loss caused during data generation and network training and making the final result more complete.
As an optional embodiment, before the material migration module is used to migrate the first material map to obtain the second material map of the second three-dimensional object, the method further includes: performing semantic segmentation on the first three-dimensional object and the second three-dimensional object respectively to obtain a first component set of the first three-dimensional object and a second component set of the second three-dimensional object, where each component set is formed by block bounding boxes of the three-dimensional object; acquiring a rigid bounding matrix between the first component set and the second component set, where the rigid bounding matrix records semantic migration relationships between the block bounding boxes in the first component set and those in the second component set; and generating the material migration module based on the rigid bounding matrix between the first component set and the second component set, where the material migration module records a semantic migration relationship matrix between each component in the first component set and each component in the second component set.
In the above scheme, performing semantic segmentation on the first three-dimensional object and the second three-dimensional object respectively may mean segmenting the two three-dimensional objects into different semantic parts. Taking a garment as an example, this may mean segmenting the garment into different parts, such as the left sleeve, right sleeve, collar, and garment body. The block bounding boxes can be OBBs (Oriented Bounding Boxes) enclosing the different semantic regions, and a rigid bounding matrix between two three-dimensional objects can be established by specifying the correspondence of the block bounding boxes between them.
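As a rough illustration of extracting per-component boxes from a semantic label map — simplified to axis-aligned boxes rather than the oriented bounding boxes the patent uses, with hypothetical names and labels:

```python
import numpy as np

def component_boxes(seg):
    """Axis-aligned box (x_min, y_min, x_max, y_max) per semantic label in a
    label map; a simplified stand-in for oriented bounding boxes."""
    boxes = {}
    for label in np.unique(seg):
        if label == 0:               # 0 is treated as background
            continue
        ys, xs = np.nonzero(seg == label)
        boxes[int(label)] = (int(xs.min()), int(ys.min()),
                             int(xs.max()), int(ys.max()))
    return boxes

# Toy 4x4 semantic segmentation: 1 = left sleeve, 2 = garment body.
seg = np.array([[1, 1, 0, 0],
                [1, 1, 2, 2],
                [0, 0, 2, 2],
                [0, 0, 2, 2]])
boxes = component_boxes(seg)
```

Pairing the box of each source component with the box of the corresponding destination component is what the rigid bounding matrix records.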
Based on the migration model generated in this step, given two three-dimensional garment models of different styles, the semantic migration relationship of the material map can be semi-automatically specified, and the material map G_UV_pix_comple generated in the above embodiment is migrated to obtain the material map Trans_UV_pix_final of a garment of a different style.
Fig. 4 is a schematic diagram of material migration according to embodiment 1 of the present application. Referring to fig. 4, still taking garments and UV maps as an example, two three-dimensional garment models are provided and each is semantically segmented to obtain information on the various garment parts; for example, the sleeves and collar have corresponding model mesh material information. The first three-dimensional object is denoted M_src and the second three-dimensional object M_dst. To migrate the UV map corresponding to M_src onto M_dst, two UV maps UV_src and UV_dst are obtained from M_src and M_dst through a computer graphics rendering and baking algorithm, where different blocks represent different garment semantic segmentation information. For example, the leftmost rectangular area in UV_src represents the right sleeve. Through a simple image segmentation algorithm, the OBBs of the different semantic regions in the images are obtained: OBB_src_left_sleeve, OBB_src_front_body, OBB_src_right_sleeve, ... for UV_src, and OBB_dst_left_sleeve, OBB_dst_front_body, OBB_dst_right_sleeve, ... for UV_dst. The user then specifies the semantic migration relationships between the OBBs of the two models, such as migrating OBB_src_left_sleeve of UV_src to OBB_dst_left_sleeve of UV_dst, so that the semantic migration relationship between different garment components can be converted into a rigid body transformation matrix between two OBBs.
The corresponding rigid transformation matrix can be calculated from the semantic migration relationship between the specified OBBs. For example, to migrate OBB_src_left_sleeve to OBB_dst_left_sleeve, the corresponding translation, rotation, and scaling matrix M_left_sleeve can be calculated. For the scaling part, width-proportional scaling or uniform scaling can be selected according to the semantic information; for example, when a short sleeve is lengthened into a long sleeve, uniform scaling should be adopted. Following this process, the transformation matrices of the specified semantic migration relationships of all garment components are determined as M_trans. According to the transformation matrix M_trans, UV_src can be transformed into UV_src_trans. Likewise, the G_UV_pix_comple generated in the first step can be migrated to Trans_UV_pix through the above transformation scheme.
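The per-component migration described above can be sketched as an inverse-mapped region copy with a choice between per-axis and uniform scaling — a hedged, hypothetical illustration (nearest-neighbor sampling, axis-aligned boxes, no rotation):

```python
import numpy as np

def migrate_region(src_uv, src_box, dst_box, dst_shape, uniform=False):
    """Copy the src_box region of src_uv into dst_box of a new map.

    uniform=True scales both axes by the same factor (e.g. short sleeve
    to long sleeve); otherwise each axis scales independently.
    Boxes are (x_min, y_min, x_max, y_max); nearest-neighbor inverse
    mapping keeps the sketch short.
    """
    sx0, sy0, sx1, sy1 = src_box
    dx0, dy0, dx1, dy1 = dst_box
    out = np.zeros(dst_shape + src_uv.shape[2:])
    sw, sh = sx1 - sx0, sy1 - sy0
    dw, dh = dx1 - dx0, dy1 - dy0
    if uniform:
        scale_x = scale_y = max(dw / sw, dh / sh)
    else:
        scale_x, scale_y = dw / sw, dh / sh
    for y in range(dy0, dy1):
        for x in range(dx0, dx1):
            # Inverse map each destination pixel back into the source box.
            sx = min(int(sx0 + (x - dx0) / scale_x), sx1 - 1)
            sy = min(int(sy0 + (y - dy0) / scale_y), sy1 - 1)
            out[y, x] = src_uv[sy, sx]
    return out

src = np.arange(16.0).reshape(4, 4)
dst = migrate_region(src, (0, 0, 2, 2), (0, 0, 4, 4), (4, 4))
```

Applying one such transform per specified component pair, then compositing the results, yields the migrated UV map Trans_UV_pix described in the text.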
Because the migrated UV map may have missing textures, a texture completion module needs to be added for completion. The network structure of this module can be consistent with the previous texture completion module; only retraining on the garment data is required to obtain the completed migrated UV map Trans_UV_pix_comple.
As an alternative embodiment, before rendering the second material map onto the second three-dimensional object, the method further comprises: removing the light, shadow, and wrinkle information on the second material map.
Because information such as lighting, shadows, and wrinkles exists in the originally input garment image, this information is also retained in the second material map. For the final three-dimensional rendering, however, the light, shadow, and wrinkle information of the second material map has a certain negative influence on the rendering effect; that is, this information should not exist on the second material map.
An intrinsic image decomposition module is added in the scheme to remove the light, shadow, and wrinkle information on the UV map. The scheme can adopt an intrinsic image decomposition module based on a traditional or learning-based approach; with reference to fig. 4, a new UV map Trans_UV_pix_decomp is obtained. Finally, because the resolution of the UV texture generated by the network is generally not high, a super-resolution module can be added for upscaling.
Due to the current limitations of deep learning networks, the generated images have low resolution, while the UV maps required for three-dimensional rendering have high requirements on resolution and definition; otherwise the final result is very blurry. A common learning-based image super-resolution module can be employed here, retrained on the generated garment UV-map training data. Through this step, the final high-definition UV map Trans_UV_pix_final is obtained.
Addressing the shortcoming that texture migration schemes based on two-dimensional images require the original material, the scheme of this embodiment provides an approach for acquiring garment textures from a single scene image, street snapshot, or product image, and removes redundant light, shadow, and wrinkle information from the image by combining an intrinsic image decomposition module. This greatly reduces the restrictions on the user and broadens the design ideas of fashion designers: as long as a corresponding garment image exists, the corresponding texture migration can be carried out. As long as the corresponding three-dimensional garment model and segmented semantic information exist, the user can freely designate the semantic migration relationships among garment components, such as migrating the textures on the sleeves to the collar.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
According to an embodiment of the present invention, there is further provided another embodiment of a method for migrating a material, and fig. 5 is a flowchart of another method for migrating a material according to embodiment 2 of the present application, and with reference to fig. 5, the method includes:
step S51, displaying a first image and a second image in a display interface, where the object displayed in the first image is a first three-dimensional object, the object displayed in the second image is a second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object.
The display interface may be an application interface for transferring materials, for example, when a designer transfers materials for clothing, the display interface may be a clothing design interface, and when a player transfers materials in a change-over game, the display interface may be a game interface.
The first three-dimensional object and the second three-dimensional object are different in style. Taking the garment as an example, the first three-dimensional object can be a short sleeve, and the second three-dimensional object can be a long sleeve, a skirt or any other style of garment; taking a vase as an example, the first three-dimensional object can be a cylindrical vase, and the second three-dimensional object can be various vases different from the first three-dimensional object, such as a cone, a sphere and the like.
The first image may be a two-dimensional image containing a first three-dimensional object. In an alternative embodiment, taking the clothing design as an example, the first image may be a show scene image or a street photo, and after the designer obtains the design inspiration from the show scene or the street photo, the show scene image or the street photo may be used as the first image, and the material map of the clothing is extracted from the show scene image or the street photo.
Alternatively, the first image may be a three-dimensional image containing the first three-dimensional object. In an alternative embodiment, taking a three-dimensional dress-up game as an example, the first image may be a three-dimensional model of a garment existing in the game; when a game player wants to change outfits using the material of that garment, a material map of the garment may be extracted from the first image.
Step S53, the first image is selected, and the material of the first three-dimensional object is extracted from the first image to obtain the first material map.
Specifically, the first material map extracted from the first image may be obtained by segmenting the first image.
And step S55, receiving a mapping instruction for mapping the second three-dimensional object, wherein the mapping instruction is used for indicating that the material in the first material mapping needs to be replaced on the surface of the second three-dimensional object.
Specifically, the mapping instruction may trigger the device processor to perform the step of replacing the first material mapping on the first three-dimensional object onto the second three-dimensional object. For the designer, the mapping instruction may be that after the first material mapping is extracted from the first image, the first material mapping is moved to the model of the second three-dimensional object displayed on the interface by a dragging operation, and the device processor receives the mapping instruction. For a game player of the reloading game, after the first material map and the second three-dimensional object of the first image are selected, the reloading control is selected, and then the map instruction can be sent to the equipment processor.
It should be noted that, at present, in order to perform related design based on the same material map, a designer needs to acquire a source file of the material map, and based on the scheme of the present application, only a first image including a first three-dimensional object is acquired, such as a scene image and a street capture image, so that the difficulty of acquiring the material is greatly reduced, and the efficiency of searching the material is improved.
And step S57, responding to the mapping instruction, and adopting the material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles.
According to the scheme, the material migration module is obtained by constructing the semantic migration relationship between the first three-dimensional object and the second three-dimensional object, and the material migration module is then used to migrate the first material map, so that a second material map suitable for the second three-dimensional object can be obtained.
In an alternative embodiment, the first three-dimensional object and the second three-dimensional object may be subjected to structural analysis to obtain a rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, and then the material migration module may be obtained based on the rigid transformation matrix between the first three-dimensional object and the second three-dimensional object.
Step S59, showing a second image in which the second material map is rendered onto the second three-dimensional object.
In the above step, the second material map is rendered onto the second three-dimensional object, producing a similar design that shares the material map of the first three-dimensional object but has a different style. Still in the clothing design scenario, new product designs of different styles can be obtained by rendering the second material map onto the second three-dimensional object. For example, a designer can draw inspiration from a fashion show, extract a first material map from a short-sleeve garment in the show, migrate it through the material migration module to obtain a second material map, and render the second material map onto a long-sleeve garment, obtaining a long-sleeve design in the same series as the short sleeve. Based on this scheme, the user can semi-automatically specify the semantic migration relationship of object components between different styles, so that the material map extracted from the input object image is automatically migrated to the material map of another three-dimensional object model.
In the above steps, the obtained second material map is rendered with a three-dimensional rendering engine to produce the final texture migration result. Besides static model display, technologies such as physical simulation and cloth animation can be combined to render a digital garment fashion-show video, so that a designer can experience the effect of the texture design from every angle. The scheme can therefore be applied to three-dimensional clothing design and virtual try-on scenarios; in particular, in the field of interactive games, a user can transfer real clothing textures to the clothing model of a three-dimensional virtual character, generating a user-defined clothing effect and greatly improving interactivity and interest.
In the above embodiment of the present application, a first image and a second image are displayed in a display interface, where the object displayed in the first image is a first three-dimensional object, the object displayed in the second image is a second three-dimensional object, and the styles of the first three-dimensional object and the second three-dimensional object are different; the first image is selected, and the material of the first three-dimensional object is extracted from the first image to obtain a first material map; a mapping instruction for mapping the second three-dimensional object is received, where the mapping instruction indicates that the material in the first material map needs to be applied to the surface of the second three-dimensional object; in response to the mapping instruction, the material migration module is used to migrate the first material map to obtain a second material map of the second three-dimensional object, where the material migration module constructs semantic migration relationships among three-dimensional objects of different styles; and a second image in which the second material map is rendered onto the second three-dimensional object is shown.
According to the scheme, the texture of an object is obtained from an image, the material map of a three-dimensional model of the same style is recovered, the material map is then sent to the material migration module to obtain the material map of an object of another style, and rendering based on the newly generated material map completes the transfer of the material between objects of different styles. Texture transfer is thus realized between objects of different styles, solving the technical problem in the prior art that, when designing the three-dimensional image of an object, the material of a two-dimensional image can only be used to generate a corresponding three-dimensional model map through a deep learning network and re-rendered onto the three-dimensional model of the same object, so that material transfer is limited to objects of the same style and the transfer process is inefficient.
Example 3
According to an embodiment of the present application, there is also provided a material migration apparatus for implementing the material migration method in embodiment 1, and fig. 6 is a schematic diagram of a material migration apparatus according to embodiment 3 of the present application, as shown in fig. 6, the apparatus 600 includes:
an extracting module 602, configured to extract a first material map of a first three-dimensional object from a first image;
the triggering module 604 is configured to receive a mapping instruction for mapping a second three-dimensional object, where the mapping instruction is used to indicate that a material in a first material mapping needs to be replaced on a surface of the second three-dimensional object, and styles of the first three-dimensional object and the second three-dimensional object are different;
a response module 606, configured to respond to the map instruction, migrate the first material map by using the material migration module to obtain a second material map of the second three-dimensional object, where the material migration module constructs semantic migration relationships between three-dimensional objects of different styles;
and a rendering module 608, configured to render the second material map onto the second three-dimensional object.
It should be noted here that the above-mentioned extracting module 602, triggering module 604, responding module 606 and rendering module 608 correspond to steps S21 to S27 in embodiment 1, and the four modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in the above-mentioned embodiment one. It should be noted that the modules described above as part of the apparatus may be run in the computing device 10 provided in the first embodiment.
As an alternative embodiment, semantic migration relationships are used to characterize rigid-body geometric transformations between image regions located on three-dimensional objects of different styles.
As an alternative embodiment, the extraction module comprises: a segmentation submodule, used for segmenting the first image to acquire an object image and a segmentation mask of the first three-dimensional object in the first image; a generation submodule, used for processing the segmentation mask with the material migration network model to generate a material pixel coordinate graph on the first three-dimensional object; and an acquisition submodule, used for collecting the material of the first three-dimensional object based on the material pixel coordinate graph and generating the first material map, wherein the shape of the material pixel coordinate graph is the same as that of the first material map.
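The extraction pipeline described by these submodules (segment, predict a material pixel coordinate graph, then sample the source image) can be sketched as follows. This is a minimal illustration with hypothetical helper names; the embodiments do not prescribe a concrete implementation, and the coordinate graph is assumed here to store, for each texel of the material map, normalized (x, y) image coordinates of the pixel to sample.

```python
# Sketch of the acquisition submodule (hypothetical names). Assumes the
# material migration network has already produced, for each texel of the
# material map, normalized image coordinates (x, y) in [0, 1] telling us
# which source-image pixel holds that texel's color.

def sample_material_map(image, coord_map):
    """image: H x W list of RGB tuples; coord_map: h x w list of (x, y).
    Returns an h x w material map with the same shape as coord_map,
    as the embodiment requires."""
    H, W = len(image), len(image[0])
    material = []
    for row in coord_map:
        out_row = []
        for (x, y) in row:
            # Map normalized coordinates to the nearest source pixel.
            px = min(int(x * (W - 1) + 0.5), W - 1)
            py = min(int(y * (H - 1) + 0.5), H - 1)
            out_row.append(image[py][px])
        material.append(out_row)
    return material

# Tiny 2x2 image: top-left red, bottom-right yellow.
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 0)]]
coords = [[(0.0, 0.0), (1.0, 1.0)]]      # a 1x2 coordinate graph
print(sample_material_map(img, coords))  # [[(255, 0, 0), (255, 255, 0)]]
```

Note that the output shares the shape of the coordinate graph, not of the source image, matching the shape constraint stated above.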
As an alternative embodiment, the apparatus further comprises: the normalization module is used for normalizing the coordinates of each pixel on the first image before the segmentation mask is processed by adopting the material migration network model to obtain the image coordinates of the first image; and the filling module is used for filling the image coordinates of the first image into the segmentation mask of the object image.
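The normalization and filling modules above can be sketched as follows, under the assumption (not fixed by the text) that each pixel coordinate is divided by the image width and height and then retained only inside the masked region:

```python
# Sketch of the normalization and filling modules (assumed behavior:
# coordinates are scaled to [0, 1], then written into the mask region).

def normalize_coords(width, height):
    """Return an H x W grid of (x, y) image coordinates in [0, 1]."""
    return [[(x / (width - 1), y / (height - 1)) for x in range(width)]
            for y in range(height)]

def fill_mask(mask, coords):
    """Keep coordinates only where the segmentation mask is 1."""
    return [[coords[y][x] if mask[y][x] else None
             for x in range(len(mask[0]))]
            for y in range(len(mask))]

coords = normalize_coords(3, 3)
mask = [[0, 1, 0],
        [1, 1, 1],
        [0, 1, 0]]
filled = fill_mask(mask, coords)
print(filled[1][2])  # (1.0, 0.5): right-middle pixel, inside the mask
```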
As an alternative embodiment, the apparatus further comprises: a construction module, used for constructing the migration network model before the material migration network model is used to process the segmentation mask and generate the material pixel coordinate graph on the first three-dimensional object; a first acquisition module, used for acquiring training data, wherein the training data are three-dimensional image samples and material map samples obtained through processing based on a computer graphics rendering algorithm; and a first generation module, used for training on the training data with a neural network model to generate the material migration model.
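A toy training-loop skeleton for the construction and first generation modules might look like the following. The renderer and the network are stand-ins (the embodiment names neither), reduced to a single learnable weight so that only the data flow is visible: rendered sample in, loss against the ground-truth material map sample, gradient update.

```python
# Toy skeleton of training the material migration model. Everything
# here is a stand-in: the "renderer" scales a fixed target, and the
# "network" is one weight w, fitted by gradient descent on MSE.

def render_pair(scale):
    """Stand-in for the computer-graphics rendering algorithm: returns
    a (three-dimensional image sample, material map sample) pair."""
    target = [1.0, 2.0, 3.0]              # ground-truth material map
    image = [v * scale for v in target]   # "rendered" image sample
    return image, target

def train(steps=200, lr=0.01):
    """Fit pred = w * pixel so the model learns to invert the renderer."""
    w = 0.0
    for _ in range(steps):
        image, target = render_pair(scale=2.0)
        pred = [w * v for v in image]
        # Gradient of mean squared error with respect to w.
        grad = sum(2 * (p - t) * v for p, t, v in zip(pred, target, image))
        w -= lr * grad / len(image)
    return w

print(round(train(), 3))  # converges near 0.5, inverting the 2x scale
```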
As an optional embodiment, the apparatus further comprises: and the completion module is used for completing the materials of the first material map after the first material map is generated.
As an alternative embodiment, the apparatus further comprises: the semantic segmentation module is used for performing semantic segmentation on the first three-dimensional object and the second three-dimensional object respectively to obtain a first component set of the first three-dimensional object and a second component set of the second three-dimensional object before the material migration module is used for migrating the first material map to obtain a second material map of the second three-dimensional object, wherein the component sets are formed by block bounding boxes of the three-dimensional objects; the second acquisition module is used for acquiring a rigid body surrounding matrix between the first component set and the second component set, wherein the rigid body surrounding matrix records semantic migration relations between the block bounding boxes in the first component set and the block bounding boxes in the second component set; and the second generation module is used for generating a material migration module based on the rigid body surrounding matrix between the first component set and the second component set, wherein the material migration module records a semantic migration relation matrix between each component in the first component set and each component in the second component set.
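The rigid body surrounding matrix of this embodiment can be sketched as a table of per-part box-to-box transforms. As a simplification (the embodiment does not give the exact parameterization), each entry below is a per-axis scale plus translation mapping one part's bounding box onto the corresponding part of the other style:

```python
# Sketch of the rigid body surrounding matrix (assumed form: for each
# pair of part bounding boxes, a scale and offset mapping one box onto
# the other; a full implementation would use 3D boxes and rotations).

def box_transform(src, dst):
    """src/dst: ((xmin, ymin), (xmax, ymax)). Return (scale, offset)
    per axis so that p' = p * scale + offset maps src onto dst."""
    (sx0, sy0), (sx1, sy1) = src
    (dx0, dy0), (dx1, dy1) = dst
    scale = ((dx1 - dx0) / (sx1 - sx0), (dy1 - dy0) / (sy1 - sy0))
    offset = (dx0 - sx0 * scale[0], dy0 - sy0 * scale[1])
    return scale, offset

def surrounding_matrix(parts_a, parts_b):
    """Record a transform for every (part in A, part in B) pair, as the
    semantic migration relation matrix described above."""
    return [[box_transform(a, b) for b in parts_b] for a in parts_a]

chair_a = [((0, 0), (2, 2))]   # one component: seat bounding box
chair_b = [((1, 1), (5, 3))]   # the same component on the other style
matrix = surrounding_matrix(chair_a, chair_b)
print(matrix[0][0])  # ((2.0, 1.0), (1.0, 1.0))
```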
As an alternative embodiment, the apparatus further comprises: a removing module, used for removing the light and shadow wrinkle information on the second material map before the second material map is rendered onto the second three-dimensional object.
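One simple way to approximate the removing module is to divide out a low-frequency shading estimate, leaving only albedo-like detail. The actual algorithm is not fixed by the embodiment, so the local-mean approach below is purely illustrative:

```python
# Illustrative shading removal (an assumption; the embodiment does not
# fix the algorithm): estimate shading as a local mean of the luminance
# and divide it out, flattening shadows and wrinkles in the map.

def remove_shading(values, window=3):
    """values: 1-D list of luminances. Divide each by its local mean."""
    n, half = len(values), window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        local_mean = sum(values[lo:hi]) / (hi - lo)
        out.append(values[i] / local_mean if local_mean else 0.0)
    return out

# A flat texture darkened by a shadow gradient: after removal the
# interior values are uniform again.
shaded = [0.9, 0.8, 0.7, 0.6, 0.5]
print([round(v, 2) for v in remove_shading(shaded)])
# [1.06, 1.0, 1.0, 1.0, 0.91]
```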
Example 4
According to an embodiment of the present invention, there is further provided a material migration apparatus for implementing the material migration method in embodiment 2, and fig. 7 is a schematic diagram of a material migration apparatus according to embodiment 4 of the present application, as shown in fig. 7, the apparatus 700 includes:
a first display module 702, configured to display a first image and a second image in a display interface, where a display object in the first image is a first three-dimensional object, an object displayed in the second image is a second three-dimensional object, and styles of the first three-dimensional object and the second three-dimensional object are different;
the extracting module 704 is used for selecting the first image and extracting materials of the first three-dimensional object from the first image to obtain a first material map;
the triggering module 706 is configured to receive a mapping instruction for mapping the second three-dimensional object, where the mapping instruction is used to indicate that a material in the first material mapping needs to be replaced on the surface of the second three-dimensional object;
a response module 708, configured to respond to the map instruction, migrate the first material map by using the material migration module to obtain a second material map of the second three-dimensional object, where the material migration module constructs semantic migration relationships between three-dimensional objects of different styles;
a second display module 7010, configured to display a second image obtained by rendering the second material map onto the second three-dimensional object.
It should be noted here that the first display module 702, the extraction module 704, the trigger module 706, the response module 708, and the second display module 7010 correspond to steps S51 to S59 in embodiment 2; the five modules are the same as the corresponding steps in terms of implementation examples and application scenarios, but are not limited to the disclosure of embodiment 2 above. It should also be noted that the above modules, as part of the apparatus, may run in the computing device 10 provided in the first embodiment.
Example 5
Embodiments of the invention may provide a computing device that may be any one of a computing device group. Optionally, in this embodiment, the computing device may also be replaced with a terminal device such as a mobile terminal.
Optionally, in this embodiment, the computing device may be located in at least one network device of a plurality of network devices of a computer network.
In this embodiment, the above-mentioned computing device may execute the program code of the following steps of the material migration method: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that the material in the first material map needs to be replaced on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; responding to the mapping instruction, and adopting the material migration module to migrate the first material map to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object.
Alternatively, fig. 8 is a block diagram of a computing device according to an embodiment of the invention. As shown in fig. 8, computing device A may include: one or more processors 802 (only one of which is shown), a memory 804, and a peripheral interface 806.
The memory may be used to store software programs and modules, such as the program instructions/modules corresponding to the material migration method and apparatus and the data processing method in the embodiments of the present invention; the processor executes various functional applications and data processing by running the software programs and modules stored in the memory, that is, implements the material migration method described above. The memory may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory may further include memory located remotely from the processor, and these remote memories may be connected to computing device A through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The processor can call the information and application programs stored in the memory through the transmission device to execute the following steps: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that the material in the first material map needs to be replaced on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; responding to the mapping instruction, and adopting the material migration module to migrate the first material map to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object.
Optionally, the semantic migration relationship is used to characterize rigid-body geometric transformations between image regions located on three-dimensional objects of different styles.
Optionally, the processor may further execute the program code of the following steps: extracting a first material map of a first three-dimensional object from a first image, comprising: segmenting the first image to acquire an object image and a segmentation mask of the first three-dimensional object in the first image; processing the segmentation mask with a material migration network model to generate a material pixel coordinate graph on the first three-dimensional object; and collecting the material of the first three-dimensional object based on the material pixel coordinate graph and generating the first material map, wherein the shape of the material pixel coordinate graph is the same as that of the first material map.
Optionally, the processor may further execute the program code of the following steps: before the material migration network model is adopted to process the segmentation mask, the coordinates of each pixel on the first image are normalized to obtain the image coordinates of the first image; the image coordinates of the first image are filled into the segmentation mask of the object image.
Optionally, the processor may further execute the program code of the following steps: before the segmentation mask is processed with the material migration network model to generate the material pixel coordinate graph on the first three-dimensional object, constructing the migration network model, which comprises: acquiring training data, wherein the training data are three-dimensional image samples and material map samples obtained through processing based on a computer graphics rendering algorithm; and training on the training data with a neural network model to generate the material migration model.
Optionally, the processor may further execute the program code of the following steps: and after the first material map is generated, performing material completion on the first material map.
Optionally, the processor may further execute the program code of the following steps: before a material migration module is adopted to migrate a first material map to obtain a second material map of a second three-dimensional object, semantic segmentation is respectively carried out on the first three-dimensional object and the second three-dimensional object to obtain a first component set of the first three-dimensional object and a second component set of the second three-dimensional object, wherein the component sets are formed by block bounding boxes of the three-dimensional objects; acquiring a rigid body surrounding matrix between a first component set and a second component set, wherein the rigid body surrounding matrix records semantic migration relations between block bounding boxes in the first component set and block bounding boxes in the second component set; and generating a material migration module based on a rigid body surrounding matrix between the first component set and the second component set, wherein the material migration module records a semantic migration relation matrix between each component in the first component set and each component in the second component set.
Optionally, the processor may further execute the program code of the following steps: removing the light and shadow wrinkle information on the second material map before rendering the second material map onto the second three-dimensional object.
The embodiment of the invention provides a material migration method: a first material map of a first three-dimensional object is extracted from a first image; a mapping instruction for mapping a second three-dimensional object is received, wherein the mapping instruction is used to indicate that the material in the first material map needs to be replaced onto the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; in response to the mapping instruction, the first material map is migrated with the material migration module to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles; and the second material map is rendered onto the second three-dimensional object. In this scheme, the texture of an object is obtained from an image and the material map of a three-dimensional model of the same style is recovered; the material map is then sent to the material migration module to obtain material maps for objects of other styles, and rendering based on the newly generated material map completes the migration of the material between objects of different styles. Texture migration is thereby achieved between objects of different styles. This solves the technical problem in the prior art that, in the process of designing the three-dimensional image of an article, the material of a two-dimensional image can only be used to generate a corresponding three-dimensional model map through a deep learning network and re-render it onto the three-dimensional model of that same article, so that material migration is limited to objects of the same style and the migration process is inefficient.
It can be understood by those skilled in the art that the structure shown in fig. 8 is only an illustration, and the computing device may also be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, and the like. Fig. 8 does not limit the structure of the above electronic device. For example, computing device A may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in fig. 8, or have a different configuration from that shown in fig. 8.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
Example 6
The embodiment of the invention also provides a storage medium. Optionally, in this embodiment, the storage medium may be configured to store a program code executed by the material migration method provided in the first embodiment.
Optionally, in this embodiment, the storage medium may be located in any one of computing devices in a computing device group in a computer network, or in any one of mobile terminals in a mobile terminal group.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps: extracting a first material map of a first three-dimensional object from a first image; receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that the material in the first material map needs to be replaced on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object; responding to the mapping instruction, and adopting the material migration module to migrate the first material map to obtain a second material map of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles; and rendering the second material map onto the second three-dimensional object.
Example 7
According to an embodiment of the present invention, there is further provided an embodiment of a data processing method, and fig. 9 is a flowchart of a data processing method according to embodiment 7 of the present application, and with reference to fig. 9, the method includes:
in step S91, a first image is acquired.
The first image may be an image including a first three-dimensional object. It should be noted that, at present, in order to perform related design based on a material map, a designer needs to acquire the source file of that material map; with the scheme of the present application, only a first image including the first three-dimensional object needs to be acquired, such as a scene image or a street snapshot, which greatly reduces the difficulty of acquiring material and improves the efficiency of finding it.
Step S93, a first material map of the first three-dimensional object is extracted from the first image.
Specifically, the first material map may be a texture map on the first three-dimensional object. The first three-dimensional object may be an object from which material is to be extracted.
In an alternative embodiment, the first image may be segmented to acquire an object image and a segmentation mask of the first three-dimensional object in the first image; the segmentation mask is processed with a material migration network model to generate a material pixel coordinate graph on the first three-dimensional object; and the material of the first three-dimensional object is collected based on the material pixel coordinate graph to generate the first material map, wherein the shape of the material pixel coordinate graph is the same as that of the first material map.
Step S95, rendering the first material map onto a second three-dimensional object based on a geometric transformation relationship between the first three-dimensional object and the second three-dimensional object.
Because the first three-dimensional object and the second three-dimensional object are different in style, the first material map is difficult to transfer directly to the second three-dimensional object; instead, the first material map can be rendered onto the second three-dimensional object based on the geometric transformation relation between the first three-dimensional object and the second three-dimensional object.
In an alternative embodiment, the first three-dimensional object and the second three-dimensional object may be subjected to structural analysis to obtain a rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, the first material map is transformed based on the rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, and the transformed first material map is rendered on the second three-dimensional object.
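The transform-then-render step of this embodiment can be sketched in two dimensions as follows (hypothetical names; a real implementation would operate on the three-dimensional surface): a rigid transformation, here rotation plus translation, is applied to the texel positions of the first material map before it is pasted onto the second object.

```python
# Sketch of applying the rigid transformation matrix between the two
# objects to the first material map's texel positions (2D for brevity;
# the embodiment's structural analysis would supply angle and offset).

import math

def rigid_transform(points, angle_rad, tx, ty):
    """Rotate each (x, y) by angle_rad, then translate by (tx, ty)."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

texels = [(1.0, 0.0), (0.0, 1.0)]
moved = rigid_transform(texels, math.pi / 2, 5.0, 0.0)
print([(round(x, 6), round(y, 6)) for x, y in moved])
# [(5.0, 1.0), (4.0, 0.0)]
```

The transformed texel positions would then be used to place the map's colors on the second three-dimensional object during rendering.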
In this scheme, the texture of an object is obtained from an image and the material map of a three-dimensional model of the same style is restored; the first material map is then rendered onto the second three-dimensional object based on the geometric transformation relation between the two three-dimensional objects, completing the migration of material between objects of different styles. Texture migration is thereby achieved between objects of different styles, which solves the technical problem in the prior art that, in designing the three-dimensional image of an article, the material of a two-dimensional image can only be used to generate a corresponding three-dimensional model map through a deep learning network and re-render it onto the three-dimensional model of that same article.
Example 8
According to an embodiment of the present invention, there is further provided another embodiment of a data processing method, and fig. 10 is a flowchart of another data processing method according to embodiment 8 of the present application, and with reference to fig. 10, the method includes:
step S101 displays a first image and a second image.
The first image may be an image including a first three-dimensional object, and the second image may be an image including a second three-dimensional object; in this embodiment, the map material in the first image is transferred to the second image. It should be noted that, at present, in order to perform related design based on a material map, a designer needs to acquire the source file of that material map; with the scheme of the present application, only a first image including the first three-dimensional object needs to be acquired, such as a scene image or a street snapshot, which greatly reduces the difficulty of acquiring material and improves the efficiency of finding it.
Step S103, receiving a material extracting instruction, and extracting a first material map of the first three-dimensional object from the first image according to the material extracting instruction.
Specifically, the first material map may be a texture map on the first three-dimensional object. The first three-dimensional object may be an object from which material is to be extracted.
In an alternative embodiment, the first image may be segmented to acquire an object image and a segmentation mask of the first three-dimensional object in the first image; the segmentation mask is processed with a material migration network model to generate a material pixel coordinate graph on the first three-dimensional object; and the material of the first three-dimensional object is collected based on the material pixel coordinate graph to generate the first material map, wherein the shape of the material pixel coordinate graph is the same as that of the first material map.
Step S105, receiving a material transfer instruction, and rendering the first material map to a second three-dimensional object in the second image according to the material transfer instruction based on a geometric transformation relation between the first three-dimensional object and the second three-dimensional object.
Because the first three-dimensional object and the second three-dimensional object are different in style, the first material map is difficult to transfer directly to the second three-dimensional object; instead, the first material map can be rendered onto the second three-dimensional object based on the geometric transformation relation between the first three-dimensional object and the second three-dimensional object.
In an alternative embodiment, the first three-dimensional object and the second three-dimensional object may be subjected to structural analysis to obtain a rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, the first material map is transformed based on the rigid transformation matrix between the first three-dimensional object and the second three-dimensional object, and the transformed first material map is rendered on the second three-dimensional object.
In this scheme, the texture of an object is obtained from an image and the material map of a three-dimensional model of the same style is restored; the first material map is then rendered onto the second three-dimensional object based on the geometric transformation relation between the two three-dimensional objects, completing the migration of material between objects of different styles. Texture migration is thereby achieved between objects of different styles, which solves the technical problem in the prior art that, in designing the three-dimensional image of an article, the material of a two-dimensional image can only be used to generate a corresponding three-dimensional model map through a deep learning network and re-render it onto the three-dimensional model of that same article.
Example 9
According to an embodiment of the present application, there is further provided a data processing apparatus for implementing the data processing method in embodiment 7, and fig. 11 is a schematic diagram of a data processing apparatus according to embodiment 9 of the present application, and as shown in fig. 11, the apparatus 1100 includes:
an obtaining module 1102 is configured to obtain a first image.
An extracting module 1104 is configured to extract a first material map of the first three-dimensional object from the first image.
A rendering module 1106, configured to render the first material map onto a second three-dimensional object based on a geometric transformation relationship between the first three-dimensional object and the second three-dimensional object.
It should be noted here that the obtaining module 1102, the extracting module 1104 and the rendering module 1106 correspond to steps S91 to S95 in embodiment 7, and the three modules are the same as the corresponding steps in the implementation example and application scenario, but are not limited to the disclosure in the first embodiment. It should be noted that the modules described above as part of the apparatus may be run in the computing device 10 provided in the first embodiment.
Example 10
According to an embodiment of the present application, there is further provided a data processing apparatus for implementing the data processing method in embodiment 8, and fig. 12 is a schematic diagram of a data processing apparatus according to embodiment 10 of the present application, and as shown in fig. 12, the apparatus 1200 includes:
a display module 1202 for displaying the first image and the second image.
A first receiving module 1204, configured to receive a material extraction instruction and extract a first material map of the first three-dimensional object from the first image according to the material extraction instruction.
A second receiving module 1206, configured to receive a material migration instruction, and render the first material map onto a second three-dimensional object in the second image according to the material migration instruction based on a geometric transformation relationship between the first three-dimensional object and the second three-dimensional object.
It should be noted that the display module 1202, the first receiving module 1204, and the second receiving module 1206 correspond to steps S101 to S105 in embodiment 8; the three modules are the same as the corresponding steps in terms of implementation examples and application scenarios, but are not limited to the disclosure of embodiment 8 above. It should be noted that the modules described above, as part of the apparatus, may run in the computing device 10 provided in the first embodiment.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, and a magnetic or optical disk.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and embellishments without departing from the principle of the present invention, and these improvements and embellishments should also be regarded as falling within the protection scope of the present invention.

Claims (16)

1. A method for migrating a material, comprising:
extracting a first material map of a first three-dimensional object from the first image;
receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that materials in the first material mapping need to be replaced on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object;
responding to the mapping instruction, and adopting a material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles;
rendering the second material map onto the second three-dimensional object.
2. The method according to claim 1, wherein the semantic migration relationships are used to characterize rigid body geometric transformations between image regions located on three-dimensional objects of different styles.
3. The method of claim 1, wherein extracting the first material map of the first three-dimensional object from the first image comprises:
segmenting the first image, and acquiring an object image and a segmentation mask of the first three-dimensional object in the first image;
processing the segmentation mask by adopting a material migration network model to generate a material pixel coordinate map on the first three-dimensional object;
and collecting the material of the first three-dimensional object based on the material pixel coordinate map on the first three-dimensional object, and generating the first material map, wherein the shape of the material pixel coordinate map is the same as that of the first material map.
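The gathering step in claim 3 — collecting source pixels through a per-pixel coordinate map whose shape equals that of the material map — can be sketched as below. This is an illustrative sketch only; the function name, the (row, col) coordinate convention, and the array layout are assumptions, as the claim does not specify an implementation.

```python
import numpy as np

def sample_material_map(object_image, coord_map):
    """Gather source pixels through a per-pixel coordinate map.

    object_image: (H, W, 3) array, the segmented object region.
    coord_map:    (h, w, 2) array of (row, col) pixel coordinates
                  into object_image; its shape (h, w) is also the
                  shape of the resulting material map, matching the
                  shape constraint stated in the claim.
    """
    rows = coord_map[..., 0].astype(int)
    cols = coord_map[..., 1].astype(int)
    # NumPy integer-array indexing gathers one source pixel per
    # coordinate, producing an (h, w, 3) material map.
    return object_image[rows, cols]
```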
4. The method of claim 3, wherein prior to processing the segmentation mask using a material migration network model, the method further comprises:
normalizing the coordinates of each pixel on the first image to obtain the image coordinates of the first image;
filling image coordinates of the first image into the segmentation mask of the object image.
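The preprocessing in claim 4 — normalizing each pixel's coordinates and filling them into the segmentation mask before the network sees it — might look like the following. The channel ordering and the [0, 1] normalization range are assumptions for illustration.

```python
import numpy as np

def normalized_coord_channels(mask):
    """Append normalized image coordinates to a binary mask.

    mask: (H, W) binary segmentation mask of the object image.
    Returns an (H, W, 3) array: the mask itself plus each pixel's
    row/column coordinates scaled to [0, 1], zeroed outside the
    mask so the network only sees coordinates on the object.
    """
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    yn = ys / max(h - 1, 1)  # normalized row coordinate
    xn = xs / max(w - 1, 1)  # normalized column coordinate
    return np.stack([mask, yn * mask, xn * mask], axis=-1)
```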
5. The method of claim 3, wherein prior to processing the segmentation mask using a material migration network model to generate a material pixel coordinate map on the first three-dimensional object, the method further comprises:
constructing the material migration network model, wherein the steps comprise:
acquiring training data, wherein the training data are three-dimensional image samples and material map samples obtained by processing based on a computer graphics rendering algorithm;
and training a neural network model on the training data to generate the material migration network model.
6. The method of claim 3, wherein after generating the first material map, the method further comprises: completing the missing regions of the first material map.
7. The method of claim 1, wherein prior to migrating the first material map using a material migration module to obtain a second material map of the second three-dimensional object, the method further comprises:
semantic segmentation is carried out on the first three-dimensional object and the second three-dimensional object respectively to obtain a first component set of the first three-dimensional object and a second component set of the second three-dimensional object, wherein each component set is formed by block bounding boxes of the corresponding three-dimensional object;
acquiring a rigid body surrounding matrix between the first component set and the second component set, wherein the rigid body surrounding matrix records semantic migration relations between block bounding boxes in the first component set and block bounding boxes in the second component set;
and generating the material migration module based on a rigid body surrounding matrix between the first component set and the second component set, wherein the material migration module records a semantic migration relationship matrix between each component in the first component set and each component in the second component set.
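One entry of the semantic migration relationship matrix in claim 7 relates a block bounding box of one component to its counterpart on the other object as a rigid (scale-and-translate) transform. A minimal sketch of such an entry, with assumed (x_min, y_min, x_max, y_max) box coordinates, since the claim leaves the parameterization open:

```python
import numpy as np

def box_transfer_matrix(src_box, dst_box):
    """3x3 homogeneous matrix mapping one bounding box onto another.

    Boxes are (x_min, y_min, x_max, y_max). The matrix scales and
    translates so the corners of src_box land on those of dst_box,
    i.e. one pairwise entry of the matrix of semantic migration
    relationships between the two component sets.
    """
    sx0, sy0, sx1, sy1 = src_box
    dx0, dy0, dx1, dy1 = dst_box
    sx = (dx1 - dx0) / (sx1 - sx0)  # x scale factor
    sy = (dy1 - dy0) / (sy1 - sy0)  # y scale factor
    return np.array([[sx, 0.0, dx0 - sx * sx0],
                     [0.0, sy, dy0 - sy * sy0],
                     [0.0, 0.0, 1.0]])
```

Stacking one such matrix per matched component pair yields the rigid body surrounding matrix between the first and second component sets.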
8. The method of claim 1, wherein prior to rendering the second material map onto the second three-dimensional object, the method further comprises: removing the light, shadow and wrinkle information from the second material map.
9. A method for migrating a material, comprising:
displaying a first image and a second image in a display interface, wherein the displayed object in the first image is a first three-dimensional object, the displayed object in the second image is a second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object;
selecting the first image, and extracting materials of the first three-dimensional object from the first image to obtain a first material map;
triggering a mapping instruction for mapping the second three-dimensional object, wherein the mapping instruction is used for indicating that materials in the first material mapping need to be replaced on the surface of the second three-dimensional object;
responding to the mapping instruction, and adopting a material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles;
displaying a second image in which the second material map is rendered onto the second three-dimensional object.
10. An apparatus for transferring a material, comprising:
the extraction module is used for extracting a first material map of the first three-dimensional object from the first image;
the triggering module is used for receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that materials in the first material mapping need to be replaced on the surface of the second three-dimensional object, and the styles of the first three-dimensional object and the second three-dimensional object are different;
the response module is used for responding to the mapping instruction and adopting a material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects with different styles;
and the rendering module is used for rendering the second material map onto the second three-dimensional object.
11. An apparatus for transferring a material, comprising:
the first display module is used for displaying a first image and a second image in a display interface, wherein the displayed object in the first image is a first three-dimensional object, the displayed object in the second image is a second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object;
the extraction module is used for selecting the first image and extracting a material of the first three-dimensional object from the first image to obtain a first material map;
the triggering module is used for triggering a mapping instruction for mapping the second three-dimensional object, wherein the mapping instruction is used for indicating that materials in the first material mapping need to be replaced on the surface of the second three-dimensional object;
the response module is used for responding to the mapping instruction and adopting a material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects with different styles;
and the second display module is used for displaying a second image in which the second material map is rendered onto the second three-dimensional object.
12. A storage medium comprising a stored program, wherein the program, when executed, controls an apparatus on which the storage medium is located to perform the material migration method according to any one of claims 1 to 9.
13. A processor configured to execute a program, wherein the program, when executed, performs the material migration method according to any one of claims 1 to 9.
14. A system for migrating material, comprising:
a processor; and
a memory coupled to the processor for providing instructions to the processor for processing the following processing steps:
extracting a first material map of a first three-dimensional object from the first image;
receiving a mapping instruction for mapping a second three-dimensional object, wherein the mapping instruction is used for indicating that materials in the first material mapping need to be replaced on the surface of the second three-dimensional object, and the style of the first three-dimensional object is different from that of the second three-dimensional object;
responding to the mapping instruction, and adopting a material migration module to migrate the first material mapping to obtain a second material mapping of the second three-dimensional object, wherein the material migration module constructs semantic migration relations among three-dimensional objects of different styles;
rendering the second material map onto the second three-dimensional object.
15. A data processing method, comprising:
acquiring a first image;
extracting a first material map of a first three-dimensional object from the first image;
rendering the first material map onto a second three-dimensional object based on a geometric transformation relationship between the first three-dimensional object and the second three-dimensional object.
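Applying the geometric transformation relationship of claim 15 to carry the first material map onto the second object can be sketched as an inverse warp: for each target pixel, look up the source pixel through the inverse transform. This assumes an affine 3x3 matrix (last row 0, 0, 1) and nearest-neighbour sampling; a full homography would additionally divide by the homogeneous coordinate, and a production renderer would interpolate.

```python
import numpy as np

def warp_material(material, matrix, out_shape):
    """Nearest-neighbour inverse warp of a material map.

    material:  (H, W, C) source material map.
    matrix:    3x3 affine transform taking source pixel coordinates
               (x, y, 1) to target coordinates.
    out_shape: (h, w) of the target region on the second object.
    """
    inv = np.linalg.inv(matrix)
    h, w = out_shape
    out = np.zeros((h, w, material.shape[2]), material.dtype)
    ys, xs = np.mgrid[0:h, 0:w]
    # Map every target pixel back into the source image.
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = inv @ pts
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    # Keep only lookups that land inside the source material map.
    ok = ((sx >= 0) & (sx < material.shape[1]) &
          (sy >= 0) & (sy < material.shape[0]))
    out[ys.ravel()[ok], xs.ravel()[ok]] = material[sy[ok], sx[ok]]
    return out
```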
16. A data processing method, comprising:
displaying the first image and the second image;
receiving a material extraction instruction, and extracting a first material map of a first three-dimensional object from the first image according to the material extraction instruction;
receiving a material transfer instruction, and rendering the first material map to a second three-dimensional object in the second image according to the material transfer instruction based on a geometric transformation relation between the first three-dimensional object and the second three-dimensional object.
CN202011418614.3A 2020-12-07 2020-12-07 Material migration method and device and data processing method Pending CN114612641A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011418614.3A CN114612641A (en) 2020-12-07 2020-12-07 Material migration method and device and data processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011418614.3A CN114612641A (en) 2020-12-07 2020-12-07 Material migration method and device and data processing method

Publications (1)

Publication Number Publication Date
CN114612641A true CN114612641A (en) 2022-06-10

Family

ID=81856815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011418614.3A Pending CN114612641A (en) 2020-12-07 2020-12-07 Material migration method and device and data processing method

Country Status (1)

Country Link
CN (1) CN114612641A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114792354A (en) * 2022-06-22 2022-07-26 北京飞渡科技有限公司 Model processing method, model processing device, storage medium and electronic equipment
CN114792354B (en) * 2022-06-22 2022-11-11 北京飞渡科技有限公司 Model processing method and device, storage medium and electronic equipment
CN115631091A (en) * 2022-12-23 2023-01-20 南方科技大学 Selective style migration method and terminal
CN115631091B (en) * 2022-12-23 2023-03-21 南方科技大学 Selective style migration method and terminal

Similar Documents

Publication Publication Date Title
US11961200B2 (en) Method and computer program product for producing 3 dimensional model data of a garment
CN107274493B (en) Three-dimensional virtual trial type face reconstruction method based on mobile platform
CN106934693B (en) Ceramic tile type selection method and system displayed in VR scene based on AR product model
CN107358649B (en) Processing method and device of terrain file
CN108765520B (en) Text information rendering method and device, storage medium and electronic device
CN112215934A (en) Rendering method and device of game model, storage medium and electronic device
CN109325990B (en) Image processing method, image processing apparatus, and storage medium
CN106683161B (en) Augmented reality shielding method based on image segmentation and user-defined layer method
CN108876886B (en) Image processing method and device and computer equipment
CN102834849A (en) Image drawing device for drawing stereoscopic image, image drawing method, and image drawing program
CN106797458A (en) The virtual change of real object
CN111882627A (en) Image processing method, video processing method, device, equipment and storage medium
CN101477701A (en) Built-in real tri-dimension rendering process oriented to AutoCAD and 3DS MAX
CN114612641A (en) Material migration method and device and data processing method
CN101477700A (en) Real tri-dimension display method oriented to Google Earth and Sketch Up
CN109448088B (en) Method and device for rendering three-dimensional graphic wire frame, computer equipment and storage medium
CN106447756B (en) Method and system for generating user-customized computer-generated animations
KR102353556B1 (en) Apparatus for Generating Facial expressions and Poses Reappearance Avatar based in User Face
CN108986232A (en) A method of it is shown in VR and AR environment picture is presented in equipment
CN111815785A (en) Method and device for presenting reality model, electronic equipment and storage medium
CN112274934A (en) Model rendering method, device, equipment and storage medium
CN101477702A (en) Built-in real tri-dimension driving method for computer display card
KR20060108271A (en) Method of image-based virtual draping simulation for digital fashion design
CN110471528A (en) A kind of costume changing method, device, system and the storage medium of virtual spectators
CN111179390B (en) Method and device for efficiently previewing CG (content distribution) assets

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40074554

Country of ref document: HK