CN113204829A - Home decoration scene design method and system based on neural network - Google Patents
- Publication number
- CN113204829A (application number CN202110636722.6A)
- Authority
- CN
- China
- Prior art keywords
- diagram
- scene
- home decoration
- wire frame
- GAN model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/27—Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention relates to a neural-network-based home decoration scene design method and system. The method comprises: generating a wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed, and inputting the wireframe diagram into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for that floor plan. Using GAN technology, the invention takes a wireframe diagram as input, generates a home decoration scene effect diagram through neural network inference, and delivers the result to the front end for scene presentation, forming an end-to-end application. This improves the efficiency with which designers produce effect diagrams and lets users participate in and experience the design process.
Description
Technical Field
The invention relates to the field of home decoration design, and in particular to a neural-network-based method and system for home decoration scene design.
Background
In the field of home design, a designer must satisfy the customer's requirements as far as possible. Design inspiration comes from several sources, three of which are key: 1. the designer's own experience and skill (which determine the quality and turnaround time of the effect diagram); 2. the information the designer gathers from the customer about their home decoration requirements; 3. an estimate of the overall cost of the customer's decoration service.
While meeting the client's requirements, a designer typically revises the design scheme many times before the client approves the decoration plan. As a customer-acquisition channel in a traditional industry, this process must balance labor cost against the display of technical strength. Traditionally, a designer drafts a base scheme, generates or selects matching soft-furnishing models, adjusts them in the design scene, and then renders the result. The whole process is time-consuming, and when business is busy it hurts the designer's ability to maintain client relationships.
With the wide adoption of networked applications, home decoration design now offers panoramic, walkthrough-style experiences, and much work has gone into front-end presentation to enrich the scene; however, the pain point that generating an effect diagram takes a designer a long time remains unsolved.
Disclosure of Invention
To address these technical problems, the invention provides a neural-network-based home decoration scene design method and system that can greatly improve the efficiency with which a designer produces home decoration scene effect diagrams.
The technical scheme of the invention for solving these problems is as follows:
A neural-network-based home decoration scene design method comprises the following step:
inputting a wireframe diagram of the floor plan to be designed into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for that floor plan.
The beneficial effects of the invention are: using GAN technology, a wireframe diagram is taken as input, a home decoration scene effect diagram is generated by the neural network, and the result is delivered to the front end for scene presentation, forming an end-to-end application. This improves the efficiency with which designers produce effect diagrams and lets users participate in and experience the design process.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, before inputting the wireframe diagram of the floor plan to be designed into the pre-trained generative adversarial network (GAN) model, the method further comprises:
generating the wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed.
Further, the method further comprises:
performing closed-loop learning using existing scene effect diagrams as training data for the GAN model.
Further, the closed-loop learning using existing scene effect diagrams as training data for the GAN model specifically comprises:
extracting wireframe diagrams from the training data, feeding the generator's output and the training data as inputs to the discriminator of the GAN model for adversarial training, and thereby optimizing the parameters of the GAN model until the whole training process converges.
In order to achieve the above object, the invention further provides a neural-network-based home decoration scene design system, comprising:
an effect diagram generation module, configured to input a wireframe diagram of the floor plan to be designed into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for that floor plan.
Further, the system further comprises: a wireframe diagram generation module, configured to generate the wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed before the effect diagram generation module inputs it into the pre-trained GAN model.
Further, the system further comprises:
a model training module, configured to perform closed-loop learning using existing scene effect diagrams as training data for the GAN model.
Further, the model training module is specifically configured to:
extract wireframe diagrams from the training data, feed the generator's output and the training data as inputs to the discriminator of the GAN model for adversarial training, and thereby optimize the parameters of the GAN model until the whole training process converges.
Drawings
Fig. 1 is a flowchart of the neural-network-based home decoration scene design method according to an embodiment of the invention;
Fig. 2 shows an existing effect diagram and the wireframe diagram generated from it, used as training data, according to an embodiment of the invention;
Fig. 3 shows a wireframe diagram generated from an existing scene effect diagram used as input, together with the home decoration scene effect diagram finally generated, according to an embodiment of the invention;
Fig. 4 shows a wireframe diagram generated from a blank scene diagram used as input, together with the home decoration scene effect diagram finally generated, according to an embodiment of the invention.
Detailed Description
The principles and features of the invention are described below in conjunction with the drawings. The examples given are illustrative only and are not intended to limit the scope of the invention.
The development of artificial intelligence now offers every industry the possibility of reducing cost and improving efficiency. Inspiration is a force in the design field that cannot be formalized yet cannot be replaced, and it is inspiration that brings artistic interest and value to home decoration design.
So far, the digital transformation of home decoration has consisted of changes in data storage formats and the conversion of design drawings into electronic form. The design concept itself can also serve as a point of breakthrough: using a deep learning framework, a generative adversarial network can learn the design concepts of high-end designers, providing feedback or a starting point for inspired design in the home decoration field. With suitable constraints added, this can bring both assistance and a technical revolution to inspired home decoration design.
A generative adversarial network (GAN) is a neural network architecture with the ability to generate data, and is among the most promising approaches to unsupervised learning on complex distributions to emerge in recent years. The framework contains (at least) two modules: a generative model (hereinafter, the generator) and a discriminative model (hereinafter, the discriminator), which produce good output through mutual adversarial learning. In the original GAN theory, the generator and discriminator need not be neural networks; they need only be functions able to fit the respective generation and discrimination tasks. In practice, deep neural networks are commonly used for both.
Owing to their powerful capabilities, GANs are a very hot research topic in deep learning. Over the past few years they have progressed from producing blurry digits to creating images as realistic as photographs of real people. The generation of fine, smooth facial images has given strong technical impetus to image style conversion. GAN-based image transformation and style transfer technology is now mature enough to satisfy customers' artistic-design requirements in parts of the home decoration field, guaranteeing consistent color matching in the generated design scheme; with the help of a rendering engine, customers can subsequently enjoy a flawless experience.
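As an illustration of the adversarial game described above (not part of the patent itself), the two standard GAN objectives can be written down in a few lines of NumPy. The non-saturating generator loss used here is a common practical choice rather than anything the specification mandates:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def discriminator_loss(d_real_logits, d_fake_logits):
    """Discriminator minimizes -[log D(x) + log(1 - D(G(z)))]."""
    d_real = sigmoid(np.asarray(d_real_logits, dtype=float))
    d_fake = sigmoid(np.asarray(d_fake_logits, dtype=float))
    return float(-(np.log(d_real) + np.log(1.0 - d_fake)).mean())

def generator_loss(d_fake_logits):
    """Non-saturating generator objective: minimize -log D(G(z))."""
    d_fake = sigmoid(np.asarray(d_fake_logits, dtype=float))
    return float(-np.log(d_fake).mean())
```

At the game's equilibrium the discriminator outputs 0.5 everywhere, and the discriminator loss equals 2·log 2; a discriminator that confidently separates real from fake achieves a much lower loss, which is exactly what the generator is trained to prevent.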
Fig. 1 is a flowchart of the neural-network-based home decoration scene design method according to an embodiment of the invention. As shown in Fig. 1, the method comprises:
S1, generating a wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed;
Specifically, in this embodiment, two kinds of wireframe diagram may be used as input to the GAN module for adversarial training and, finally, effect diagram generation. The two kinds of wireframe generation are:
1. reconstructing a floor plan from the existing blank spatial structure and quickly dragging base-level models into the three-dimensional scene; this step needs no material optimization, so the result can be regarded as a white model, from which a wireframe diagram is then generated algorithmically;
2. extracting a wireframe diagram directly from an existing scene effect diagram.
These two wireframe input modes correspond, respectively, to designing an interior effect diagram for an original floor plan and to style-transfer redesign of an existing interior effect diagram.
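The specification names wireframe extraction but does not fix an algorithm. As one hedged sketch (an illustrative assumption, not the patent's method), a classical edge detector (Sobel gradient magnitude plus a threshold) yields a binary line map of the kind described; the kernel and threshold value below are arbitrary choices:

```python
import numpy as np

def extract_wireframe(image, threshold=0.25):
    """Binary wireframe from a 2-D grayscale image with values in [0, 1].

    Sobel gradient magnitude, normalized to [0, 1], then thresholded.
    (Illustrative choice; the patent does not specify the extractor.)
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # Sobel vertical-gradient kernel
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3]
            gx[i, j] = (window * kx).sum()
            gy[i, j] = (window * ky).sum()
    magnitude = np.hypot(gx, gy)
    if magnitude.max() > 0:
        magnitude /= magnitude.max()
    return (magnitude > threshold).astype(np.uint8)  # 1 = line, 0 = background
```

In practice a learned edge or wireframe-parsing network would likely be preferred, but the interface is the same: an image goes in, a binary line map comes out and becomes the GAN's input.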
S2, inputting the wireframe diagram of the floor plan to be designed into the pre-trained generative adversarial network (GAN) model to obtain the home decoration scene effect diagram for that floor plan.
Specifically, closed-loop learning on the GAN model's training data, which may be existing scene effect diagrams, comprises:
extracting wireframe diagrams from the training data, then feeding both the generator's output and the training data as inputs to the discriminator for adversarial training, thereby optimizing the GAN model's parameters until the whole training process converges. Fig. 2 shows an existing scene effect diagram and the wireframe diagram extracted from it.
Once the whole training process has converged, the GAN can be applied to design effect diagrams. The convergence criterion can be defined as: the discriminator network can no longer distinguish the generator's output from a real effect diagram, or the difference between them falls within a sufficiently small error as measured by the loss function.
With a generated wireframe diagram as input to the trained model, the optimized generator network can produce a design effect diagram autonomously.
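The convergence criterion can be made concrete. The proxy below is an assumption, one of several reasonable readings of the criterion: training is declared converged when the discriminator's mean scores on real and generated batches both sit near 0.5, i.e. it can no longer tell the two apart:

```python
def has_converged(d_real_scores, d_fake_scores, tol=0.05):
    """Convergence proxy (illustrative, not the patent's exact definition).

    d_real_scores / d_fake_scores: discriminator probabilities in [0, 1]
    for a batch of real effect diagrams and generated results respectively.
    Converged when both batch means are within `tol` of 0.5.
    """
    mean_real = sum(d_real_scores) / len(d_real_scores)
    mean_fake = sum(d_fake_scores) / len(d_fake_scores)
    return abs(mean_real - 0.5) < tol and abs(mean_fake - 0.5) < tol
```

In practice one would also monitor the loss curves directly, since discriminator scores hovering at 0.5 can indicate either equilibrium or a collapsed discriminator.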
The two cases, generating an effect diagram from an existing effect diagram and from a blank scene diagram, are described in turn below.
Taking an existing effect diagram as an example: the wireframe diagram extracted from it serves as input to the GAN indoor scene design algorithm, which outputs the network-designed effect diagram. The conversion of an existing living room effect diagram is shown in Fig. 3.
Taking a blank scene as an example: a two-dimensional floor plan is drawn in home design software and each functional area (for example, master bedroom or living room) is labeled; the floor plan is then reconstructed into a three-dimensional scene, and three-dimensional models without material maps are dragged into each functional area and arranged at their target positions. A wireframe diagram generated by the deep learning model then serves as input to the GAN design model, which produces the inferred design effect diagram. The design of a living room is shown schematically in Fig. 4.
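The blank-scene path enumerates a sequence of stages: 2-D plan, 3-D reconstruction, untextured white models, wireframe, GAN inference. A minimal sketch of that pipeline, with every stage a hypothetical callable (the patent names the stages but none of their implementations), is:

```python
def blank_scene_pipeline(floor_plan, reconstruct_3d, place_white_models,
                         render_wireframe, generator):
    """End-to-end sketch of the blank-scene design path.

    All four stage arguments are placeholder callables standing in for
    components the patent describes but does not implement.
    """
    scene = reconstruct_3d(floor_plan)      # 2-D plan -> 3-D scene
    scene = place_white_models(scene)       # drag in untextured "white" models
    wireframe = render_wireframe(scene)     # algorithmic wireframe diagram
    return generator(wireframe)             # GAN-designed effect diagram
```

Keeping the stages as separate callables mirrors the modular structure of the system claims: the wireframe generation module feeds the effect diagram generation module, and each can be swapped independently.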
As a supplement to this embodiment, an indoor-scene object detection module may be added after the effect diagram is generated. This module identifies the main soft-furnishing components in the scene effect diagram, enabling intelligent home decoration scene narration with soft furnishings as its elements. Combining GAN-based design with intelligent home decoration narration thus plays an important role in digital home decoration services.
An embodiment of the invention further provides a neural-network-based home decoration scene design system. The functional principle of each module in the system has been explained in the method embodiment above and is not repeated in detail below.
The system comprises: an effect diagram generation module, configured to input a wireframe diagram of the floor plan to be designed into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for that floor plan.
Optionally, in this embodiment, the system further comprises: a wireframe diagram generation module, configured to generate the wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed before the effect diagram generation module inputs it into the pre-trained GAN model.
Optionally, in this embodiment, the system further comprises:
a model training module, configured to perform closed-loop learning using existing scene effect diagrams as training data for the GAN model.
Optionally, in this embodiment, the model training module is specifically configured to:
extract wireframe diagrams from the training data, feed the generator's output and the training data as inputs to the discriminator of the GAN model for adversarial training, and thereby optimize the parameters of the GAN model until the whole training process converges.
The reader should understand that in this specification, reference to "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the invention. Such terms do not necessarily refer to the same embodiment or example. The particular features, structures, materials, or characteristics described may be combined in any suitable manner in one or more embodiments or examples, and those skilled in the art may combine features of different embodiments or examples described in this specification provided they do not contradict one another.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the modules and units in the above described system embodiment may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (8)
1. A neural-network-based home decoration scene design method, characterized by comprising the following step:
inputting a wireframe diagram of the floor plan to be designed into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for the floor plan to be designed.
2. The neural-network-based home decoration scene design method of claim 1, wherein before inputting the wireframe diagram of the floor plan to be designed into the pre-trained generative adversarial network (GAN) model, the method further comprises:
generating the wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed.
3. The neural-network-based home decoration scene design method of claim 1 or 2, wherein the method further comprises:
performing closed-loop learning using existing scene effect diagrams as training data for the GAN model.
4. The neural-network-based home decoration scene design method of claim 3, wherein performing closed-loop learning using existing scene effect diagrams as training data for the GAN model specifically comprises:
extracting wireframe diagrams from the training data, feeding the generator's output and the training data as inputs to the discriminator of the GAN model for adversarial training, and thereby optimizing the parameters of the GAN model until the whole training process converges.
5. A neural-network-based home decoration scene design system, characterized by comprising:
an effect diagram generation module, configured to input a wireframe diagram of the floor plan to be designed into a pre-trained generative adversarial network (GAN) model to obtain a home decoration scene effect diagram for the floor plan to be designed.
6. The neural-network-based home decoration scene design system of claim 5, wherein the system further comprises: a wireframe diagram generation module, configured to generate the wireframe diagram from an existing scene effect diagram and/or a blank scene diagram of the floor plan to be designed before the effect diagram generation module inputs it into the pre-trained generative adversarial network (GAN) model.
7. The neural-network-based home decoration scene design system of claim 5 or 6, wherein the system further comprises:
a model training module, configured to perform closed-loop learning using existing scene effect diagrams as training data for the GAN model.
8. The neural-network-based home decoration scene design system of claim 7, wherein the model training module is specifically configured to:
extract wireframe diagrams from the training data, feed the generator's output and the training data as inputs to the discriminator of the GAN model for adversarial training, and thereby optimize the parameters of the GAN model until the whole training process converges.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202110636722.6A | 2021-06-07 | 2021-06-07 | Home decoration scene design method and system based on neural network |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202110636722.6A | 2021-06-07 | 2021-06-07 | Home decoration scene design method and system based on neural network |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| CN113204829A | 2021-08-03 |
Family
ID=77024562
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202110636722.6A (CN113204829A, Pending) | Home decoration scene design method and system based on neural network | 2021-06-07 | 2021-06-07 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN113204829A (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN108875766A * | 2017-11-29 | 2018-11-23 | 北京旷视科技有限公司 | Image processing method, apparatus, system and computer storage medium |
| CN109064389A * | 2018-08-01 | 2018-12-21 | 福州大学 | Deep learning method for generating realistic images from freehand line drawings |
| CN110688704A * | 2019-10-17 | 2020-01-14 | 广东三维家信息科技有限公司 | Home decoration design method and system and electronic equipment |
| CN112257653A * | 2020-11-06 | 2021-01-22 | Oppo广东移动通信有限公司 | Method and device for determining space decoration effect graph, storage medium and electronic equipment |
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |