CN113628313A - Decoration effect graph generation method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113628313A
Authority
CN
China
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202110970243.8A
Other languages
Chinese (zh)
Inventor
刘玉丹
龚四维
Current Assignee: Guangdong 3vjia Information Technology Co Ltd (listed assignee may be inaccurate)
Original Assignee: Guangdong 3vjia Information Technology Co Ltd
Application filed by Guangdong 3vjia Information Technology Co Ltd
Priority application: CN202110970243.8A
Publication: CN113628313A
Legal status: Pending

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T 15/00: 3D [three-dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures
    • G06T 15/50: Lighting effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The embodiment of the invention provides a decoration effect graph generation method and device, an electronic device, and a storage medium. The method comprises the following steps: obtaining a decoration scheme of a target scene, performing offline rendering on the decoration scheme, and generating the illumination-dependent element layers and the illumination-independent element layer corresponding to the decoration scheme; dividing the illumination-dependent element layers into a first-type element layer and a second-type element layer; processing the first-type element layer to obtain a target mixed layer, and processing the second-type element layer together with the illumination-independent element layer to obtain a target element layer; and generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer. Because the illumination-independent element layer renders quickly, the generation time of the decoration effect graph can be shortened without sacrificing its quality, reducing the user's waiting time and improving the user experience.

Description

Decoration effect graph generation method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to a decoration effect graph generation method and device, electronic equipment and a storage medium.
Background
At present, a decoration effect picture is typically generated as follows: after a designer finishes the decoration scheme for a scene (for example, a bedroom), the scheme is rendered offline at a specified resolution with preset global deterministic Monte Carlo sampling parameters, and the final decoration effect picture is output. This process is time-consuming: the designer must wait a long time for the render, resulting in a poor experience.
Disclosure of Invention
To address the problems that the current generation of decoration effect diagrams is time-consuming, forces designers to wait a long time, and provides a poor experience, the embodiments of the invention provide a decoration effect diagram generation method and apparatus, an electronic device, and a storage medium.
In a first aspect of an embodiment of the present invention, a decoration effect map generation method is provided, where the method includes:
obtaining a decoration scheme of a target scene, performing offline rendering on the decoration scheme, and generating the illumination-dependent element layers and the illumination-independent element layer corresponding to the decoration scheme;
dividing the illumination-dependent element layers into a first-type element layer and a second-type element layer;
processing the first-type element layer to obtain a target mixed layer, and processing the second-type element layer together with the illumination-independent element layer to obtain a target element layer;
and generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
In an optional embodiment, performing offline rendering on the decoration scheme to generate the illumination-dependent element layers and the illumination-independent element layer corresponding to the decoration scheme includes:
obtaining the rendering resolution corresponding to the decoration scheme, performing offline rendering on the decoration scheme at that resolution, and outputting the illumination-dependent element layers corresponding to the decoration scheme;
and removing the lights from the decoration scheme, performing offline rendering on the light-removed scheme at the same rendering resolution, and outputting the illumination-independent element layer corresponding to the decoration scheme.
In an optional embodiment, performing offline rendering on the decoration scheme at the rendering resolution and outputting the illumination-dependent element layers corresponding to the decoration scheme includes:
reducing the rendering resolution based on a preset resolution dimension-reduction strategy to obtain a reduced rendering resolution;
and performing offline rendering on the decoration scheme at the reduced rendering resolution, and outputting the illumination-dependent element layers corresponding to the decoration scheme.
In an optional embodiment, processing the first-type element layer to obtain a target mixed layer includes:
sequentially blending the first-type element layers by linear-dodge (additive) superposition to obtain a target mixed layer at the rendering resolution.
In an optional embodiment, processing the first-type element layer to obtain a target mixed layer includes:
sequentially blending the first-type element layers by linear-dodge superposition to obtain a mixed layer, and performing super-resolution upscaling on the mixed layer to obtain a target mixed layer at the rendering resolution.
In an optional embodiment, processing the second-type element layer and the illumination-independent element layer to obtain a target element layer includes:
blending the second-type element layer with the illumination-independent element layer by multiply ("positive film") superposition to obtain a target element layer at the rendering resolution.
In an optional embodiment, processing the second-type element layer and the illumination-independent element layer to obtain a target element layer includes:
performing super-resolution upscaling on the second-type element layer to obtain an intermediate element layer at the rendering resolution;
and blending the intermediate element layer with the illumination-independent element layer by multiply superposition to obtain a target element layer.
In an optional embodiment, generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer includes:
blending the target mixed layer with the target element layer by linear-dodge superposition to generate a decoration effect map of the decoration scheme.
In an optional embodiment, the second-type element layer includes a global illumination layer, the illumination-independent element layer includes a diffuse reflection layer, and the target element layer includes a global illumination layer with diffuse reflection.
In a second aspect of the embodiments of the present invention, there is provided a decoration effect map generating apparatus, including:
the scheme acquisition module is used for acquiring a decoration scheme of a target scene;
the layer determining module is used for performing off-line rendering on the decoration scheme to generate each element layer depending on illumination and each element layer not depending on illumination corresponding to the decoration scheme;
the layer dividing module is used for dividing each element layer depending on illumination into a first type element layer and a second type element layer;
the first processing module is used for processing the first type element layer to obtain a target mixed layer;
the second processing module is used for processing the second type element layer and the element layer which does not depend on illumination to obtain a target element layer;
and the effect map generating module is used for generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
In a third aspect of the embodiments of the present invention, there is further provided an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing the decoration effect graph generation method in the first aspect when executing the program stored in the memory.
In a fourth aspect of the embodiments of the present invention, there is also provided a storage medium having instructions stored therein which, when run on a computer, cause the computer to execute the decoration effect map generation method described in the first aspect.
In a fifth aspect of the embodiments of the present invention, there is also provided a computer program product containing instructions, which when run on a computer, causes the computer to execute the decoration effect map generation method described in the first aspect.
According to the technical scheme provided by the embodiment of the invention, a decoration scheme of a target scene is obtained and rendered offline to generate the illumination-dependent element layers and the illumination-independent element layer corresponding to the scheme; the illumination-dependent element layers are divided into a first-type element layer and a second-type element layer; the first-type element layer is processed to obtain a target mixed layer, the second-type element layer and the illumination-independent element layer are processed to obtain a target element layer, and a decoration effect graph of the decoration scheme is generated based on the target mixed layer and the target element layer. Because the illumination-independent element layer renders quickly, obtaining the final effect graph through layer processing and layer merging shortens the generation time of the decoration effect graph while preserving its quality, reducing the user's waiting time and improving the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
FIG. 1 is a schematic flow chart illustrating an implementation of a decoration effect diagram generation method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart illustrating an implementation of a method for outputting illumination-dependent element layers corresponding to a decoration scheme according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a layer merging implementation flow shown in the embodiment of the present invention;
fig. 4 is a schematic diagram of another layer merging implementation flow shown in the embodiment of the present invention;
fig. 5 is a schematic view of another layer merging implementation flow shown in the embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a decoration effect map generating apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device shown in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the embodiment of the present invention, the offline rendering engine can decompose a render into different rendering elements, output them to separate element layers, and finely control the final rendering result (i.e., the effect map) by operating on those layers. Two kinds of element layers are mainly used. The first kind depends on illumination and includes: the global illumination layer (RAWGI), reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), self-illumination layer (SELFILLUM), and so on; rendering these layers takes a long time. The second kind depends little on illumination and includes the diffuse reflection layer (DIFFUSE), which records the colors and textures of the final effect map and visually determines their definition; rendering this kind of layer is very fast and consumes little time. The final effect map is then obtained through layer processing and layer merging.
Based on the above principle, as shown in fig. 1, an implementation flow diagram of the decoration effect graph generation method provided by the embodiment of the present invention is shown, and the method is applied to a processor, and specifically may include the following steps:
s101, obtaining a decoration scheme of a target scene, performing off-line rendering on the decoration scheme, and generating element layers depending on illumination and element layers not depending on illumination corresponding to the decoration scheme.
Currently, designers perform the design of finishing plans for target scenes (e.g., bedrooms) in 3D design software. After the designer finishes designing the decoration scheme of the target scene (such as a bedroom), the decoration scheme of the target scene is submitted and saved.
The embodiment of the invention can obtain the decoration scheme of the target scene, and call the offline rendering engine to perform offline rendering on the decoration scheme of the target scene to generate the illumination-dependent element layers and the illumination-independent element layers corresponding to the decoration scheme of the target scene.
It should be noted that the illumination-dependent element layers include the global illumination layer (RAWGI), reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), self-illumination layer (SELFILLUM), and the like, while the illumination-independent element layer includes the diffuse reflection layer (DIFFUSE).
In the embodiment of the invention, the corresponding rendering resolution ratio can be specified for the decoration scheme of the target scene, so that the rendering resolution ratio corresponding to the decoration scheme of the target scene is obtained, an offline rendering engine is called to perform (first time) offline rendering on the decoration scheme of the target scene according to the rendering resolution ratio, and the illumination-dependent element layers corresponding to the decoration scheme of the target scene are output.
And removing light in the decoration scheme of the target scene, calling an offline rendering engine to perform (second time) offline rendering on the decoration scheme of the target scene according to the rendering resolution, and outputting the element layer which does not depend on the light and corresponds to the decoration scheme of the target scene. It should be noted that, because the light is removed in the decoration scheme of the target scene, the rendering is very quick, and the rendering time is greatly reduced, so that the generation time of the decoration effect diagram can be shortened on the premise of ensuring the quality of the decoration effect diagram, the waiting time of a user is shortened, and the user experience is improved.
For example, the designer specifies a rendering resolution of 1024 × 768 for the decoration scheme of the target scene. That resolution is obtained, the offline rendering engine is invoked to perform the (first) offline rendering of the scheme at it, and the illumination-dependent element layers are output: the global illumination layer (RAWGI), reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), and self-illumination layer (SELFILLUM).
The lights are then removed from the decoration scheme of the target scene, the offline rendering engine is invoked to perform the (second) offline rendering of the light-removed scheme at the same resolution (1024 × 768), and the illumination-independent element layer, namely the diffuse reflection layer (DIFFUSE), is output.
In addition, to further reduce rendering time, shorten the generation time of the decoration effect graph, reduce user waiting time, and improve the user experience, fig. 2 shows an implementation flow diagram of the method, provided by the embodiment of the present invention, for outputting the illumination-dependent element layers corresponding to the decoration scheme. Executed by a processor, it may include the following steps:
s201, reducing the dimension of the rendering resolution ratio based on a preset resolution ratio dimension reduction strategy to obtain the rendering resolution ratio after dimension reduction.
In the embodiment of the invention, the rendering resolution corresponding to the decoration scheme of the target scene is reduced based on the preset resolution dimension reduction strategy, so that the rendering resolution after dimension reduction can be obtained.
For example, based on the preset resolution dimension-reduction strategy, the rendering resolution (1024 × 768) corresponding to the decoration scheme of the target scene is reduced by halving both its width and height, giving a reduced rendering resolution of 512 × 384.
It should be noted that the preset resolution dimension-reduction strategy may be set according to actual needs, for example scaling the width and height of the rendering resolution by 1/2, 1/3, or 1/4; the embodiment of the present invention does not limit this. Scaling each side by 1/2 is generally preferable, offering a good balance between rendering-time savings and output quality.
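As a minimal sketch of the dimension-reduction step above (the function name is hypothetical, not from the patent):

```python
def reduce_resolution(width, height, factor=2):
    """Scale each side of the render resolution down by `factor`.

    This models the preset dimension-reduction strategy: factor 2
    halves each side, factor 4 quarters it, and so on.
    """
    return width // factor, height // factor

# The example in the text: 1024x768 halved per side gives 512x384.
low_res = reduce_resolution(1024, 768, factor=2)
```

Since pixel count scales with the product of the sides, halving each side cuts the number of rendered pixels to one quarter, which is where most of the time savings come from.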
S202, performing off-line rendering on the decoration scheme according to the rendering resolution ratio after dimension reduction, and outputting each element layer which depends on illumination and corresponds to the decoration scheme.
After the rendering resolution corresponding to the decoration scheme of the target scene is reduced in dimension, an offline rendering engine is called to perform (first time) offline rendering on the decoration scheme of the target scene according to the rendering resolution after dimension reduction, and each element layer which depends on illumination and corresponds to the decoration scheme of the target scene is output.
For example, after the rendering resolution (1024 × 768) corresponding to the decoration scheme of the target scene is reduced, the offline rendering engine is invoked to perform the (first) offline rendering at the reduced resolution (512 × 384), and the illumination-dependent element layers are output: the global illumination layer (RAWGI), reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), and self-illumination layer (SELFILLUM).
It should be noted that, because the rendering resolution of the target scene is reduced in dimension, and the decoration scheme of the target scene is rendered offline according to the rendering resolution after the dimension reduction, the rendering time is further greatly reduced, so that the generation time of the decoration effect graph is further shortened, the user waiting time is shortened, and the user experience is improved.
S102, dividing each element layer depending on illumination into a first element layer and a second element layer.
In the embodiment of the present invention, for each element layer depending on illumination corresponding to the decoration scheme of the target scene, each element layer depending on illumination may be divided into a first type element layer and a second type element layer.
Wherein the first-type element layers may comprise the reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), and self-illumination layer (SELFILLUM), while the second-type element layer comprises the global illumination layer (RAWGI).
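The partition in S102 can be sketched as a simple split of the light-dependent layer names listed in this description (the variable names are illustrative, not from the patent):

```python
# Layer names as given in the description.
LIGHT_DEPENDENT = ["RAWGI", "REFLECT", "REFRACT", "SPECULAR", "SSS",
                   "CAUSTICS", "ATMOSPHERE", "BACKGROUND", "SELFILLUM"]
LIGHT_INDEPENDENT = ["DIFFUSE"]

# The global illumination layer alone forms the second type;
# every other light-dependent layer belongs to the first type.
SECOND_TYPE = ["RAWGI"]
FIRST_TYPE = [name for name in LIGHT_DEPENDENT if name not in SECOND_TYPE]
```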
S103, processing the first element layer to obtain a target mixed layer, and processing the second element layer and the element layer not depending on illumination to obtain a target element layer.
In the embodiment of the present invention, the first-type element layers are processed to obtain a target mixed layer: they can be blended in sequence by linear-dodge (additive) superposition to obtain a target mixed layer at the rendering resolution.
For example, the reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), and self-illumination layer (SELFILLUM) are blended in sequence by linear-dodge superposition to obtain a target mixed layer (MIX) at the rendering resolution.
If the offline rendering engine performed the (first) offline rendering of the decoration scheme at the reduced rendering resolution, the first-type element layers are at the reduced resolution. In that case they are first blended in sequence by linear-dodge superposition to obtain a mixed layer, and the mixed layer then undergoes super-resolution upscaling to obtain a target mixed layer at the original rendering resolution; the upscaling restores the resolution.
For example, as shown in fig. 3, the reflection layer (REFLECT), refraction layer (REFRACT), highlight layer (SPECULAR), scattering layer (SSS), caustics layer (CAUSTICS), fog layer (ATMOSPHERE), background layer (BACKGROUND), and self-illumination layer (SELFILLUM) are blended in sequence by linear-dodge superposition to obtain a mixed layer (MIX), and the mixed layer (MIX) undergoes super-resolution upscaling to obtain a target mixed layer (MIX-D) at the rendering resolution.
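A minimal NumPy sketch of this step, interpreting the "linear attenuation superposition" of the text as linear-dodge (additive) blending of float layers in [0, 1]; nearest-neighbour repetition stands in for the unspecified super-resolution model, and the toy 2x2 layers are illustrative only:

```python
import numpy as np

def linear_dodge(base, layer):
    """Linear-dodge (additive) blend, clipped to the valid range."""
    return np.clip(base + layer, 0.0, 1.0)

def upscale(img, factor=2):
    """Stand-in for super-resolution upscaling (nearest-neighbour here)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# First-type layers rendered at the reduced resolution (toy 2x2 size).
first_type_layers = [np.full((2, 2), v) for v in (0.1, 0.2, 0.3)]

mix = first_type_layers[0]
for layer in first_type_layers[1:]:
    mix = linear_dodge(mix, layer)   # MIX at the reduced resolution
mix_d = upscale(mix, factor=2)       # MIX-D back at the target resolution
```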
In addition, in the embodiment of the present invention, the second-type element layer and the illumination-independent element layer corresponding to the decoration scheme of the target scene are processed to obtain the target element layer: they can be blended by multiply ("positive film") superposition to obtain a target element layer at the rendering resolution.
For example, the global illumination layer (RAWGI) and the diffuse reflection layer (DIFFUSE) are blended by multiply superposition to obtain the target element layer at the rendering resolution, that is, a global illumination layer with diffuse reflection (GI-D).
If the offline rendering engine performed the (first) offline rendering of the decoration scheme at the reduced rendering resolution, the second-type element layer is at the reduced resolution. In that case it first undergoes super-resolution upscaling to obtain an intermediate element layer at the rendering resolution, and the intermediate element layer is then blended with the illumination-independent element layer by multiply superposition to obtain the target element layer.
For example, as shown in fig. 4, the global illumination layer (RAWGI) undergoes super-resolution upscaling to obtain an intermediate element layer at the rendering resolution, that is, a new global illumination layer (RAWGI-D); the intermediate layer (RAWGI-D) is then blended with the diffuse reflection layer (DIFFUSE) by multiply superposition to obtain the target element layer at the rendering resolution, that is, a global illumination layer with diffuse reflection (GI-D).
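A minimal NumPy sketch of this step, interpreting the "positive superposition" of the text as a multiply blend; nearest-neighbour repetition again stands in for the super-resolution model, and the layer values are toy placeholders:

```python
import numpy as np

def upscale(img, factor=2):
    """Stand-in for super-resolution upscaling (nearest-neighbour here)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

rawgi = np.full((2, 2), 0.5)    # global illumination at reduced resolution
diffuse = np.full((4, 4), 0.8)  # diffuse layer at the full resolution

rawgi_d = upscale(rawgi, factor=2)  # intermediate layer RAWGI-D
gi_d = rawgi_d * diffuse            # multiply ("positive film") blend -> GI-D
```

The multiply blend modulates the full-resolution colors and textures of DIFFUSE by the upscaled lighting, which is why the diffuse layer, not the GI layer, determines the sharpness of the result.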
And S104, generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
For the target mixed layer and the target element layer, the decoration effect map of the decoration scheme of the target scene is generated from the two: they are blended by linear-dodge superposition to produce the decoration effect map.
For example, as shown in fig. 5, the target mixed layer (MIX-D) at the rendering resolution is blended with the target element layer (GI-D), the global illumination layer with diffuse reflection, by linear-dodge superposition to obtain the final decoration effect map (COLOR-D) at the rendering resolution.
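The final merge can be sketched the same way, again reading the "linear attenuation superposition" as a linear-dodge (additive) blend; the uniform layer values are toy placeholders:

```python
import numpy as np

def linear_dodge(base, layer):
    """Linear-dodge (additive) blend, clipped to the valid range."""
    return np.clip(base + layer, 0.0, 1.0)

mix_d = np.full((4, 4), 0.3)  # blended light-dependent layers (MIX-D)
gi_d = np.full((4, 4), 0.4)   # GI-with-diffuse layer (GI-D)

color_d = linear_dodge(mix_d, gi_d)  # final effect map COLOR-D
```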
According to the technical scheme provided by the embodiment of the invention, the decoration scheme of the target scene is obtained and rendered offline to generate the illumination-dependent element layers and the illumination-independent element layer corresponding to the scheme; the illumination-dependent layers are divided into a first-type element layer and a second-type element layer; the first-type element layer is processed to obtain a target mixed layer, the second-type element layer and the illumination-independent element layer are processed to obtain a target element layer, and the decoration effect graph of the decoration scheme is generated based on the target mixed layer and the target element layer.
Because the illumination-independent element layer renders quickly, obtaining the decoration effect graph through layer processing and layer merging shortens its generation time while preserving its quality, reducing the user's waiting time and improving the user experience.
Corresponding to the foregoing method embodiment, an embodiment of the present invention further provides a decoration effect map generating device. As shown in fig. 6, the device may include: a scheme acquisition module 610, a layer determination module 620, a layer division module 630, a first processing module 640, a second processing module 650, and an effect map generation module 660.
A scheme obtaining module 610, configured to obtain a decoration scheme of a target scene;
the layer determining module 620 is configured to perform offline rendering on the decoration scheme, and generate illumination-dependent element layers and illumination-independent element layers corresponding to the decoration scheme;
the layer dividing module 630 is configured to divide the element layers depending on illumination into a first type element layer and a second type element layer;
the first processing module 640 is configured to process the first type element layer to obtain a target mixed layer;
a second processing module 650, configured to process the second type element layer and the illumination-independent element layer to obtain a target element layer;
an effect map generating module 660, configured to generate a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
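As a sketch only, the module structure of fig. 6 could be wired together as a plain container of injected callables; every name and signature below is illustrative and not part of the patent:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class EffectMapDevice:
    # Each field mirrors one module of fig. 6 (names are illustrative).
    get_scheme: Callable[[], Any]               # scheme acquisition module 610
    render_offline: Callable[[Any], tuple]      # layer determination module 620
    split_layers: Callable[[list], tuple]       # layer division module 630
    mix_first: Callable[[list], Any]            # first processing module 640
    merge_second: Callable[[list, list], Any]   # second processing module 650
    compose: Callable[[Any, Any], Any]          # effect map generation module 660

    def run(self) -> Any:
        """Execute the modules in the order the device description gives."""
        scheme = self.get_scheme()
        lit, unlit = self.render_offline(scheme)
        first, second = self.split_layers(lit)
        mixed = self.mix_first(first)
        target = self.merge_second(second, unlit)
        return self.compose(mixed, target)
```

Keeping each module an injected callable keeps the orchestration testable: stubs can replace the expensive off-line renderer while the wiring itself is exercised.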
An embodiment of the present invention further provides an electronic device, as shown in fig. 7, which includes a processor 71, a communication interface 72, a memory 73 and a communication bus 74, where the processor 71, the communication interface 72 and the memory 73 communicate with one another through the communication bus 74;
a memory 73 for storing a computer program;
the processor 71 is configured to implement the following steps when executing the program stored in the memory 73:
obtaining a decoration scheme of a target scene, performing off-line rendering on the decoration scheme, and generating illumination-dependent element layers and illumination-independent element layers corresponding to the decoration scheme; dividing the illumination-dependent element layers into first type element layers and second type element layers; processing the first type element layers to obtain a target mixed layer, and processing the second type element layers and the illumination-independent element layers to obtain a target element layer; and generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a non-volatile memory, for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, such as a Central Processing Unit (CPU) or a Network Processor (NP); it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In another embodiment of the present invention, a storage medium is further provided, where instructions are stored, and when the instructions are executed on a computer, the instructions cause the computer to execute the decoration effect map generating method in any one of the above embodiments.
In another embodiment of the present invention, there is also provided a computer program product including instructions, which when run on a computer, cause the computer to execute the decoration effect map generating method according to any one of the above embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a storage medium or transmitted from one storage medium to another, for example from one website, computer, server, or data center to another via a wired connection (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or a wireless connection (e.g., infrared, radio, microwave). The storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)).
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (12)

1. A decoration effect map generation method is characterized by comprising the following steps:
obtaining a decoration scheme of a target scene, performing off-line rendering on the decoration scheme, and generating illumination-dependent element layers and illumination-independent element layers corresponding to the decoration scheme;
dividing each element layer depending on illumination into a first element layer and a second element layer;
processing the first element layer to obtain a target mixed layer, and processing the second element layer and the element layer which does not depend on illumination to obtain a target element layer;
and generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
2. The method of claim 1, wherein the performing offline rendering on the decoration scheme to generate illumination-dependent element layers and illumination-independent element layers corresponding to the decoration scheme comprises:
obtaining a rendering resolution corresponding to the decoration scheme, performing off-line rendering on the decoration scheme according to the rendering resolution, and outputting each element layer which depends on illumination and corresponds to the decoration scheme;
and removing the light in the decoration scheme, performing off-line rendering on the decoration scheme after the light is removed according to the rendering resolution, and outputting the element layer which does not depend on the illumination and corresponds to the decoration scheme.
3. The method of claim 2, wherein the offline rendering of the decoration scheme according to the rendering resolution and outputting of the illumination-dependent element layers corresponding to the decoration scheme comprises:
down-scaling the rendering resolution based on a preset resolution down-scaling strategy to obtain a reduced rendering resolution;
and performing offline rendering on the decoration scheme according to the reduced rendering resolution, and outputting each element layer which depends on illumination and corresponds to the decoration scheme.
4. The method according to claim 2, wherein said processing the first type element layer to obtain a target mixed layer comprises:
and sequentially carrying out linear attenuation and superposition on the first element layer to obtain a target mixed layer of the rendering resolution.
5. The method according to claim 3, wherein said processing the first type element layer to obtain a target mixed layer comprises:
and sequentially carrying out linear attenuation and superposition on the first type element layer to obtain a mixed layer, and carrying out super-resolution amplification on the mixed layer to obtain a target mixed layer with the rendering resolution.
6. The method according to claim 2, wherein said processing the second type element layer and the illumination-independent element layer to obtain a target element layer comprises:
and performing positive superposition on the second element layer and the element layer which does not depend on illumination to obtain a target element layer with the rendering resolution.
7. The method according to claim 3, wherein said processing the second type element layer and the illumination-independent element layer to obtain a target element layer comprises:
performing super-resolution amplification on the second element layer to obtain an intermediate element layer at the rendering resolution;
and performing positive superposition on the intermediate element layer and the element layer which does not depend on illumination to obtain a target element layer.
8. The method of claim 1, wherein generating the finishing effect map of the finishing scheme based on the target mixed layer and the target element layer comprises:
and performing linear attenuation superposition on the target mixed layer and the target element layer to generate a decoration effect map of the decoration scheme.
9. The method according to any one of claims 1 to 8, wherein the second type element layers comprise a global illumination layer, the illumination-independent element layers comprise diffuse reflection layers, and the target element layers comprise global illumination layers with diffuse reflection.
10. A finishing effect map generating apparatus, comprising:
the scheme acquisition module is used for acquiring a decoration scheme of a target scene;
the layer determining module is used for performing off-line rendering on the decoration scheme to generate each element layer depending on illumination and each element layer not depending on illumination corresponding to the decoration scheme;
the layer dividing module is used for dividing each element layer depending on illumination into a first type element layer and a second type element layer;
the first processing module is used for processing the first type element layer to obtain a target mixed layer;
the second processing module is used for processing the second type element layer and the element layer which does not depend on illumination to obtain a target element layer;
and the effect map generating module is used for generating a decoration effect map of the decoration scheme based on the target mixed layer and the target element layer.
11. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 9 when executing a program stored on a memory.
12. A storage medium on which a computer program is stored, which program, when being executed by a processor, carries out the method according to any one of claims 1 to 9.
CN202110970243.8A 2021-08-23 2021-08-23 Decoration effect graph generation method and device, electronic equipment and storage medium Pending CN113628313A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110970243.8A CN113628313A (en) 2021-08-23 2021-08-23 Decoration effect graph generation method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN113628313A true CN113628313A (en) 2021-11-09

Family

ID=78387237

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110970243.8A Pending CN113628313A (en) 2021-08-23 2021-08-23 Decoration effect graph generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113628313A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0856815A2 (en) * 1997-01-31 1998-08-05 Microsoft Corporation Method and system for determining and/or using illumination maps in rendering images
CN103761760A (en) * 2014-01-07 2014-04-30 珠海宜高科技有限公司 Method for manufacturing multi-view indoor design effect picture
CN112070875A (en) * 2020-09-11 2020-12-11 网易(杭州)网络有限公司 Image processing method, image processing device, electronic equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
火星时代教育: "Compositing VRay Layered Renders in Photoshop" (在PS里合成VRay分层渲染图), Retrieved from the Internet <URL:https://jiaocheng.hxsd.com/course/content/3412/> *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination