CN112070875A - Image processing method, image processing device, electronic equipment and storage medium - Google Patents

Image processing method, image processing device, electronic equipment and storage medium

Info

Publication number: CN112070875A
Authority: CN (China)
Prior art keywords: image, layer, rendering, processed, light
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202010956505.0A
Other languages: Chinese (zh)
Inventors: 赵海峰, 王凯, 胡一博, 阮浩
Current Assignee: Netease Hangzhou Network Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Netease Hangzhou Network Co Ltd
Filing date / priority date: 2020-09-11 (application filed by Netease Hangzhou Network Co Ltd; priority to CN202010956505.0A)
Publication date: 2020-12-11 (publication of CN112070875A)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/50: Lighting effects
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium. The method and apparatus enable the rendering layer data generated during the intermediate stages of the rendering engine's pipeline, namely the light layer, to be output independently, provide data support for reprocessing by other rendering engines on the basis of that layer data, and make it possible to combine the strengths of multiple rendering engines to produce high-quality CG animation.

Description

Image processing method, image processing device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of the electronic game industry, players demand increasingly sophisticated game animation. This has driven the growth of various CG (Computer Graphics) film and animation production engines, and even game production engines now offer CG film and animation production functions.
Existing CG film and animation engines each have strengths in different aspects and their own shortcomings. Moreover, the animation images output by any single CG film and animation engine do not expose the rendering layer data generated during the intermediate stages of rendering; that is, most image data produced by a CG film and animation engine cannot be used directly by other software or engines.
As a result, CG developers cannot effectively combine the complementary strengths of different CG film and animation production engines to obtain high-quality CG animation.
Disclosure of Invention
The application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium to solve the problem in the prior art that the layer data generated during image rendering cannot be split out and output, which prevents multiple CG film and animation production engines from being used together to obtain high-quality CG animation.
In a first aspect, the present application provides an image processing method, including:
acquiring an image to be processed and a rendering setting instruction, wherein the image to be processed comprises a plurality of areas to be rendered, each area to be rendered has different material properties, and the rendering setting instruction comprises a light setting parameter and a material setting parameter;
and generating a light layer by using a first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed, wherein the light layer is used to generate a rendered image corresponding to the image to be processed.
In one possible design, the generating, by using a first preset rendering engine, a light layer according to the light setting parameter, the material setting parameter, and the image to be processed includes:
separating a target material layer corresponding to the target material parameter from the image to be processed according to the target material parameter in the material setting parameters by using the first preset rendering engine;
and generating a target light layer according to the target material layer and the light setting parameters, wherein the light layer comprises the target light layer.
Optionally, after generating the light layer by using the first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed, the method further includes:
and generating the rendered image according to a second preset rendering engine and all the light layers.
In one possible design, the image processing method further includes:
filling each target object in the image to be processed with a different color, and generating a color blocking layer.
Optionally, the filling each target object in the image to be processed with a different color includes:
assigning different RGB values to the color channels of the target objects according to a preset random algorithm.
Optionally, after filling each target object in the image to be processed with a different color, the method further includes:
generating the rendered image according to a second preset rendering engine, the light layer, and the color blocking layer.
In one possible design, the light setting parameters are used to set light characteristics of at least one of a balance layer, a reflection layer, a diffuse reflection layer, a highlight layer, a metallic layer, a self-luminous layer, and a sub-surface scattering layer.
In a second aspect, the present application provides an image processing apparatus comprising:
an acquisition module, configured to acquire an image to be processed and a rendering setting instruction, wherein the image to be processed comprises a plurality of areas to be rendered, each area to be rendered has different material characteristics, and the rendering setting instruction comprises a light setting parameter and a material setting parameter;
and the processing module is used for generating a light layer according to the light setting parameters, the material setting parameters and the image to be processed by utilizing a first preset rendering engine, and the light layer is used for generating a rendering image corresponding to the image to be processed.
In one possible design, the processing module is configured to generate a light layer according to the light setting parameter, the material setting parameter, and the image to be processed by using a first preset rendering engine, and includes:
the processing module is used for separating a target material layer corresponding to the target material parameter from the image to be processed according to the first preset rendering engine and the target material parameter in the material setting parameters;
the processing module is further configured to generate a target lighting layer according to the target material layer and the first preset rendering engine, where the lighting layer includes the target lighting layer.
Optionally, the processing module is configured to, after the generating the light layer according to the light setting parameter, the material setting parameter, and the to-be-processed image by using a first preset rendering engine, further include:
the processing module is further configured to generate the rendering image according to a second preset rendering engine and all the light layers.
In one possible design, the processing module is further configured to fill different colors for each target object in the image to be processed, and generate a color blocking layer.
Optionally, the processing module is further configured to fill different colors for each target object in the image to be processed, and includes:
the processing module is further used for endowing different RGB values to the color channels of the target objects according to a preset random algorithm.
Optionally, after the processing module is further configured to fill different colors into each target object in the image to be processed, the processing module further includes:
the processing module is further configured to generate the rendered image according to a second preset rendering engine, all the light layers, and the color blocking layer.
In a third aspect, the present application provides an electronic device, comprising:
a memory for storing program instructions;
and the processor is used for calling and executing the program instructions in the memory to execute any one of the possible image processing methods provided by the first aspect.
In a fourth aspect, the present application provides a storage medium, in which a computer program is stored, the computer program being configured to execute any one of the possible image processing methods provided by the first aspect.
The application provides an image processing method, an image processing apparatus, an electronic device, and a storage medium. The method and apparatus enable the rendering layer data generated during the intermediate stages of the rendering engine's pipeline, namely the light layer, to be output independently, provide data support for reprocessing by other rendering engines on the basis of that layer data, and make it possible to combine the strengths of multiple rendering engines to produce high-quality CG animation.
Drawings
In order to illustrate the technical solutions of the present application or the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram illustrating a particle effect in a CG image generated by a CG animation engine according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 3a to 3g are schematic diagrams of material layer images after splitting of an image to be processed according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of another image processing method according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating a principle of generating a target material layer according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram illustrating an object color channel rendering process in an image processing method according to an embodiment of the present disclosure;
FIGS. 7a-7b are schematic diagrams of color partition layers provided in embodiments of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus provided in the present application;
fig. 9 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments, including but not limited to combinations of embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any inventive step are within the scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Layered rendering is an essential step in the CG film and animation production process. The layer sequences produced by layered rendering can be processed or further refined in post-production software to achieve a better final result.
Existing CG film and animation production engines each have strengths in different aspects and their own shortcomings; for example, some software or engines are better at rendering, while others are better at post-production work such as depth of field, glow, motion blur, and color correction. Moreover, the animation images output by any single CG film and animation engine do not expose the rendering layer data generated during the intermediate stages of rendering; that is, most image data produced by a CG film and animation engine cannot be used directly by other software or engines.
As a result, CG developers cannot effectively combine the complementary strengths of different CG film and animation production engines to obtain high-quality CG animation. In other words, in the prior art the layer data generated during image rendering cannot be split out and output, so multiple CG film and animation production engines cannot be used together to obtain high-quality CG animation.
Take the Unreal Engine, a game development engine with CG film and animation production capability, as an example. Its depth-of-field effect is computed in screen space, which can produce incorrect depth of field at model edges and an unsatisfactory light-scattering (bokeh) effect. In addition, the Unreal Engine's glow effect lowers the saturation of the whole picture, and the particle effects in CG images generated by the Unreal Engine cannot compute motion blur.
Fig. 1 is a schematic diagram of a particle effect in a CG image generated by a conventional CG animation engine according to an embodiment of the present disclosure. As shown in Fig. 1, the white light spots are the particle effect in the CG image; however, the comet-like tail that the particles should leave behind while moving is not visible in the figure, i.e. motion blur is not reflected.
Compared with dedicated professional color-correction software or engines, the Unreal Engine's color correction system is relatively simple and hard to control precisely. Furthermore, the Unreal Engine cannot output its lighting render data as separate layers, which is unfavorable for final compositing and adjustment in post-production.
In order to solve the above problems, the present application provides an image processing method, an apparatus, an electronic device, and a storage medium, which are specifically described in the following embodiments.
Fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present application. As shown in fig. 2, the method includes the following specific steps:
s201, acquiring an image to be processed and rendering a setting instruction.
In this step, the image to be processed includes a plurality of regions to be rendered, wherein each region to be rendered has different material characteristics, and the rendering setting instruction includes a light setting parameter and a material setting parameter.
Specifically, the image to be processed may be a two-dimensional image obtained from a three-dimensional model, or an image that has already undergone preliminary processing, such as the addition of texture, color, and shading effects. Different areas of the image to be processed represent different content; for example, a cartoon character has clothes, limbs, ornaments, a face, and so on, each corresponding to a different material in order to achieve a better visual result. In CG film and animation, a material represents a pattern, a texture, a surface shading effect, or a combination of the three. For example, light, thin clothing corresponds to the pattern and texture of gauze, which constitutes the material characteristics of that clothing. Clearly, different pattern regions correspond to different material characteristics, and images with different styles can be obtained by changing the material characteristics of different areas of the image to be processed. By analogy, this is like coloring a line drawing: different colors represent different material effects, and the same line drawing can yield a variety of colored pictures by filling its regions with different colors. This is the most basic form of image rendering.
The rendering setting instruction is an instruction entered manually by a CG film and animation designer during production, and it comprises light setting parameters and material setting parameters. The light setting parameters indicate the position of the light source and the type of light source, such as a point light source or a line light source, and may also include the number of light sources. Light is an important basis for rendering CG film and animation, because many material effects appear different under different lighting. The material setting parameters configure the materials of the different areas to be rendered, or of the background area of the whole image.
In this embodiment, material maps corresponding to specific shading effects, such as highlight, diffuse reflection, roughness, and metallic texture, may also be preset and stored in a material library. The rendering setting instruction then sets the corresponding material mapping parameter for a specific rendering effect layer.
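By way of a non-limiting illustration, the rendering setting instruction described above could be represented by a small data structure such as the following Python sketch. The class and field names are illustrative assumptions introduced here for explanation and are not part of the application.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LightSetting:
    """Light setting parameters: light source position, type and count (see S201)."""
    position: Tuple[float, float, float]  # position of the light source
    source_type: str                      # e.g. "point" or "line"
    count: int = 1                        # number of light sources of this kind

@dataclass
class MaterialSetting:
    """Material setting parameters: map a region to be rendered to a material map."""
    region_id: str      # region to be rendered, e.g. "clothes" or "face"
    material_map: str   # name of a preset map in the material library,
                        # e.g. "highlight", "diffuse", "roughness", "metallic"

@dataclass
class RenderSettingInstruction:
    """The rendering setting instruction acquired together with the image to be processed."""
    lights: List[LightSetting]
    materials: List[MaterialSetting]
```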
S202, generating a light layer according to the light setting parameters, the material setting parameters and the image to be processed by using a first preset rendering engine.
In this step, the light layer is used to generate the rendered image corresponding to the image to be processed.
According to the light setting parameters, the first preset rendering engine can invoke its built-in lighting effect rendering function to select a type of light.
In order to split out each material layer added to the image to be rendered during rendering by the first preset rendering engine, so that subsequent processing programs or rendering engines can further optimize the result, the present application establishes a material parameter collection object from the material setting parameters. A plurality of material mapping parameters are defined in this collection object, including a hidden material parameter. The hidden material parameter is applied to materials that are not splitting targets: once it is set, the first preset rendering engine calls the corresponding hidden material map in the material library. The hidden material map can be set to the same color as the background while still feeding back image data for the lighting effect. As a result, after the first preset rendering engine finishes rendering, the output is the image data of each split material layer under the specified light.
Fig. 3a to 3g are schematic diagrams of the material layer images obtained after splitting an image to be processed according to an embodiment of the present disclosure. Fig. 3a is the image of the split balance layer, Fig. 3b the split metallic layer, Fig. 3c the split diffuse reflection layer, Fig. 3d the split highlight layer, Fig. 3e the split reflection layer, Fig. 3f the split sub-surface scattering layer, and Fig. 3g the split self-luminous layer.
As shown in Fig. 3a to 3g, by assigning hidden materials to the material layers that are not the current target under a specific light, the image regions belonging to other material layers can be filtered out, thereby splitting out material images with a specific rendered lighting effect.
Multiple images corresponding to different material layers can be obtained under the same type of light; these images are then classified and stored according to the material and light attributes to which they belong, so that the set of images sharing the same light attribute forms a light layer image.
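For illustration only, the following Python sketch shows one split rendering pass as just described: every material layer that is not a splitting target is given the hidden material map, the chosen light is applied, and the output is tagged with its material and light attributes so that outputs sharing the same light attribute can later be grouped into a light layer. The engine wrapper and its assign_material / set_light / render calls are assumed names, not an actual engine API.

```python
def split_render_pass(engine, image, light, target_materials, all_materials,
                      hidden_material="hidden_background"):
    """Render one split material-layer image under one light setting."""
    for material in all_materials:
        if material in target_materials:
            # Keep the target material layer visible: use its own material map.
            engine.assign_material(layer=material, material_map=material)
        else:
            # Non-target layers get the hidden material map, which matches the
            # background color but still feeds back lighting data.
            engine.assign_material(layer=material, material_map=hidden_material)
    engine.set_light(light)
    layer_image = engine.render(image)
    # Tag the output with its material and light attributes; outputs that share
    # the same light attribute together form one light layer.
    return {"light": light, "materials": tuple(target_materials), "image": layer_image}
```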
By processing the images of the multiple light layers with a second preset rendering engine or other post-processing software, high-quality CG film and animation can be obtained.
It should be noted that the execution carrier of the image processing method of the present application may invoke the first preset rendering engine as an underlying tool, or the first preset rendering engine may be integrated into the execution carrier, where the execution carrier may be an apparatus, a device, a computer program, and so on. Similarly, the execution carrier may also invoke the second preset rendering engine as an underlying tool or integrate it. Of course, in the embodiments of the present application the execution carrier does not necessarily include the second preset rendering engine; it may call only the first preset rendering engine without calling the second one.
This embodiment provides an image processing method: an image to be processed and a rendering setting instruction are acquired, a first preset rendering engine generates a light layer from the light setting parameters, the material setting parameters, and the image to be processed, and the light layer is used to generate the rendered image corresponding to the image to be processed. In this way, the rendering layer data generated during the intermediate stage of rendering, namely the light layer, is output independently, which provides data support for further processing by other rendering engines and makes it possible to combine the strengths of multiple rendering engines to produce high-quality CG animation.
Fig. 4 is a schematic flowchart of another image processing method according to an embodiment of the present application. As shown in fig. 4, the image processing method includes the specific steps of:
s401, acquiring an image to be processed and rendering a setting instruction.
The step is similar to S301, and for the detailed explanation of the terms and principles, refer to S301, which is not described herein again.
S402, separating a target material layer corresponding to the target material parameter from the image to be processed by using the first preset rendering engine according to the target material parameter in the material setting parameters.
The target material parameter is a mapping parameter that associates the material of the image area corresponding to the target material layer with a material pre-stored in the material library. During rendering, the first preset rendering engine only needs to look up the corresponding material in the material library as indicated by the target material parameter, so no extra computation is spent on material rendering. Furthermore, thanks to the target material parameter, the engine can switch automatically just by modifying this parameter, producing renders with different effects. In this way, multiple renders with different effects, that is, multiple target material layers with the same shape but different appearance, can be obtained automatically, which greatly enriches the basic material data of CG film animation and provides richer data support for subsequent development.
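As a hedged illustration of the mapping just described, the following Python sketch models a material library keyed by map name and a helper that re-points the target material parameter of a layer at a different library entry; the library contents and function names are assumptions made for this example only.

```python
# Hypothetical material library: preset material maps stored by name.
MATERIAL_LIBRARY = {
    "gauze":  {"roughness": 0.8, "metallic": 0.0},
    "silk":   {"roughness": 0.2, "metallic": 0.1},
    "metal":  {"roughness": 0.1, "metallic": 1.0},
    "hidden": {"matches_background": True},  # the hidden material map
}

def switch_target_material(render_settings, layer_name, new_material):
    """Re-point the target material parameter of one layer at another library entry.

    Only the mapping parameter changes; the engine looks the material up in
    MATERIAL_LIBRARY at render time, so no extra material computation is needed
    and renders with different effects can be produced automatically.
    """
    if new_material not in MATERIAL_LIBRARY:
        raise KeyError(f"{new_material!r} is not in the material library")
    render_settings[layer_name]["material_map"] = new_material
    return render_settings

# Example: switch the clothes layer from gauze to silk before the next render.
settings = {"clothes": {"material_map": "gauze"}}
settings = switch_target_material(settings, "clothes", "silk")
```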
Fig. 5 is a schematic diagram illustrating a generation principle of a target material layer according to an embodiment of the present disclosure. As shown in FIG. 5, the target material parameters serve to quickly switch the type of target material in the material library. In addition, a key parameter in this step is the hidden material parameter. The hidden material parameter is also a mapping parameter of the material library, and is used for hiding an image area which does not belong to the target material layer.
The final purpose of this embodiment is to obtain images of individual split target material layers, unlike the prior art, which only obtains an image in which multiple material layers are superimposed without being split. At the same time, an existing rendering engine can be reused without developing a new engine with a built-in splitting function. To achieve this, the first preset rendering engine is called repeatedly, and each call outputs only one target material layer, or at most a subset of material layers smaller than the full set.
For example, in this embodiment the image to be processed contains seven material layers: the balance layer, the reflection layer, the diffuse reflection layer, the highlight layer, the metallic layer, the self-luminous layer, and the sub-surface scattering layer. A target material parameter can be set for the highlight layer while the material parameters of all other layers are set to the hidden material parameter, with the hidden material corresponding to the hidden material parameter chosen according to the type of light and/or the background or color of the image to be processed. When the first preset rendering engine then renders, it produces an image containing only the highlight layer, and the target material layer of the highlight layer is successfully separated. Likewise, if both the highlight layer and the diffuse reflection layer are set as target material parameters and the other layers are set to hidden material parameters, the rendered result is the combined target material layer of the highlight and diffuse reflection layers.
One skilled in the art can select a single material layer or a combination of multiple material layers as the target material layer according to actual situations, and the application is not limited herein.
S403, generating a target light layer according to the target material layer and the light setting parameters.
In this embodiment the light layer includes the target light layer; that is, a light layer may be an image obtained by rendering a target material layer under a specified lighting effect, which is the target light layer. Since there are many possible lighting effects, multiple target light layers may exist.
After the target material layer is prepared, the first preset rendering engine selects a lighting effect, for example a point light source at the upper right, according to the light setting parameters in the rendering setting instruction. Because the same material yields renders with different effects under different lighting, the image obtained after the target material layer is rendered under that light carries both the material attribute and the light attribute.
It should be noted that in this embodiment S402 and S403 are completed together automatically by the rendering process invoked by the first preset rendering engine, and their order is not limited to the order described in this embodiment.
In one possible design, to enable fully automatic rendering of all the light layers and material layers, the light layers and material layers can be organized structurally. Each light layer may contain seven different material layers: the balance layer, the reflection layer, the diffuse reflection layer, the highlight layer, the metallic layer, the self-luminous layer, and the sub-surface scattering layer. Each material layer corresponds to a rendering parameter setting array that stores its rendering settings, including the light layer to which the current material layer belongs, the list of material layer types it contains (highlight layer, reflection layer, and so on), and the list of material layer objects that need to be hidden.
The rendering parameter setting array is generated from the rendering setting instruction, and a recursive layered rendering function is then generated. This function automatically and repeatedly calls the first preset rendering engine to render each target material layer and the corresponding target light layer. On initialization, the recursive layered rendering function creates a render queue, which is an ordered set of all the target material layers that need to be split. Each time the first preset rendering engine is called to generate a target light layer, the target material layer whose split has been completed is automatically removed from the render queue; the function then resets the rendering parameters for the next target material layer and calls the first preset rendering engine again. This process repeats until the render queue is empty, at which point the recursion ends. In this way, multiple target light layers, and the set of light layer images corresponding to them, are obtained.
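The queue-driven recursion described above might look like the following Python sketch, which reuses the split_render_pass sketch given earlier; the layer names and the structure of each queue entry are illustrative assumptions rather than the patent's actual data format.

```python
from collections import deque

MATERIAL_LAYERS = ["balance", "reflection", "diffuse_reflection", "highlight",
                   "metallic", "self_luminous", "subsurface_scattering"]

def build_render_queue(lights):
    """Build the ordered render queue: one entry per (light, target material layer)
    pair, mirroring the rendering parameter setting array described above."""
    queue = deque()
    for light in lights:
        for target in MATERIAL_LAYERS:
            queue.append({
                "light": light,
                "targets": [target],                                    # material layer type list
                "hidden": [m for m in MATERIAL_LAYERS if m != target],  # layers to hide
            })
    return queue

def recursive_layered_render(engine, image, queue, results=None):
    """Pop one entry, render it with the first preset engine, then recurse
    until the queue is empty, at which point the recursion ends."""
    if results is None:
        results = []
    if not queue:
        return results
    entry = queue.popleft()
    results.append(split_render_pass(engine, image, entry["light"],
                                     entry["targets"], MATERIAL_LAYERS))
    return recursive_layered_render(engine, image, queue, results)
```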
S404, generating a rendered image according to a second preset rendering engine and all the light layers.
In this step, all the light layers generated by the first preset rendering engine are combined and rendered to generate a high-quality rendered image.
It should be noted that the execution carrier of the image processing method of the present application may invoke the first preset rendering engine as an underlying tool, or the first preset rendering engine may be integrated into the execution carrier, where the execution carrier may be an apparatus, a device, a computer program, and so on. Similarly, the execution carrier may also invoke the second preset rendering engine as an underlying tool or integrate it.
It should also be noted that the rendering effects of the first preset rendering engine and the second preset rendering engine are different. After the split light layers rendered by the first rendering engine are obtained, the second rendering engine is introduced in order to meet higher-level rendering requirements or to obtain different rendering effects; all the light layers are fed into the second rendering engine as rendering material to generate the rendered image.
It should further be noted that the light layers rendered by the first rendering engine may also be post-processed by a variety of other software, with a rendering engine dedicated to combining light layers finally merging all of them to obtain a high-quality rendered image.
This embodiment provides an image processing method in which an image to be processed and a rendering setting instruction are acquired, a first preset rendering engine generates light layers from the light setting parameters, the material setting parameters, and the image to be processed, and a second preset rendering engine performs a second rendering pass over all the light layers to generate the rendered image. In this way, the rendering layer data generated during the intermediate stage of rendering, namely the light layers, is output independently, providing data support for further processing by other rendering engines and making it possible to combine the strengths of multiple rendering engines to produce high-quality CG animation.
In one possible design, the image processing method further includes: filling each target object in the image to be processed with a different color attribute and generating a color blocking layer. The color blocking layer stores each target object in the image to be processed as separate layered image data, yielding a color block for every target object, so that the color of any target object can be conveniently corrected or replaced later.
Fig. 6 is a schematic diagram of the object color channel rendering process in the image processing method according to an embodiment of the present disclosure. As shown in Fig. 6, filling each target object in the image to be processed with a different color attribute and generating the color blocking layer specifically includes:
S601, assigning different RGB values to the color channels of the target objects according to a preset random algorithm.
In this step, the image to be processed may contain multiple target objects. The segmentation information of the target objects may be included in the data corresponding to the image to be processed, or the target objects may be segmented according to the material layers already obtained.
To facilitate subsequent color correction or color replacement in CG film animation images, a color blocking layer is needed to help downstream software quickly identify each target object. In this step, a preset random algorithm therefore randomly assigns an RGB color value to every object in the scene shown by the image to be processed, so that different objects receive different colors and the color blocking layer is obtained.
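As a minimal sketch of such a preset random algorithm (the exact algorithm is not specified in the application), the following Python function assigns every target object a distinct random RGB value using a fixed seed so the assignment is reproducible.

```python
import random

def assign_color_block_colors(object_ids, seed=0):
    """Give every target object in the scene a distinct random RGB value;
    these values form the per-object colors of the color blocking layer."""
    rng = random.Random(seed)  # fixed seed: the "preset" part of the random algorithm
    assigned, used = {}, set()
    for obj in object_ids:
        while True:
            rgb = (rng.randint(0, 255), rng.randint(0, 255), rng.randint(0, 255))
            if rgb not in used:  # ensure no two objects share a color
                used.add(rgb)
                assigned[obj] = rgb
                break
    return assigned

# Example with the objects of Fig. 7a: sphere, chair and ground get distinct colors.
print(assign_color_block_colors(["sphere", "chair", "ground"]))
```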
Fig. 7a and 7b are schematic diagrams of the color blocking layer according to an embodiment of the present application. Fig. 7a shows the image to be processed before color channel rendering: it contains a sphere carrying a texture pattern, a chair, the ground, and the shadows cast on the ground by the sphere and the chair. Fig. 7b shows the color blocking layer obtained by randomly distributing colors: the sphere, the chair, and the ground are separated into distinct color blocks, which makes operations such as color correction convenient in subsequent CG film animation software.
S602, generating the rendered image according to the second preset rendering engine, all the light layers, and the color blocking layer.
In this step, the second preset rendering engine combines all the light layers, may add new rendering effects, and corrects and adjusts image attributes of the objects in the rendered image, such as color, texture, and shading, according to the color blocking layer, so as to obtain a high-quality CG film animation image.
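The application does not specify how the second preset rendering engine combines the layers; purely as an assumed illustration, the following Python/NumPy sketch combines the light layers additively and then uses the color blocking layer as a selection mask to correct the color of a single object.

```python
import numpy as np

def composite_light_layers(light_layers):
    """Additively combine the split light layers into one base image.
    light_layers: list of float arrays of shape (H, W, 3) with values in [0, 1]."""
    base = np.zeros_like(light_layers[0])
    for layer in light_layers:
        base += layer  # assumed additive combination of lighting contributions
    return np.clip(base, 0.0, 1.0)

def correct_object_color(image, color_block, object_rgb, gain):
    """Use the color blocking layer as a selection mask: scale only those pixels
    whose block color matches the chosen object's RGB value (0-255)."""
    target = np.array(object_rgb, dtype=np.float64) / 255.0
    mask = np.all(np.isclose(color_block, target), axis=-1)
    corrected = image.copy()
    corrected[mask] = np.clip(corrected[mask] * np.asarray(gain), 0.0, 1.0)
    return corrected
```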
This embodiment provides an image processing method in which an image to be processed and a rendering setting instruction are acquired, a first preset rendering engine generates light layers from the light setting parameters, the material setting parameters, and the image to be processed, a second preset rendering engine performs a second rendering pass over all the light layers to generate the rendered image, and the attributes of objects in the image are corrected and adjusted according to the color blocking layer. In this way, the rendering layer data generated during the intermediate stage of rendering, namely the light layers, is output independently, providing data support for further processing by other rendering engines and making it possible to combine the strengths of multiple rendering engines to produce high-quality CG animation.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments can be implemented by hardware driven by program instructions. The program may be stored in a computer-readable storage medium, and when executed it performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to the present application. The image processing apparatus may be implemented by software, hardware, or a combination of both.
As shown in fig. 8, the image processing apparatus 800 according to the present embodiment includes:
an obtaining module 801, configured to obtain an image to be processed and a rendering setting instruction, where the image to be processed includes multiple areas to be rendered, each of the areas to be rendered has different material characteristics, and the rendering setting instruction includes a lighting setting parameter and a material setting parameter;
the processing module 802 is configured to generate a light layer according to the light setting parameter, the material setting parameter, and the to-be-processed image by using a first preset rendering engine, where the light layer is used to generate a rendering image corresponding to the to-be-processed image.
In one possible design, the processing module 802 is configured to generate a lighting layer according to the lighting setting parameter, the material setting parameter, and the image to be processed by using a first preset rendering engine, and includes:
the processing module 802 is configured to separate a target material layer corresponding to the target material parameter from the image to be processed according to the first preset rendering engine and the target material parameter in the material setting parameters;
the processing module 802 is further configured to generate a target lighting layer according to the target material layer and the first preset rendering engine, where the lighting layer includes the target lighting layer.
Optionally, the processing module 802 is configured to, after generating the light layer according to the light setting parameter, the material setting parameter, and the to-be-processed image by using a first preset rendering engine, further include:
the processing module 802 is further configured to generate the rendering image according to a second preset rendering engine and all the light layers.
In one possible design, the processing module 802 is further configured to fill different color attributes for each target object in the image to be processed, and generate a color blocking layer.
Optionally, the processing module 802 is further configured to fill different color attributes for each target object in the image to be processed, including:
the processing module 802 is further configured to assign different RGB values to the color channels of the target objects according to a preset random algorithm.
Optionally, after the processing module 802 is further configured to fill different color attributes for each target object in the image to be processed, the method further includes:
the processing module 802 is further configured to generate the rendered image according to a second preset rendering engine, all the light layers, and the color blocking layer.
It should be noted that the image processing apparatus provided in the embodiment shown in fig. 8 can execute the image processing method provided in any of the above method embodiments, and the specific implementation principle, technical features, term interpretation, and technical effects thereof are similar and will not be described herein again.
Fig. 9 is a schematic structural diagram of an electronic device provided in the present application. As shown in Fig. 9, the electronic device 900 may include: at least one processor 901 and a memory 902. Fig. 9 shows the electronic device with one processor as an example.
The memory 902 is configured to store programs. Specifically, the program may include program code containing computer operating instructions.
The memory 902 may comprise high-speed RAM and may also include non-volatile memory, such as at least one magnetic disk storage.
The processor 901 is configured to execute computer-executable instructions stored in the memory 902 to implement the image processing method described in the above method embodiments.
The processor 901 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
Alternatively, the memory 902 may be separate or integrated with the processor 901. When the memory 902 is a device independent of the processor 901, the electronic device 900 may further include:
a bus 903 for connecting the processor 901 and the memory 902. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. Buses may be classified into address buses, data buses, control buses, and so on; the bus here does not represent only one bus or one type of bus.
Alternatively, in a specific implementation, if the memory 902 and the processor 901 are integrated into a chip, the memory 902 and the processor 901 may complete communication through an internal interface.
The present application also provides a computer-readable storage medium, which may include: a variety of media that can store program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and in particular, the computer-readable storage medium stores program instructions for the image processing method in the above embodiments.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. An image processing method, comprising:
acquiring an image to be processed and a rendering setting instruction, wherein the image to be processed comprises a plurality of areas to be rendered, each area to be rendered has different material properties, and the rendering setting instruction comprises a light setting parameter and a material setting parameter;
and generating a light layer by using a first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed, wherein the light layer is used to generate a rendered image corresponding to the image to be processed.
2. The method according to claim 1, wherein the generating a light layer by using a first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed comprises:
separating a target material layer corresponding to a target material parameter from the image to be processed by using the first preset rendering engine according to the target material parameter in the material setting parameters;
and generating the light layer according to the target material layer and the light setting parameter.
3. The image processing method according to claim 1 or 2, wherein after the generating the light layer by using the first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed, the method further comprises:
and generating the rendered image according to a second preset rendering engine and all the light layers.
4. The image processing method according to claim 1, further comprising:
filling different colors for each target object in the image to be processed, and generating color blocking layers.
5. The image processing method according to claim 4, wherein the filling different colors for each target object in the image to be processed comprises:
and assigning different RGB values to the color channels of the target objects according to a preset random algorithm.
6. The image processing method according to claim 4 or 5, further comprising, after the filling different colors for respective target objects in the image to be processed:
and generating the rendered image according to a second preset rendering engine, the light layer, and the color blocking layer.
7. The image processing method according to any one of claims 1 to 2 and 4 to 5, wherein the light setting parameters are used to set light characteristics of at least one of a balance layer, a reflection layer, a diffuse reflection layer, a highlight layer, a metallic layer, a self-luminous layer, and a sub-surface scattering layer.
8. An image processing apparatus, comprising:
an acquisition module, configured to acquire an image to be processed and a rendering setting instruction, wherein the image to be processed comprises a plurality of areas to be rendered, each area to be rendered has different material characteristics, and the rendering setting instruction comprises a light setting parameter and a material setting parameter;
and a processing module, configured to generate a light layer by using a first preset rendering engine according to the light setting parameter, the material setting parameter, and the image to be processed, wherein the light layer is used to generate a rendered image corresponding to the image to be processed.
9. An electronic device, comprising:
a processor; and the number of the first and second groups,
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the image processing method of any of claims 1 to 7 via execution of the executable instructions.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the image processing method of any one of claims 1 to 7.
CN202010956505.0A (priority date 2020-09-11, filing date 2020-09-11): Image processing method, image processing device, electronic equipment and storage medium. Status: Pending. Publication: CN112070875A (en).

Publications (1)

Publication number CN112070875A, publication date 2020-12-11 (status: pending)

Family ID: 73696614

Country status: CN (1), CN112070875A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102402794A (en) * 2004-11-29 2012-04-04 Arm Norway AS Processing of computer graphics
CN105023233A (en) * 2014-04-16 2015-11-04 Arm Ltd Graphics processing systems
CN107680042A (en) * 2017-09-27 2018-02-09 杭州群核信息技术有限公司 Rendering method, device, engine and storage medium
CN110738626A (en) * 2019-10-24 2020-01-31 广东三维家信息科技有限公司 Rendering graph optimization method and device and electronic equipment
CN110969685A (en) * 2018-09-28 2020-04-07 Apple Inc. Customizable rendering pipeline using rendering maps

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
向魏, 黄磊 (Xiang Wei, Huang Lei): 《光影材质艺术》 [The Art of Light, Shadow and Materials], 31 August 2020, 重庆大学电子音像出版社 (Chongqing University Electronic Audio-Visual Press), pages 121-122 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112604293A (en) * 2020-12-28 2021-04-06 完美世界(北京)软件科技发展有限公司 Data processing method and device, electronic equipment and readable medium
CN112950738A (en) * 2021-03-30 2021-06-11 杭州群核信息技术有限公司 Rendering engine processing method and device, storage medium and electronic equipment
CN113628313A (en) * 2021-08-23 2021-11-09 广东三维家信息科技有限公司 Decoration effect graph generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination