CN111223169A - Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform - Google Patents


Info

Publication number
CN111223169A
Authority
CN
China
Prior art keywords
file
template file
template
layered
rendering
Prior art date
Legal status
Granted
Application number
CN202010067233.9A
Other languages
Chinese (zh)
Other versions
CN111223169B (en)
Inventor
韩定荣
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202010067233.9A
Publication of CN111223169A
Application granted
Publication of CN111223169B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Abstract

The application is applicable to the technical field of three-dimensional animation, and provides a three-dimensional animation post-production method, a device, terminal equipment and a cloud rendering platform. The method includes the following steps: when an integrated file is received, acquiring a shot file to be produced, a lighting layered rendering template file and a composition template file from the integrated file; processing the shot file to be produced by using the lighting layered rendering template file and the composition template file, to obtain a composite sequence image corresponding to the shot file to be produced; and converting the composite sequence image corresponding to the shot file to be produced into a video format to generate the three-dimensional animation. With the method and the device, manual operation can be reduced in the post-production process of three-dimensional animation, and production efficiency and accuracy are improved.

Description

Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform
Technical Field
The application belongs to the technical field of three-dimensional animation, and particularly relates to a three-dimensional animation post-production method and device, terminal equipment and a cloud rendering platform.
Background
Besides preliminary design and production, post-production is indispensable in the production of a three-dimensional animation. After a three-dimensional animation work is initially produced, it exists as separate, independent segments; each segment becomes part of a complete work only after lighting rendering, composition and special-effects processing. With the rapid development of public cloud platforms, it is increasingly common for users to render images on a cloud rendering platform. In the prior art, however, the task of producing a shot before it is submitted to the cloud rendering platform must be carried out on the user terminal according to a set flow of rules. When the number of shots to be produced is large, most of the time is spent on fixed, repetitive and tedious operations; the user must set many parameters, so manual operation is inefficient, production takes a long time, and errors occur easily.
Disclosure of Invention
The application provides a three-dimensional animation post-production method and device, terminal equipment and a cloud rendering platform, so as to reduce manual operation and improve production efficiency and accuracy in the three-dimensional animation post-production process.
In a first aspect, an embodiment of the present application provides a three-dimensional animation post-production method, where the three-dimensional animation post-production method includes:
obtaining a lighting layered rendering model file and a composition model file;
generating a lighting layered rendering template file according to the lighting layered rendering model file;
generating a composition template file according to the composition model file;
when a shot file to be produced is obtained, integrating the shot file to be produced, the lighting layered rendering template file and the composition template file to obtain an integrated file;
and sending the integrated file to a cloud rendering platform, wherein the integrated file is used for instructing the cloud rendering platform to process the shot file to be produced according to the lighting layered rendering template file and the composition template file in the integrated file, so as to generate the three-dimensional animation.
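As a rough sketch of this terminal-side flow (the patent specifies no implementation; the JSON template format, the function names, and the archive layout below are all assumptions for illustration), template generation and file integration might look like:

```python
import json
import zipfile
from pathlib import Path

def generate_template(model_params: dict) -> dict:
    # Derive a reusable template from the parameters extracted from a
    # model (sample) file; here the derivation simply keeps the
    # parameter set so that later shots can reuse it unchanged.
    return dict(model_params)

def build_integrated_file(shot_paths, lighting_template, composition_template, out_path):
    # Bundle the shot files to be produced together with both template
    # files into a single archive -- the "integrated file" that is
    # sent to the cloud rendering platform.
    with zipfile.ZipFile(out_path, "w") as zf:
        zf.writestr("lighting_layered_template.json", json.dumps(lighting_template))
        zf.writestr("composition_template.json", json.dumps(composition_template))
        for shot in shot_paths:
            zf.write(shot, arcname=f"shots/{Path(shot).name}")
    return out_path
```

Uploading the resulting archive to the platform would be a separate transport step (HTTP upload, object storage, etc.), which the patent leaves open.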
In a second aspect, an embodiment of the present application provides a three-dimensional animation post-production method, where the three-dimensional animation post-production method includes:
when an integrated file is received, acquiring a shot file to be produced, a lighting layered rendering template file and a composition template file from the integrated file;
processing the shot file to be produced by using the lighting layered rendering template file and the composition template file, to obtain a composite sequence image corresponding to the shot file to be produced;
and converting the composite sequence image corresponding to the shot file to be produced into a video format to generate the three-dimensional animation.
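The final conversion of the composite sequence images into a video could, for example, be delegated to an external encoder; the sketch below assumes ffmpeg is installed and that frames follow a printf-style naming pattern, neither of which the patent specifies:

```python
import subprocess

def sequence_to_video(frame_pattern: str, fps: int, out_path: str, run=subprocess.run):
    # Assemble an ffmpeg command that encodes a numbered image
    # sequence (e.g. "shot_010.%04d.png") into an H.264 video file;
    # `run` is injectable so the command can be inspected in tests.
    cmd = [
        "ffmpeg", "-y",
        "-framerate", str(fps),
        "-i", frame_pattern,
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",
        out_path,
    ]
    run(cmd, check=True)
    return cmd
```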
In a third aspect, an embodiment of the present application provides a three-dimensional animation post-production device, including:
the model file acquisition module is used for acquiring a lighting layered rendering model file and a composition model file;
the first template generation module is used for generating a lighting layered rendering template file according to the lighting layered rendering model file;
the second template generation module is used for generating a composition template file according to the composition model file;
the file integration module is used for integrating the shot file to be produced, the lighting layered rendering template file and the composition template file to obtain an integrated file when the shot file to be produced is obtained;
and the file sending module is used for sending the integrated file to a cloud rendering platform, wherein the integrated file is used for instructing the cloud rendering platform to process the shot file to be produced according to the lighting layered rendering template file and the composition template file in the integrated file, so as to generate the three-dimensional animation.
In a fourth aspect, an embodiment of the present application provides a three-dimensional animation post-production apparatus, including:
the template file acquisition module is used for acquiring a shot file to be produced, a lighting layered rendering template file and a composition template file from the integrated file when the integrated file is received;
the file processing module is used for processing the shot file to be produced by using the lighting layered rendering template file and the composition template file to obtain a composite sequence image corresponding to the shot file to be produced;
and the animation generation module is used for converting the composite sequence image corresponding to the shot file to be produced into a video format to generate the three-dimensional animation.
In a fifth aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the three-dimensional animation post-production method according to the first aspect.
In a sixth aspect, an embodiment of the present application provides a cloud rendering platform, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the three-dimensional animation post-production method according to the second aspect.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and when being executed by a processor, the computer program implements the steps of the three-dimensional animation post-production method according to the first aspect; or the computer program when executed by a processor performs the steps of the three-dimensional animation post-production method as described in the second aspect above.
In an eighth aspect, embodiments of the present application provide a computer program product, which, when running on a terminal device, causes the terminal device to execute the steps of the three-dimensional animation post-production method according to the first aspect; or causing the cloud rendering platform to perform the steps of the three-dimensional animation post-production method according to the second aspect when the computer program product is run on the cloud rendering platform.
In summary, by generating the lighting layered rendering template file and the composition template file on the terminal device, the present application enables the cloud rendering platform, when performing post-production on shot files to be produced, to process them in automated batches by calling the lighting layered rendering template file and the composition template file. Batch automated production of shots is thus achieved without the user repeatedly setting parameters, which reduces manual operation, saves production time, and improves production efficiency and accuracy.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a diagram of a network architecture of a three-dimensional animation post-production system according to an embodiment of the present application;
FIG. 2 is a schematic flow chart illustrating an implementation of a post-production method of a three-dimensional animation according to a second embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating an implementation of a three-dimensional animation post-production method according to a third embodiment of the present application;
FIG. 4a is an exemplary diagram of the processing progress of a shot file to be produced; FIG. 4b is an exemplary diagram of another processing progress and composite picture of a shot file to be produced;
FIG. 5 is a schematic structural diagram of a three-dimensional animation post-production device according to a fourth embodiment of the present application;
FIG. 6 is a schematic structural diagram of a three-dimensional animation post-production device according to a fifth embodiment of the present application;
FIG. 7 is a schematic structural diagram of a terminal device according to a sixth embodiment of the present application;
FIG. 8 is a schematic structural diagram of a cloud rendering platform according to a seventh embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, a network architecture of a three-dimensional animation post-production system according to an embodiment of the present application is provided, and for convenience of description, only a part related to the embodiment of the present application is shown.
As shown in fig. 1, the three-dimensional animation post-production system includes a terminal device 11 and a cloud rendering platform 12. Optionally, the terminal device 11 and the cloud rendering platform 12 may be connected and communicate in a wired or wireless manner. The terminal device 11 may refer to a user terminal, and the cloud rendering platform 12 may refer to a remote server for executing rendering tasks, for example, a cloud computing platform such as Tencent Cloud, Alibaba Cloud, Huawei Cloud, or JD Cloud.
In order to reduce manual operation and improve production efficiency and accuracy in the post-production process of a three-dimensional animation, in this embodiment the terminal device 11 acquires a lighting layered rendering model file and a composition model file, generates a lighting layered rendering template file according to the lighting layered rendering model file, and generates a composition template file according to the composition model file. When a shot file to be produced is acquired, the terminal device 11 integrates the shot file to be produced, the lighting layered rendering template file and the composition template file to obtain an integrated file, and sends the integrated file to the cloud rendering platform 12. When receiving the integrated file, the cloud rendering platform 12 acquires the shot file to be produced, the lighting layered rendering template file and the composition template file from the integrated file, processes the shot file to be produced by using the lighting layered rendering template file and the composition template file to obtain a composite sequence image corresponding to the shot file to be produced, and converts the composite sequence image into a video format to generate the three-dimensional animation.
Optionally, the terminal device 11 obtains lighting parameters and layering parameters according to the lighting layered rendering model file, and generates the lighting layered rendering template file according to the lighting parameters and the layering parameters; it obtains composition parameters according to the composition model file, and generates the composition template file according to the composition parameters.
Optionally, the cloud rendering platform 12 loads the lighting layered rendering template file to generate a renderable file corresponding to the shot file to be produced, performs single-frame rendering on the renderable file, composites the rendered single-frame layered material by using the composition template file, and outputs a single-frame image. When receiving a confirmation instruction for the renderable file based on the single-frame image, it performs sequence rendering on the renderable file and, after completing the rendering of the sequence layered material in the renderable file, outputs the composite sequence image.
Optionally, the lighting layered rendering template file includes lighting layered rendering template files respectively corresponding to N kinds of script parameters, and the composition template file includes composition template files respectively corresponding to L kinds of script parameters, where N is an integer greater than zero and L is an integer greater than zero. The cloud rendering platform 12 obtains a target lighting layered rendering template file from the lighting layered rendering template files corresponding to the N kinds of script parameters, obtains a target composition template file from the composition template files corresponding to the L kinds of script parameters, processes the shot file to be produced by using the target lighting layered rendering template file and the target composition template file, and obtains the composite sequence image corresponding to the shot file to be produced.
Optionally, when N is 1, the cloud rendering platform 12 determines that the lighting layered rendering template file corresponding to the single kind of script parameter is the target lighting layered rendering template file;
when N is greater than 1, the cloud rendering platform 12 acquires the priorities of the lighting layered rendering template files respectively corresponding to the N kinds of script parameters, selects from them the lighting layered rendering template file with the highest priority, and determines it to be the target lighting layered rendering template file;
when L is 1, the cloud rendering platform 12 determines that the composition template file corresponding to the single kind of script parameter is the target composition template file;
when L is greater than 1, the cloud rendering platform 12 acquires the priorities of the composition template files respectively corresponding to the L kinds of script parameters, selects from them the composition template file with the highest priority, and determines it to be the target composition template file.
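The selection rule above (take the only candidate when N or L is 1, otherwise take the highest priority) can be sketched as follows; the dictionary shape and the numeric `priority` field are illustrative assumptions, not part of the patent:

```python
def select_target_template(templates_by_script: dict) -> dict:
    # templates_by_script maps each script parameter to its template,
    # where every template carries a numeric "priority" field
    # (larger value = higher priority).
    if not templates_by_script:
        raise ValueError("no candidate template files")
    if len(templates_by_script) == 1:
        # N == 1 (or L == 1): the single candidate is the target.
        return next(iter(templates_by_script.values()))
    # N > 1 (or L > 1): pick the candidate with the highest priority.
    return max(templates_by_script.values(), key=lambda t: t["priority"])
```

The same helper serves both the lighting layered rendering templates and the composition templates, since the two cases follow the same rule.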
Optionally, in the process of processing the shot file to be produced by using the lighting layered rendering template file and the composition template file, the cloud rendering platform 12 obtains a processing effect picture and/or the processing progress of the shot file to be produced and sends them to the terminal device 11; when receiving the processing effect picture and/or the processing progress of the shot file to be produced sent by the cloud rendering platform 12, the terminal device 11 displays them.
In the embodiment of the application, by generating the lighting layered rendering template file and the composition template file on the terminal device, the cloud rendering platform can conveniently perform batch automated processing on shot files to be produced during post-production by calling the lighting layered rendering template file and the composition template file. Batch automated production of shots is thus achieved without the user repeatedly setting parameters, which reduces manual operation, saves production time, and improves production efficiency and accuracy.
Referring to fig. 2, which is a schematic view of an implementation flow of a three-dimensional animation post-production method provided in the second embodiment of the present application, where the three-dimensional animation post-production method is applied to a terminal device, as shown in the figure, the three-dimensional animation post-production method may include the following steps:
step S201, obtaining a lamp light layered rendering model file and a synthesis model file.
In the embodiment of the present application, the post-production process of each shot in a three-dimensional animation generally includes lighting processing (i.e., light placement or lighting configuration), layered rendering, image composition, and the like. The purpose of lighting processing is to simulate as closely as possible natural and artificial light types, including but not limited to target spotlights, target directional lights, free spotlights, free directional lights, floodlights, skylights, etc. Rendering refers to drawing a picture or an animation according to the settings of the scene, lighting, materials, and so on. Layering can be done in two ways. One is layering by object type: for example, if a shot contains a character, a prop and a background, it can be divided into a character layer, a prop layer and a background layer according to the shot content. The other is layering by visual properties such as color and shadow: for example, according to the physical visual properties of an object, it can be divided into a base color layer, a highlight layer, a shadow layer, a cast-shadow layer, a reflection layer, a refraction layer and an emission layer.
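As a concrete illustration of the two layering modes described above, the layers could be enumerated like this (the names are illustrative only; actual layer names depend on the production software):

```python
from enum import Enum

class ObjectLayer(Enum):
    # Layering by object type, following the shot content.
    CHARACTER = "character"
    PROP = "prop"
    BACKGROUND = "background"

class RenderPass(Enum):
    # Layering by the physical visual properties of the object.
    BASE_COLOR = "base_color"
    HIGHLIGHT = "highlight"
    SHADOW = "shadow"
    CAST_SHADOW = "cast_shadow"   # the "projection" layer
    REFLECTION = "reflection"
    REFRACTION = "refraction"
    EMISSION = "emission"         # the light-emitting layer
```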
The lighting layered rendering model file may be a sample that a shot file to be produced can refer to for lighting processing and layered rendering. The composition model file may be a sample that a shot file to be produced can refer to when the layered materials are composited. For example, for 100 shot files of the same scene, lighting processing and layered rendering may be performed on one of them according to lighting parameters and layering parameters that meet the user's requirements; the file obtained after that shot file undergoes lighting processing and layered rendering is the lighting layered rendering model file. Its layered materials are then composited according to composition parameters that meet the user's requirements, and the file obtained is the composition model file. The remaining 99 shot files can then be produced automatically by calling the lighting layered rendering template file and the composition template file generated from these model files. The lighting parameters may refer to the parameters used in lighting processing (for example, the light type), and indicate how lighting processing is to be performed; the layering parameters may refer to the parameters used when an image is layered (for example, the number of layers), and indicate how layering is to be performed; and the composition parameters may refer to the parameters used in layer composition, and indicate how the rendered layered materials are to be combined.
Step S202: generating a lighting layered rendering template file according to the lighting layered rendering model file.
In the embodiment of the application, according to the lighting layered rendering model file that meets the user's requirements, a template file used to instruct how the shot file to be produced undergoes lighting processing and layered rendering, namely the lighting layered rendering template file, can be generated.
Optionally, generating the lighting layered rendering template file according to the lighting layered rendering model file includes:
obtaining lighting parameters and layering parameters according to the lighting layered rendering model file;
and generating the lighting layered rendering template file according to the lighting parameters and the layering parameters.
In the embodiment of the application, a template parameter derivation tool may be installed on the terminal device. Lighting parameters and layering parameters can be extracted from the lighting layered rendering model file through the template parameter derivation tool, and the lighting layered rendering template file can be generated according to the extracted lighting parameters and layering parameters.
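A minimal stand-in for such a parameter derivation tool might look like the following, under the assumption that the model file is a JSON scene description whose lighting and layering keys are known in advance (the patent does not specify the file format or the tool's interface):

```python
import json
from pathlib import Path

# Hypothetical key sets; a real scene file stores many more attributes.
LIGHTING_KEYS = {"lights", "light_linking"}
LAYERING_KEYS = {"render_layers", "layer_overrides"}

def derive_lighting_layered_template(model_file: Path) -> dict:
    # Read the model (sample) file and keep only the lighting and
    # layering parameters; together they form the template that is
    # applied to the remaining shots.
    scene = json.loads(model_file.read_text())
    lighting = {k: v for k, v in scene.items() if k in LIGHTING_KEYS}
    layering = {k: v for k, v in scene.items() if k in LAYERING_KEYS}
    return {"lighting": lighting, "layering": layering}
```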
Step S203: generating a composition template file according to the composition model file.
In the embodiment of the present application, according to the composition model file meeting the user's requirements, a template file for instructing the composition of the layered materials, namely the composition template file, may be generated.
Optionally, generating the composition template file according to the composition model file includes:
acquiring composition parameters according to the composition model file;
and generating the composition template file according to the composition parameters.
In the embodiment of the present application, composition parameters may be extracted from the composition model file by the template parameter derivation tool, and the composition template file may be generated according to the extracted composition parameters.
Step S204: when the shot file to be produced is obtained, integrating the shot file to be produced, the lighting layered rendering template file and the composition template file to obtain an integrated file.
In the embodiment of the application, the shot file to be produced may be material that has not yet undergone lighting processing and layered rendering, and may also be called an upstream-stage file (i.e., a file produced by the previous stage). For example, the previous stage may be the animation stage, in which the scene is drawn and the characters in the scene are made to move; lighting processing then illuminates the whole scene, and layered rendering makes the picture more realistic and establishes its overall atmosphere. There is at least one shot file to be produced.
The integrated file includes, but is not limited to, the shot file to be produced, the lighting layered rendering template file, and the composition template file.
Step S205: sending the integrated file to a cloud rendering platform, wherein the integrated file is used for instructing the cloud rendering platform to process the shot file to be produced according to the lighting layered rendering template file and the composition template file in the integrated file, so as to generate the three-dimensional animation.
In the embodiment of the application, after the terminal device sends the integrated file to the cloud rendering platform, the cloud rendering platform can call the lighting layered rendering template file and the composition template file in the integrated file; by applying these template files when processing the shot file to be produced, it can complete the shot production and obtain a 3D scene picture that meets the requirements.
In the embodiment of the application, by generating the lighting layered rendering template file and the composition template file on the terminal device, the cloud rendering platform can conveniently perform batch automated processing on shot files to be produced during post-production by calling the lighting layered rendering template file and the composition template file. Batch automated production of shots is thus achieved without the user repeatedly setting parameters, which reduces manual operation, saves production time, and improves production efficiency and accuracy.
Referring to fig. 3, which is a schematic flow chart illustrating an implementation process of a three-dimensional animation post-production method provided in the third embodiment of the present application, where the three-dimensional animation post-production method is applied to a cloud rendering platform, as shown in the figure, the three-dimensional animation post-production method may include the following steps:
step S301, when an integrated file is received, acquiring a lens file to be manufactured, a light layered rendering template file and a synthesis template file from the integrated file.
Step S302, the lamplight layered rendering template file and the synthesis template file are used for processing the shot file to be manufactured, and a synthesis sequence image corresponding to the shot file to be manufactured is obtained.
In the embodiment of the application, when receiving the integrated file sent by the terminal device, the cloud rendering platform can call the lighting layered rendering template file and the synthesis template file in the integrated file, and apply the lighting layered rendering template file and the synthesis template file to process the lens file to be manufactured in the integrated file, so that a synthesis sequence image corresponding to the lens file to be manufactured can be obtained. The composite sequence image corresponding to the lens file to be manufactured may refer to an image obtained after the lens file to be manufactured is subjected to lighting processing, layered rendering and layered material composition, and is usually a 3D scene picture meeting requirements.
Optionally, processing the shot file to be produced by using the lighting layered rendering template file and the composition template file, and acquiring the composite sequence image corresponding to the shot file to be produced, includes:
loading the lighting layered rendering template file to generate a renderable file corresponding to the shot file to be produced;
performing single-frame layered rendering on the renderable file, compositing the rendered single-frame layered material by using the composition template file, and outputting a single-frame image;
performing sequence rendering on the renderable file when a confirmation instruction for the renderable file, given on the basis of the single-frame image, is received;
and outputting the composite sequence image after finishing rendering the sequence layered material in the renderable file.
In the embodiment of the present application, one lens file to be manufactured generally corresponds to one lens to be manufactured, and one lens to be manufactured generally corresponds to more than one frame of image; that is, one lens file to be manufactured generally corresponds to at least two frames of images. When the lens file to be manufactured is processed by using the light layered rendering template file and the synthesis template file, the light layered rendering template file may first be loaded to generate a renderable file corresponding to the lens file to be manufactured. The renderable file is first rendered as a single frame, the rendered single-frame layered material is synthesized, and a single-frame image is output; after the renderable file is confirmed to be correct according to the single-frame image, sequence rendering of the renderable file is started. Specifically, after the single-frame rendering and synthesis of the renderable file, the obtained single-frame image is transmitted back to the terminal device, and the terminal device displays it so that the user can conveniently check whether the produced single-frame image contains errors. If it does not, the renderable file is determined to be correct, a confirmation instruction for the renderable file is generated and sent to the cloud rendering platform; after receiving the confirmation instruction, the cloud rendering platform starts sequence rendering of the renderable file and outputs the composite sequence image.
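For illustration only (the patent does not prescribe an implementation), the preview-then-confirm flow described above can be sketched as follows; `render_layers`, `composite`, and `confirm` are hypothetical callables standing in for the platform's actual rendering, compositing, and terminal-side confirmation steps:

```python
def produce_composite_sequence(render_layers, composite, confirm, comp_template, frames):
    """Render one preview frame, wait for user confirmation, then render all frames.

    render_layers(frame)            -> layered material for one frame
    composite(layers, comp_template) -> one composite image
    confirm(image)                  -> True once the terminal-side user approves
    """
    # Single-frame layered rendering and synthesis for the preview image.
    preview = composite(render_layers(frames[0]), comp_template)
    if not confirm(preview):
        raise RuntimeError("single-frame preview rejected; sequence rendering not started")
    # Confirmation instruction received: render and composite the whole sequence.
    return [composite(render_layers(f), comp_template) for f in frames]
```

The dependency-injected callables make the control flow testable without an actual renderer.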
The light layered rendering template file can be divided into a light template file and a layered rendering template file: the light template file is used for instructing the loading of lights into the lens file to be manufactured (i.e., performing lighting processing on it), and the layered rendering template file is used for instructing the creation of layers for the lens file to be manufactured. A renderable file may refer to a file to be rendered for which the lights have been loaded and the layers have been created.
The composite sequence image is at least two frames of images output after a lens file to be manufactured is manufactured, and the frame number range of the composite sequence image can be obtained from a database of a cloud rendering platform.
Optionally, the light layered rendering template file includes light layered rendering template files respectively corresponding to N kinds of script parameters, and the synthesis template file includes synthesis template files respectively corresponding to L kinds of script parameters, N being an integer greater than zero and L being an integer greater than zero; the embodiment of the application further comprises:
acquiring a target light layered rendering template file from the light layered rendering template files respectively corresponding to the N kinds of script parameters;
acquiring a target synthesis template file from the synthesis template files respectively corresponding to the L kinds of script parameters;
the processing the lens file to be manufactured by using the light layered rendering template file and the synthesis template file to acquire the composite sequence image corresponding to the lens file to be manufactured comprises:
processing the lens file to be manufactured by using the target light layered rendering template file and the target synthesis template file to acquire the composite sequence image corresponding to the lens file to be manufactured.
The N kinds of script parameters include, but are not limited to, a project, an episode, a scene, and a lens number. Accordingly, the light layered rendering template files corresponding to the N kinds of script parameters include, but are not limited to, a project light layered rendering template file, an episode light layered rendering template file, a scene light layered rendering template file, and a lens-number light layered rendering template file. The project light layered rendering template file refers to a light layered rendering template file that all lens files to be manufactured belonging to the same project can use; the episode light layered rendering template file refers to one that all lens files to be manufactured belonging to the same episode can use; the scene light layered rendering template file refers to one that all lens files to be manufactured belonging to the same scene can use; and the lens-number light layered rendering template file refers to one that all lens files to be manufactured belonging to the same lens number can use.
The L kinds of script parameters likewise include, but are not limited to, a project, an episode, a scene, and a lens number. The synthesis template files corresponding to the L kinds of script parameters include, but are not limited to, a project synthesis template file, an episode synthesis template file, a scene synthesis template file, and a lens-number synthesis template file, where the project synthesis template file refers to a synthesis template file that all lens files to be manufactured belonging to the same project can use; the episode synthesis template file refers to one that all lens files to be manufactured belonging to the same episode can use; the scene synthesis template file refers to one that all lens files to be manufactured belonging to the same scene can use; and the lens-number synthesis template file refers to one that all lens files to be manufactured belonging to the same lens number can use.
The target lighting layered rendering template file may refer to a lighting layered rendering template file used when a shot file to be manufactured is processed (i.e., post-manufacturing), and the target composition template file may refer to a composition template file used when the shot file to be manufactured is processed.
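As an illustrative sketch only: the script parameters (project, episode, scene, lens number) can be recovered from a shot file's name, assuming a hypothetical naming convention `<project>_ep<episode>_sc<scene>_shot<lens number>.<ext>` — the patent does not specify one:

```python
import re

# Hypothetical naming convention: <project>_ep<episode>_sc<scene>_shot<lens number>.<ext>
SHOT_NAME = re.compile(
    r"(?P<project>[A-Za-z0-9]+)_ep(?P<episode>\d+)_sc(?P<scene>\d+)_shot(?P<lens>\d+)\.\w+$"
)

def parse_script_parameters(filename):
    """Extract the four script parameters used to match template files."""
    m = SHOT_NAME.match(filename)
    if m is None:
        raise ValueError(f"file name does not follow the assumed convention: {filename}")
    return {
        "project": m.group("project"),
        "episode": int(m.group("episode")),
        "scene": int(m.group("scene")),
        "lens_number": int(m.group("lens")),
    }
```

The parsed dictionary can then be used to look up which template files apply to the shot.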
Optionally, the acquiring a target light layered rendering template file from the light layered rendering template files respectively corresponding to the N kinds of script parameters includes:
when N is 1, determining the light layered rendering template file corresponding to the one kind of script parameter as the target light layered rendering template file;
when N is greater than 1, acquiring the priorities of the light layered rendering template files respectively corresponding to the N kinds of script parameters;
and acquiring the light layered rendering template file with the highest priority from the light layered rendering template files respectively corresponding to the N kinds of script parameters, and determining the light layered rendering template file with the highest priority as the target light layered rendering template file.
In the embodiment of the application, the priorities of different light layered rendering template files can be preset. For example, in order from low to high, the priorities may be: project light layered rendering template file, episode light layered rendering template file, scene light layered rendering template file, and lens-number light layered rendering template file. When the number of light layered rendering template files in the integrated file is 1, that file is the target light layered rendering template file; when the number is greater than 1, the light layered rendering template file with the highest priority is taken as the target light layered rendering template file.
Optionally, the acquiring a target synthesis template file from the synthesis template files respectively corresponding to the L kinds of script parameters includes:
when L is 1, determining the synthesis template file corresponding to the one kind of script parameter as the target synthesis template file;
when L is greater than 1, acquiring the priorities of the synthesis template files respectively corresponding to the L kinds of script parameters;
and acquiring the synthesis template file with the highest priority from the synthesis template files respectively corresponding to the L kinds of script parameters, and determining the synthesis template file with the highest priority as the target synthesis template file.
In this embodiment, the priorities of different synthesis template files may also be preset. For example, in order from low to high, the priorities may be: project synthesis template file, episode synthesis template file, scene synthesis template file, and lens-number synthesis template file. When the number of synthesis template files in the integrated file is 1, that file is the target synthesis template file; when the number is greater than 1, the synthesis template file with the highest priority is taken as the target synthesis template file.
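The highest-priority selection described for both template types can be sketched with one helper; the scope names, the low-to-high order (project, episode, scene, lens number), and the dictionary-based interface are illustrative assumptions:

```python
# Priority rises from the project level down to the individual lens number.
PRIORITY = {"project": 0, "episode": 1, "scene": 2, "lens_number": 3}

def select_target_template(templates):
    """Pick the template file whose scope has the highest priority.

    `templates` maps a scope name to a template file path; when the
    integrated file carries a single template, that template wins outright.
    """
    if not templates:
        raise ValueError("no template file present in the integrated file")
    best_scope = max(templates, key=lambda scope: PRIORITY[scope])
    return templates[best_scope]
```

The same helper serves both the light layered rendering templates and the synthesis templates, since the selection rule is identical.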
In the embodiment of the application, in the process of processing the lens file to be manufactured by using the light layered rendering template file and the synthesis template file, the cloud rendering platform can obtain a processing effect graph and/or the processing progress of the lens file to be manufactured at each stage and send them to the terminal device. When the terminal device receives the processing effect graph and/or the processing progress sent by the cloud rendering platform, it displays them, so that the user can check, through the displayed processing effect graph, whether a data error occurs during processing. If a data error occurs, a pause instruction is triggered and sent to the cloud rendering platform, and the cloud rendering platform pauses the corresponding processing when it receives the pause instruction. FIG. 4a is an exemplary diagram showing the processing progress of a lens file to be manufactured; FIG. 4b is an exemplary diagram showing another processing progress of the lens file to be manufactured together with a composite map, the composite map being one image of the composite sequence image.
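A minimal sketch of a render job that reports per-frame progress back to the terminal device and honors a pause instruction; the threading-based design and the `report` callback are assumptions for illustration, not part of the patent:

```python
import threading

class RenderJob:
    """Illustrative pausable render loop that reports per-frame progress.

    `report` stands in for the progress message sent back to the terminal
    device; a pause instruction from the terminal simply clears `running`.
    """

    def __init__(self, report):
        self.report = report
        self.running = threading.Event()
        self.running.set()                 # start in the running state

    def pause(self):                       # called when a pause instruction arrives
        self.running.clear()

    def resume(self):                      # called to continue processing
        self.running.set()

    def process(self, frames, render):
        results = []
        for i, frame in enumerate(frames, start=1):
            self.running.wait()            # blocks here while the job is paused
            results.append(render(frame))
            self.report(i, len(frames))    # progress update to the terminal
        return results
```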
Step S303, converting the composite sequence image corresponding to the lens file to be manufactured into a video format, and generating a three-dimensional animation.
The video format includes, but is not limited to, the MOV format (also known as the QuickTime movie format), an audio and video file format developed by Apple Inc.
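Conversion of the composite sequence images into a MOV video is typically delegated to an external encoder; the sketch below assembles an `ffmpeg` command line (the availability of ffmpeg, the codec, and the frame rate are illustrative assumptions, not values taken from the patent):

```python
import subprocess

def build_mov_command(pattern, output, fps=25):
    """Assemble the ffmpeg invocation for encoding a numbered image sequence as MOV."""
    return [
        "ffmpeg", "-y",
        "-framerate", str(fps),
        "-i", pattern,                     # e.g. renders/shot0150.%04d.png
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        output,                            # e.g. shot0150.mov
    ]

def sequence_to_mov(pattern, output, fps=25):
    """Run the encoder; requires an ffmpeg binary on PATH."""
    subprocess.run(build_mov_command(pattern, output, fps), check=True)
    return output
```

Separating command construction from execution makes the invocation easy to inspect before the (side-effecting) encode is run.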
According to the embodiment of the application, when the lens file to be manufactured is post-produced, the light layered rendering template file and the synthesis template file are called to automatically process lens files to be manufactured in batches. Batch automatic production of lenses can thus be realized without the user repeatedly setting parameters, which reduces manual operation, saves production time, and improves production efficiency and accuracy.
Fig. 5 is a schematic structural diagram of a three-dimensional animation post-production device according to a fourth embodiment of the present application, and for convenience of description, only the parts related to the embodiment of the present application are shown.
The three-dimensional animation post-production device comprises:
a script file obtaining module 51, configured to obtain a light layered rendering script file and a synthesis script file;
a first template generating module 52, configured to generate a light layered rendering template file according to the light layered rendering script file;
a second template generating module 53, configured to generate a synthesis template file according to the synthesis script file;
the file integration module 54 is configured to, when a lens file to be manufactured is obtained, integrate the lens file to be manufactured, the lighting layered rendering template file, and the synthesis template file to obtain an integrated file;
and the file sending module 55 is configured to send the integrated file to a cloud rendering platform, where the integrated file is used to instruct the cloud rendering platform to process the lens file to be manufactured according to the lighting layered rendering template file and the synthesis template file in the integrated file, so as to generate a three-dimensional animation.
Optionally, the first template generating module 52 is specifically configured to:
acquire light parameters and layering parameters according to the light layered rendering script file;
and generate the light layered rendering template file according to the light parameters and the layering parameters.
The second template generating module 53 is specifically configured to:
acquire synthesis parameters according to the synthesis script file;
and generate the synthesis template file according to the synthesis parameters.
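A minimal sketch of the shared pattern of the two template generating modules: read parameters from a source file and write only the relevant ones into a template. JSON is an illustrative assumption for both file formats; the patent does not specify one:

```python
import json
from pathlib import Path

def extract_light_layered_parameters(source):
    """Keep only the light and layering parameters from a parsed source file."""
    return {"lights": source["lights"], "layers": source["layers"]}

def generate_light_layered_template(source_path, template_path):
    """Read the source file and write the light layered rendering template derived from it."""
    source = json.loads(Path(source_path).read_text())
    template = extract_light_layered_parameters(source)
    Path(template_path).write_text(json.dumps(template, indent=2))
    return template
```

A synthesis-template generator would follow the same shape, extracting synthesis parameters instead.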
The three-dimensional animation post-production device provided by the embodiment of the application can be applied to the second embodiment, and for details, reference is made to the description of the second embodiment, and details are not repeated here.
Fig. 6 is a schematic structural diagram of a three-dimensional animation post-production device provided in the fifth embodiment of the present application, and for convenience of description, only the parts related to the embodiment of the present application are shown.
The three-dimensional animation post-production device comprises:
the template file acquisition module 61 is used for acquiring a lens file to be manufactured, a light layered rendering template file and a synthesis template file from the integrated file when the integrated file is received;
the file processing module 62 is configured to process the lens file to be manufactured by using the light layered rendering template file and the synthesis template file, and acquire a composite sequence image corresponding to the lens file to be manufactured;
and the animation generation module 63 is configured to convert the composite sequence image corresponding to the lens file to be manufactured into a video format and generate a three-dimensional animation.
Optionally, the file processing module 62 is specifically configured to:
generate a renderable file corresponding to the lens file to be manufactured by using the light layered rendering template file;
perform single-frame rendering on the renderable file, synthesize the rendered single-frame layered material by using the synthesis template file, and output a single-frame image;
perform sequence rendering on the renderable file when a confirmation instruction of the renderable file, given based on the single-frame image, is received;
and output the composite sequence image after the rendering of the sequence layered material in the renderable file is finished.
Optionally, the light layered rendering template file includes light layered rendering template files respectively corresponding to N kinds of script parameters, and the synthesis template file includes synthesis template files respectively corresponding to L kinds of script parameters, N being an integer greater than zero and L being an integer greater than zero;
the three-dimensional animation post-production device further comprises:
a first target acquisition module, used for acquiring a target light layered rendering template file from the light layered rendering template files respectively corresponding to the N kinds of script parameters;
a second target acquisition module, used for acquiring a target synthesis template file from the synthesis template files respectively corresponding to the L kinds of script parameters;
the file processing module 62 is specifically configured to:
process the lens file to be manufactured by using the target light layered rendering template file and the target synthesis template file to acquire the composite sequence image corresponding to the lens file to be manufactured.
Optionally, the first target obtaining module is specifically configured to:
when N is 1, determine the light layered rendering template file corresponding to the one kind of script parameter as the target light layered rendering template file;
when N is greater than 1, acquire the priorities of the light layered rendering template files respectively corresponding to the N kinds of script parameters;
and acquire the light layered rendering template file with the highest priority from the light layered rendering template files respectively corresponding to the N kinds of script parameters, and determine the light layered rendering template file with the highest priority as the target light layered rendering template file.
The second target acquisition module is specifically configured to:
when L is 1, determine the synthesis template file corresponding to the one kind of script parameter as the target synthesis template file;
when L is greater than 1, acquire the priorities of the synthesis template files respectively corresponding to the L kinds of script parameters;
and acquire the synthesis template file with the highest priority from the synthesis template files respectively corresponding to the L kinds of script parameters, and determine the synthesis template file with the highest priority as the target synthesis template file.
The three-dimensional animation post-production device provided by the embodiment of the application can be applied to the third embodiment, and for details, reference is made to the description of the third embodiment, and details are not repeated here.
Fig. 7 is a schematic structural diagram of a terminal device according to a sixth embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: one or more processors 70 (only one of which is shown), a memory 71 and a computer program 72 stored in said memory 71 and executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps of the second embodiment of the three-dimensional animation post-production method.
The terminal device 7 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing device. The terminal device may include, but is not limited to, a processor 70 and a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation on it; the terminal device 7 may comprise more or fewer components than shown, combine certain components, or have different components; for example, it may further comprise input and output devices, network access devices, buses, etc.
The processor 70 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 71 may be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing the computer program and other programs and data required by the terminal device. The memory 71 may also be used to temporarily store data that has been output or is to be output.
Fig. 8 is a schematic structural diagram of a cloud rendering platform provided in the seventh embodiment of the present application. As shown in fig. 8, the cloud rendering platform 8 of this embodiment includes: one or more processors 80 (only one of which is shown), a memory 81, and a computer program 82 stored in the memory 81 and executable on the processors 80. The processor 80, when executing the computer program 82, implements the steps of the third embodiment of the three-dimensional animation post-production method described above.
The cloud rendering platform 8 may be a computing device such as a cloud server. The cloud rendering platform may include, but is not limited to, a processor 80, a memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of a cloud rendering platform 8, and does not constitute a limitation of the cloud rendering platform 8, and may include more or fewer components than illustrated, or combine certain components, or different components, e.g., the cloud rendering platform may also include input-output devices, network access devices, buses, etc.
The processor 80 may be a CPU, but may also be other general purpose processors, digital signal processors DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 81 may be an internal storage unit of the cloud rendering platform 8, such as a hard disk or a memory of the cloud rendering platform 8. The memory 81 may also be an external storage device of the cloud rendering platform 8, such as a plug-in hard disk, SMC, SD card, flash memory card, etc. equipped on the cloud rendering platform 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the cloud rendering platform 8. The memory 81 is used for storing the computer programs and other programs and data required by the cloud rendering platform. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device/cloud rendering platform and method may be implemented in other ways. For example, the above-described apparatus/terminal device/cloud rendering platform embodiments are merely illustrative, and for example, the division of the modules or units is only one logical function division, and there may be another division manner in actual implementation, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, can realize the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
An embodiment of the present application further provides a computer program product. When the computer program product runs on a terminal device, the terminal device can implement the steps in the second method embodiment when executing it; when the computer program product runs on a cloud rendering platform, the cloud rendering platform can implement the steps in the third method embodiment when executing it.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A three-dimensional animation post-production method is characterized by comprising the following steps:
obtaining a light layered rendering script file and a synthesis script file;
generating a light layered rendering template file according to the light layered rendering script file;
generating a synthesis template file according to the synthesis script file;
when a lens file to be manufactured is obtained, integrating the lens file to be manufactured, the light layered rendering template file and the synthesis template file to obtain an integrated file;
and sending the integrated file to a cloud rendering platform, wherein the integrated file is used for instructing the cloud rendering platform to process the lens file to be manufactured according to the light layered rendering template file and the synthesis template file in the integrated file, so as to generate a three-dimensional animation.
2. The method of claim 1, wherein the generating a light layered rendering template file according to the light layered rendering script file comprises:
acquiring light parameters and layering parameters according to the light layered rendering script file;
generating the light layered rendering template file according to the light parameters and the layering parameters;
and the generating a synthesis template file according to the synthesis script file comprises:
acquiring synthesis parameters according to the synthesis script file;
and generating the synthesis template file according to the synthesis parameters.
3. A three-dimensional animation post-production method, characterized by comprising the following steps:
when an integrated file is received, obtaining a shot file to be produced, a light layered rendering template file, and a composition template file from the integrated file;
processing the shot file to be produced using the light layered rendering template file and the composition template file, to obtain a composite image sequence corresponding to the shot file to be produced;
and converting the composite image sequence corresponding to the shot file to be produced into a video format to generate the three-dimensional animation.
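The cloud-side flow of claim 3 can be sketched as a small pipeline: extract the three files from the integrated file, render and composite each frame, then encode the sequence. The dict keys and the three injected callables below are stand-ins; the patent prescribes no file format, renderer, or codec.

```python
def post_produce(integrated, render_layers, composite, encode):
    """Claim 3 on the cloud rendering platform. `integrated` is a plain
    dict here (a real platform would receive an archive or upload); the
    three callables are illustrative stand-ins for the renderer, the
    compositor, and the video encoder."""
    shot = integrated["shot"]
    render_tpl = integrated["render_template"]
    comp_tpl = integrated["comp_template"]
    layered = render_layers(shot, render_tpl)          # per-frame layer stacks
    frames = [composite(stack, comp_tpl) for stack in layered]
    return encode(frames)                              # image sequence -> video
```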
4. The method of claim 3, wherein processing the shot file to be produced using the light layered rendering template file and the composition template file to obtain the composite image sequence corresponding to the shot file to be produced comprises:
generating, using the light layered rendering template file, a renderable file corresponding to the shot file to be produced;
rendering a single frame of the renderable file, compositing the rendered single-frame layered material using the composition template file, and outputting a single-frame image;
rendering the renderable file as a full sequence when a confirmation instruction for the renderable file, issued on the basis of the single-frame image, is received;
and outputting the composite image sequence after rendering of the sequence layered material in the renderable file is complete.
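Claim 4 gates the expensive sequence render behind a single-frame check: render one frame, composite it, and only proceed to the full sequence once the result is confirmed. A schematic sketch with injected stand-in callables (nothing below comes from the patent beyond this control flow):

```python
def render_with_check(renderable, render_one, composite, confirm):
    """Claim 4 workflow: single-frame test render first, then the full
    sequence only on confirmation. All callables are illustrative."""
    preview = composite(render_one(renderable[0]))
    if not confirm(preview):
        return None  # single-frame image rejected; no sequence render
    return [composite(render_one(frame)) for frame in renderable]
```

The single-frame gate is the economic point: a bad template is caught after one frame of render time instead of after the whole sequence.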
5. The three-dimensional animation post-production method according to claim 3 or 4, wherein the light layered rendering template file comprises light layered rendering template files corresponding respectively to N types of script parameters, and the composition template file comprises composition template files corresponding respectively to L types of script parameters, N being an integer greater than zero and L being an integer greater than zero; the method further comprising:
obtaining a target light layered rendering template file from the light layered rendering template files corresponding respectively to the N types of script parameters;
and obtaining a target composition template file from the composition template files corresponding respectively to the L types of script parameters;
wherein processing the shot file to be produced using the light layered rendering template file and the composition template file to obtain the composite image sequence corresponding to the shot file to be produced comprises:
processing the shot file to be produced using the target light layered rendering template file and the target composition template file, to obtain the composite image sequence corresponding to the shot file to be produced.
6. The method of claim 5, wherein obtaining the target light layered rendering template file from the light layered rendering template files corresponding respectively to the N types of script parameters comprises:
when N is equal to 1, determining the light layered rendering template file corresponding to the single type of script parameter as the target light layered rendering template file;
when N is greater than 1, obtaining the priority of the light layered rendering template file corresponding to each of the N types of script parameters;
and obtaining, from the light layered rendering template files corresponding respectively to the N types of script parameters, the file with the highest priority, and determining it as the target light layered rendering template file;
and wherein obtaining the target composition template file from the composition template files corresponding respectively to the L types of script parameters comprises:
when L is equal to 1, determining the composition template file corresponding to the single type of script parameter as the target composition template file;
when L is greater than 1, obtaining the priority of the composition template file corresponding to each of the L types of script parameters;
and obtaining, from the composition template files corresponding respectively to the L types of script parameters, the file with the highest priority, and determining it as the target composition template file.
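The selection rule of claim 6 reduces to: with one candidate, take it; with several, take the one with the highest priority. A sketch in which `candidates` maps a script-parameter type to a hypothetical `(priority, template)` pair (the mapping shape is an assumption; the patent only specifies the rule):

```python
def pick_target_template(candidates):
    """Claim 6 selection: single candidate -> use it directly;
    otherwise choose the candidate with the highest priority.
    `candidates` maps script-parameter type -> (priority, template)."""
    if len(candidates) == 1:
        return next(iter(candidates.values()))[1]
    return max(candidates.values(), key=lambda pair: pair[0])[1]
```

The same function serves both halves of the claim, since the rendering-template and composition-template selections are structurally identical.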
7. A three-dimensional animation post-production apparatus, characterized by comprising:
a model file obtaining module, configured to obtain a light layered rendering model file and a composition model file;
a first template generation module, configured to generate a light layered rendering template file from the light layered rendering model file;
a second template generation module, configured to generate a composition template file from the composition model file;
a file integration module, configured to, when a shot file to be produced is obtained, integrate the shot file to be produced, the light layered rendering template file, and the composition template file into an integrated file;
and a file sending module, configured to send the integrated file to a cloud rendering platform, wherein the integrated file instructs the cloud rendering platform to process the shot file to be produced according to the light layered rendering template file and the composition template file in the integrated file, so as to generate the three-dimensional animation.
8. A three-dimensional animation post-production apparatus, characterized by comprising:
a template file obtaining module, configured to, when an integrated file is received, obtain a shot file to be produced, a light layered rendering template file, and a composition template file from the integrated file;
a file processing module, configured to process the shot file to be produced using the light layered rendering template file and the composition template file, to obtain a composite image sequence corresponding to the shot file to be produced;
and an animation generation module, configured to convert the composite image sequence corresponding to the shot file to be produced into a video format to generate the three-dimensional animation.
9. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the three-dimensional animation post-production method according to claim 1 or 2.
10. A cloud rendering platform comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the three-dimensional animation post-production method according to any one of claims 3 to 6.
CN202010067233.9A 2020-01-20 2020-01-20 Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform Active CN111223169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010067233.9A CN111223169B (en) 2020-01-20 2020-01-20 Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010067233.9A CN111223169B (en) 2020-01-20 2020-01-20 Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform

Publications (2)

Publication Number Publication Date
CN111223169A true CN111223169A (en) 2020-06-02
CN111223169B CN111223169B (en) 2023-02-28

Family

ID=70831291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010067233.9A Active CN111223169B (en) 2020-01-20 2020-01-20 Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform

Country Status (1)

Country Link
CN (1) CN111223169B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103208132A (en) * 2012-08-10 2013-07-17 天津十彩动画科技有限公司 Cluster animation rendering numerical control system
US20130336640A1 (en) * 2012-06-15 2013-12-19 Efexio, Inc. System and method for distributing computer generated 3d visual effects over a communications network
CN106780678A (en) * 2017-02-03 2017-05-31 北京华严世界影业有限公司 A kind of simulation animation film making method and system complete in real time
CN107526623A (en) * 2016-06-22 2017-12-29 腾讯科技(深圳)有限公司 A kind of data processing method and device
CN108876887A (en) * 2017-05-16 2018-11-23 北京京东尚科信息技术有限公司 rendering method and device
CN109448089A (en) * 2018-10-22 2019-03-08 美宅科技(北京)有限公司 A kind of rendering method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAN Bin; XU Zhiqiang; WANG Xuemei: "Research on the Application of Cloud Computing Technology in the Animation Rendering Industry" *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113572917A (en) * 2021-09-27 2021-10-29 北京天图万境科技有限公司 Electronic clapper board and clapper control system
CN115883818A (en) * 2022-11-29 2023-03-31 北京优酷科技有限公司 Automatic statistical method and device for video frame number, electronic equipment and storage medium
CN115883818B (en) * 2022-11-29 2023-09-19 北京优酷科技有限公司 Video frame number automatic counting method and device, electronic equipment and storage medium
CN116367387A (en) * 2023-06-02 2023-06-30 深圳市千岩科技有限公司 Lamp effect generation method, lamp effect control method, device and medium
CN116367387B (en) * 2023-06-02 2023-08-15 深圳市千岩科技有限公司 Lamp effect generation method, lamp effect control method, device and medium

Also Published As

Publication number Publication date
CN111223169B (en) 2023-02-28

Similar Documents

Publication Publication Date Title
CN111223169B (en) Three-dimensional animation post-production method and device, terminal equipment and cloud rendering platform
CN111475675B (en) Video processing system
CN111932664A (en) Image rendering method and device, electronic equipment and storage medium
CN105827897A (en) Adjustment card manufacturing method, system, color correction matrix debugging method and device
CN108702452A (en) A kind of image capturing method and device
CN111667420B (en) Image processing method and device
CN111612878B (en) Method and device for making static photo into three-dimensional effect video
KR20150129260A (en) Service System and Method for Object Virtual Reality Contents
CN111179402B (en) Rendering method, device and system of target object
CN105488470A (en) Method and apparatus for determining character attribute information
CN114694136A (en) Article display method, device, equipment and medium
CN113421312A (en) Method and device for coloring black and white video, storage medium and terminal
CN111739150B (en) Noble metal three-dimensional model construction method and device
CN112489144A (en) Image processing method, image processing apparatus, terminal device, and storage medium
CN113938750A (en) Video processing method and device, electronic equipment and storage medium
WO2023048983A1 (en) Methods and apparatus to synthesize six degree-of-freedom views from sparse rgb-depth inputs
CN112991497B (en) Method, device, storage medium and terminal for coloring black-and-white cartoon video
CN105824608B (en) Processing, plug-in unit generation method and the device of process object
CN109040612B (en) Image processing method, device and equipment of target object and storage medium
CN115775310A (en) Data processing method and device, electronic equipment and storage medium
CN113034449A (en) Target detection model training method and device and communication equipment
CN113066166A (en) Image processing method and device and electronic equipment
CN112312041A (en) Image correction method and device based on shooting, electronic equipment and storage medium
CN111292245A (en) Image processing method and device
CN113395439A (en) Virtual image distance measuring method, system, device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant