CN111598983A - Animation system, animation method, storage medium, and program product - Google Patents

Animation system, animation method, storage medium, and program product

Info

Publication number: CN111598983A
Application number: CN202010419549.XA
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: animation, virtual, rendering, user, scene
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventors: 周春光, 唐锋, 周健巍, 陈泽鑫
Current assignee: Beijing Le Element Culture Development Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original assignee: Beijing Le Element Culture Development Co., Ltd.
Application filed by Beijing Le Element Culture Development Co., Ltd.
Priority to CN202010419549.XA
Publication of CN111598983A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 13/40: 3D animation of characters, e.g. humans, animals or virtual beings
    • G06T 13/60: 3D animation of natural phenomena, e.g. rain, snow, water or plants
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures

Abstract

The present invention relates to an animation production system, method, storage medium, and program product. The system comprises: an asset acquisition module for acquiring animation assets; and a composition rendering module for processing and rendering the animation assets with a real-time rendering engine to obtain a rendering result containing a virtual character and a virtual scene. The asset acquisition module creates or acquires pre-made model data, captures actors with real-time motion capture to obtain character motion data and transfer it to the virtual character, and photographs real scenes so that virtual scenes can be generated or processed from them. The system further includes an asset management module configured to manage the animation assets and one or more shots in a unified manner. With the animation production system provided by the invention, virtual scenes and characters are shot with traditional filming methods, which lowers the entry barrier, lets a large pool of traditional camera operators work without difficulty, and reduces learning costs.

Description

Animation system, animation method, storage medium, and program product
Technical Field
The present invention relates to the field of animation production technologies, and in particular, to an animation production system, method, storage medium, and program product.
Background
In the traditional animation workflow, an operator sets the position and angle of the camera in a computer space, which demands a high level of professional skill. Moreover, a conventional film or animation production pipeline typically requires a dozen or so software packages to cooperate from start to finish, so compatibility and interoperability problems arise whenever information is passed between programs, and the production cycle is long.
Disclosure of Invention
It is an object of the present invention to provide a novel animation system, method, storage medium and program product.
The purpose of the invention is achieved by the following technical scheme. The animation production system provided by the invention comprises: an asset acquisition module for acquiring animation assets; and a composition rendering module for processing and rendering the animation assets with a real-time rendering engine to obtain a rendering result containing a virtual character and a virtual scene. The asset acquisition module comprises an actor shooting unit, a scene shooting unit, and a model making unit. The model making unit is used to create model data of the virtual character and the virtual scene and/or to acquire model data made in advance. The actor shooting unit is used to capture actors with real-time motion capture to obtain character motion data and transfer it to the virtual character. The scene shooting unit is used to photograph a real scene so that the virtual scene can be generated or processed from it.
The object of the invention can be further achieved by the following technical measures.
In the animation production system, the composition rendering module includes one or more of the following units: a manual processing unit for providing interfaces for manually processing the animation assets and/or controlling the virtual characters and virtual scenes, receiving user instructions, and processing the animation assets and/or controlling the virtual characters and virtual scenes according to those instructions, so that the user can steer and adjust the rendering result; an automatic processing unit for automatically processing some animation assets, or automatically controlling virtual characters and virtual scenes, according to a preset program and/or other animation assets; a preview unit for displaying the effect of the processing in real time while the animation assets are being processed; and a rendering unit for rendering the processed animation assets to obtain the rendering result.
In the animation production system, the composition rendering module includes: a material change recording unit for recording the changes applied to a material on one virtual object, so that processing and rendering use the recorded changes; the original attributes of the material are thus never modified, and its use on other virtual objects is unaffected. The virtual object is a virtual character or a virtual scene.
In the animation production system, the manual processing unit includes one or more of the following sub-units: a weather change subunit for providing an interface for adjusting the weather of the virtual scene in real time, receiving a first user instruction, and adjusting the weather accordingly; a character dynamic adjustment subunit for providing an interface for adjusting the appearance of the virtual character in real time, receiving a second user instruction, and adjusting the appearance accordingly; a frame color correction subunit for providing an interface for color-correcting the whole frame previewed in real time and/or individual objects within it, receiving a third user instruction, and applying the correction in real time; a special effects subunit for providing an interface for adding or adjusting special effects in the virtual scene in real time, receiving a fourth user instruction, and adjusting the effects accordingly; and a lighting subunit for providing an interface for adjusting lighting parameters in the virtual scene in real time, receiving a fifth user instruction, and adjusting the lighting effects accordingly.
In the animation production system, the character dynamic adjustment subunit is specifically configured so that, when a material of the virtual character is adjusted, the original attributes of the material are not changed; instead the changes are recorded, and the virtual character is rendered according to the recorded changes at render time.
In the animation production system, the asset acquisition module includes an audio acquisition unit for acquiring audio material, and the automatic processing unit includes an automatic lighting subunit that automatically generates or adjusts the lighting parameters of a virtual scene according to the audio material, so that the lights in the scene change with the rhythm of the audio.
In the animation production system, the composition rendering module includes a coordinate matching unit configured to match the coordinates of the real scene and the virtual scene according to the model data of the photographed real scene and of the virtual scene.
The animation production system further includes an asset management module, communicatively connected to one or more asset acquisition modules and one or more composition rendering modules, and configured to manage the animation assets and one or more shots in a unified manner. The shots comprise animation assets processed by the composition rendering module, as shots to be rendered, and/or rendering results produced by the composition rendering module, as rendered shots.
In the animation production system, the asset management module includes an asset access unit configured to: provide an interface for reading animation assets and shots; send out the requested animation assets and shots in response to a read request entered by a user or generated automatically by the system; provide an interface for receiving animation assets and shots; and store the received animation assets and shots.
In the animation production system, the asset management module includes a process monitoring unit configured to perform one or more of the following: acquire production process information, including the production progress of each shot and the person responsible for it; display the production process information; provide an interface for setting task lists and priorities; and show the animation assets used by each shot and the position of each shot in the whole film.
In the animation production system, the asset management module includes an offline rendering unit configured to receive an offline rendering request, render the shots to be rendered according to the request, and store the rendering results.
The object of the present invention is also achieved by the following technical means. The animation production system provided by the invention comprises a server device and one or more client devices communicatively connected to it. Each client device is used for processing shots and comprises at least one of the asset acquisition module and the composition rendering module, the asset acquisition module acquiring the shot material to be processed and the composition rendering module processing that material to obtain a shot. The server device comprises the asset management module and manages all shot materials and shots in a unified manner.
The object of the present invention is also achieved by the following technical means. The animation production method provided by the invention comprises: obtaining animation assets, which includes creating model data of virtual characters and virtual scenes and/or acquiring model data made in advance, capturing actors with real-time motion capture to obtain character motion data and transfer it to the virtual characters, and photographing real scenes so that virtual scenes can be generated or processed from them; and processing and rendering the animation assets with a real-time rendering engine to obtain a rendering result containing the virtual characters and virtual scenes.
In the animation production method, processing and rendering the animation assets with the real-time rendering engine includes one or more of the following steps: providing interfaces for manually processing the animation assets and controlling the virtual characters and virtual scenes, receiving user instructions, and processing the assets and/or controlling the characters and scenes according to those instructions so that the user can steer and adjust the rendering result; automatically processing some of the animation assets, or automatically controlling virtual characters and virtual scenes, according to a preset program and/or other animation assets; displaying the effect of the processing in real time while the animation assets are being processed; and rendering the processed animation assets to obtain the rendering result.
In the animation production method, processing and rendering the animation assets with the real-time rendering engine includes: recording the changes applied to a material on one virtual object, and processing and rendering with the recorded changes, so that the original attributes of the material are not modified and its use on other virtual objects is unaffected; the virtual object is a virtual character or a virtual scene.
In the animation production method, providing interfaces for manually processing the animation assets and controlling the virtual characters and virtual scenes, so that the user can steer and adjust the rendering result, includes one or more of the following steps: receiving a first user instruction through a weather adjustment interface and adjusting the weather of the virtual scene in real time accordingly; receiving a second user instruction through a character adjustment interface and adjusting the appearance of the virtual character in real time accordingly; receiving a third user instruction through a frame color correction interface and color-correcting the whole frame previewed in real time and/or individual objects within it accordingly; receiving a fourth user instruction through a special effects interface and adding or adjusting special effects in the virtual scene in real time accordingly; and receiving a fifth user instruction through a lighting parameter interface and adjusting the lighting effects in the virtual scene in real time accordingly.
In the animation production method, adjusting the appearance of the virtual character in real time through the character adjustment interface specifically includes: when a material of the virtual character is adjusted, not changing the original attributes of the material but recording the changes, so that at render time the virtual character is rendered according to the recorded changes.
In the animation production method, obtaining the animation assets further includes obtaining audio material, and the automatic processing of animation assets or control of virtual characters and virtual scenes according to a preset program and/or some of the animation assets includes: automatically generating or adjusting the lighting parameters of the virtual scene according to the audio material, so that the lights in the scene change with the rhythm of the audio.
In the animation production method, processing and rendering the animation assets with the real-time rendering engine includes: matching the coordinates of the real scene and the virtual scene according to the model data of the photographed real scene and of the virtual scene.
The animation production method further includes: managing the animation assets and one or more shots in a unified manner, the shots comprising the processed animation assets, as shots to be rendered, and/or the rendering results, as rendered shots.
In the animation production method, the unified management of the animation assets and the one or more shots includes: providing an interface for reading animation assets and shots; sending out the requested animation assets and shots in response to a read request entered by a user or generated automatically by the system; providing an interface for receiving animation assets and shots; and storing the received animation assets and shots.
In the animation production method, the unified management of the animation assets and the one or more shots includes performing one or more of the following steps: acquiring production process information, including the production progress of each shot and the person responsible for it; displaying the production process information; providing an interface for setting task lists and priorities; and showing the animation assets used by each shot and the position of each shot in the whole film.
In the animation production method, the unified management of the animation assets and the one or more shots includes: receiving an offline rendering request, rendering the shots to be rendered according to the request, and storing the rendering results.
The object of the present invention is also achieved by the following technical means. A computer storage medium according to the present invention contains computer instructions that, when executed on a device, cause the device to perform the animation production method of any of the above aspects.
The object of the present invention is also achieved by the following technical means. A computer program product according to the present invention, when run on a device, causes the device to perform the animation production method of any of the above aspects.
Compared with the prior art, the invention has clear advantages and beneficial effects. Through the above technical scheme, the animation production system, method, storage medium, and program product provided by the invention offer at least the following advantages:
(1) virtual scenes and characters are shot with traditional filming methods, lowering the entry barrier so that a large pool of traditional camera operators can work without difficulty and at low learning cost;
(2) with this system, animation can be produced with only one piece of modeling software, one motion-data capture step, and one rendering engine, which reduces the number of tools, lowers the error rate of information passed between them, integrates most of the production work on a single platform, simplifies the workflow, improves efficiency, and cuts communication costs;
(3) production is fast and flexible: all modules connect seamlessly, every module can be adjusted in real time, the film can be previewed in real time, and the error rate drops;
(4) some animation effects in a virtual scene can be generated automatically and then refined by hand, so production is automated, efficiency improves, and costs fall;
(5) only one material ball needs to be made: the changed attributes are recorded per scene, and at preview or render time the original material ball plus the recorded changes reproduce the effects of many scenes, saving storage; the parameters of a material can be changed for one scene or one character without changing the material itself, so other scenes and characters are unaffected;
(6) the asset management module lets many people work in parallel, keeps the assets intact, displays the whole production process online, avoids duplicated work, supports multi-person and multi-party collaboration, and meets the needs of large multi-person animation pipelines;
(7) with an asset management module containing a rendering engine and an offline rendering unit, a user can hand a prepared shot to the server for rendering, so the rendering does not occupy the staff's time.
The foregoing is only an overview of the technical solutions of the present invention. To make the technical means of the invention clearer and implementable according to this description, and to make the above and other objects, features, and advantages more readily understandable, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
FIG. 1 is a block diagram of the architecture of an animation system according to an embodiment of the invention;
FIG. 2 is a block diagram of a composition rendering module provided by one embodiment of the invention;
FIG. 3 is a schematic diagram of changing the climate of a virtual scene provided by one embodiment of the invention;
FIG. 4 is a diagram illustrating dynamic adjustment of the material of a virtual character according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of color correction of a picture according to an embodiment of the present invention;
FIG. 6 is a block diagram of an animation system according to another embodiment of the present invention;
FIG. 7 is a block diagram of an asset management module provided by one embodiment of the present invention;
FIG. 8 is a block flow diagram of a method of animation according to one embodiment of the invention.
Detailed Description
To further illustrate the technical means and effects of the present invention adopted to achieve the predetermined objects, the following detailed description of the embodiments, structures, features and effects of the animation system, method, storage medium and program product according to the present invention will be made with reference to the accompanying drawings and preferred embodiments.
FIG. 1 is a schematic block diagram of one embodiment of an animation system 100 of the present invention. Referring to FIG. 1, an exemplary animation system 100 of the present invention generally comprises an asset acquisition module 110 and a composition rendering module 120.
The asset acquisition module 110 is used to acquire animation assets. Animation assets are the materials used to produce an animation; they are created around the animation project itself and include, but are not limited to, model data, motion data, video footage, program code, and any other virtual, digitized artifacts.
In one specific example, the asset acquisition module 110 includes a model making unit 111, an actor shooting unit 112, and a scene shooting unit 113. The model making unit 111 is used to create model data and/or obtain model data created in advance. Model data includes, but is not limited to, models of virtual objects such as virtual characters, virtual scenes, and virtual props. Note that a virtual scene generally contains one or more virtual props, and virtual characters may be placed or superimposed in one or more virtual scenes; below, a virtual scene and the virtual props it contains are collectively referred to as the virtual scene. The model making unit 111 typically consists of modeling software. The actor shooting unit 112 captures an actor with real-time motion capture to obtain character motion data and transfers that data to a virtual character, so that the virtual character reproduces the actor's movements. The scene shooting unit 113 photographs a real scene so that it can be used in the animation, for example to generate a virtual scene or to process one. Note that the order in which the various animation assets are acquired is not limited, i.e. the execution order of the units of the asset acquisition module 110 is not fixed; typically, though, the model is created first, then the actors are captured, then the scene is photographed.
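As a rough illustration of the retargeting step (this Python sketch and all names in it, such as JointPose and bone_map, are our own assumptions, not the patent's implementation), per-frame joint rotations captured from the actor can be copied onto the character rig through a bone-name map:

```python
# Hypothetical sketch of mocap retargeting: captured joint rotations are
# mapped onto the virtual character's skeleton frame by frame.
from dataclasses import dataclass

@dataclass
class JointPose:
    rotation: tuple  # quaternion (w, x, y, z)

@dataclass
class SkeletonFrame:
    joints: dict     # joint name -> JointPose

def retarget_frame(captured: SkeletonFrame, bone_map: dict) -> SkeletonFrame:
    """Copy each captured joint's rotation onto the mapped character bone."""
    out = SkeletonFrame(joints={})
    for src_name, pose in captured.joints.items():
        dst_name = bone_map.get(src_name)
        if dst_name is not None:                 # skip unmapped joints
            out.joints[dst_name] = JointPose(rotation=pose.rotation)
    return out
```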
The composition rendering module 120 is configured to process and render the animation assets with a real-time rendering engine to obtain a rendering result containing the virtual characters and virtual scenes.
In some examples of the invention, each module of the animation system 100 is a separate hardware device: for example, the actor shooting unit 112 and the scene shooting unit 113 include cameras, while the model making unit 111 and the composition rendering module 120 are computers with the corresponding software installed, and the hardware devices are communicatively connected to exchange data and signals such as animation assets. In other examples, the modules of the animation system 100 are different modules within one hardware device.
With the animation production system 100 provided by the invention, virtual scenes and characters are shot with traditional filming methods, which lowers the entry barrier, lets a large pool of traditional animation staff (also called operators or users) get started easily, and reduces learning costs.
FIG. 2 is a schematic block diagram of the composition rendering module 120 in one embodiment of the animation system 100 of the present invention. Referring to FIG. 2, in some embodiments the composition rendering module 120 includes one or more of a manual processing unit 121, an automatic processing unit 122, a preview unit 123, and a rendering unit 124.
The manual processing unit 121 provides interfaces for manually processing animation assets and/or controlling virtual objects such as virtual characters and virtual scenes, receives one or more user instructions, and processes the corresponding assets and/or controls the characters and scenes according to those instructions, so that the user can steer and adjust the rendering result until it meets the requirements. Note that the interfaces mentioned here include, but are not limited to, graphical interactive interfaces, virtual controls, and physical buttons; the user instructions are received through these interfaces. Optionally, the user instructions are one or more adjustment operations that the user performs on an animation asset or a virtual object according to the production requirements.
The automatic processing unit 122 automatically processes some of the animation assets, or automatically controls virtual objects such as virtual characters and virtual scenes, according to a preset program and/or other animation assets.
The preview unit 123 displays the effect of the processing in real time while the animation assets are being processed, so that the user can adjust it on the fly.
The rendering unit 124 renders the processed animation assets to obtain the rendering result.
Virtual characters and virtual scenes simulate real physical properties through various materials (also called attributes), and a material ball is the umbrella term for the bundle of attributes that make up a material. In some embodiments, the composition rendering module 120 includes a preset unit (not shown) that lets the user preset attributes of virtual characters and virtual scenes and store them, so that the presets can be applied directly during actual processing and rendering.
In an animation project, once a material has been created it may be applied to many objects. Because those objects all reference the same material, modifying its parameters changes every object that uses it. Often this is not what is wanted: the goal is to change a material's parameters in one scene without changing them in any other scene.
To this end, in some embodiments the composition rendering module 120 includes a material change recording unit (not shown). The material change recording unit records the changes applied to a material on one virtual object (referred to here as the first object), so that processing and rendering use the recorded changes; the original attributes of the material are therefore never modified, and its use on other virtual objects (the second objects) is unaffected. A virtual object is a virtual character or a virtual scene.
Specifically, the attribute changes that the user's adjustments cause to each material of a virtual scene (the first object), including its virtual props and the virtual characters in it, are recorded as the material changes belonging to that scene. For any other virtual scene (a second object) that uses one or more of the same materials, the user's adjustments to the first object have no effect, because the original attributes of those shared materials were never changed. For example, suppose a material's color attribute is set to red and the material is assigned to a bead in scene 1 and a bead in scene 2, so both beads are red. If the material's color were changed to green directly, as in the prior art, the beads in both scenes would turn green; with the animation system 100 of this example, the bead in scene 1 can be turned green while the bead in scene 2 stays red.
With the animation production system 100 provided by the invention, only one material ball needs to be made: the changed attributes are recorded per scene, and at preview or render time the original material ball plus the recorded changes reproduce the effects of many scenes, saving storage. A material's parameters can also be changed for one scene without changing them for any other scene.
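A minimal sketch of this idea, assuming a shared base material plus a per-object change record (the class names Material and MaterialOverride are invented for illustration and do not come from the patent):

```python
class Material:
    """Shared base material; its original attributes are never mutated."""
    def __init__(self, **attrs):
        self.attrs = dict(attrs)

class MaterialOverride:
    """Records the changes to a material for one virtual object only."""
    def __init__(self, base: Material):
        self.base = base
        self.changes = {}             # recorded deltas, e.g. {"color": "green"}

    def set(self, key, value):
        self.changes[key] = value     # record the change; the base stays untouched

    def resolved(self) -> dict:
        # At preview/render time: original attributes + recorded changes.
        return {**self.base.attrs, **self.changes}

# The red/green bead example from the text:
red = Material(color="red")
bead_scene1, bead_scene2 = MaterialOverride(red), MaterialOverride(red)
bead_scene1.set("color", "green")
assert bead_scene1.resolved()["color"] == "green"   # scene 1 turns green
assert bead_scene2.resolved()["color"] == "red"     # scene 2 is unaffected
```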
Further, in some embodiments, the manual processing unit 121 includes one or more of a weather change subunit 1211, a character dynamic adjustment subunit 1212, a frame color correction subunit 1213, a special effects subunit 1214, and a lighting subunit 1215.
The weather change subunit 1211 provides an interface for adjusting the weather of the virtual scene in real time, receives a first user instruction, and adjusts the weather accordingly. FIG. 3 is a schematic diagram of changing the weather of a virtual scene with the weather change subunit 1211 according to an embodiment of the present invention. Referring to FIG. 3, several weather modules are prepared in advance; when the weather of the virtual scene needs to change, the weather change subunit 1211 switches it by loading the corresponding module. Optionally, a weather module contains the projection angle, brightness, color, shadow effect, and volumetric light of the sky light, plus the form of the sky box, so the scene can switch between spring, summer, autumn, and winter, morning and evening, cloudy, sunny, overcast, and so on. The weather modules may be prefabricated and stored in the system, or user-defined. Modularizing the weather simplifies scene changes down to a one-click switch, and the scene weather can be adjusted on the fly and previewed at any moment.
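As an illustrative sketch only (the field names and preset values are assumptions, not the patent's data model), a weather module can be pictured as a preset bundle of sky-light and skybox parameters applied in one call:

```python
from dataclasses import dataclass

@dataclass
class WeatherModule:
    sun_angle: float          # projection angle of the sky light, degrees
    brightness: float
    color: str
    shadows: bool
    volumetric_light: bool
    skybox: str

# Prefabricated (or user-defined) presets, stored in the system.
PRESETS = {
    "summer_noon": WeatherModule(70.0, 1.0, "#FFF4D6", True, False, "clear_sky"),
    "autumn_dusk": WeatherModule(10.0, 0.4, "#FFB070", True, True, "overcast"),
}

def apply_weather(scene, name: str):
    """One-click switch: load the named weather module onto the scene."""
    scene.weather = PRESETS[name]
```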
The character dynamic adjustment subunit 1212 provides an interface for adjusting the appearance of the virtual character in real time, receives a second user instruction, and adjusts the appearance accordingly. Appearance includes the style and color of skin, clothing, hair, and so on. FIG. 4 is a schematic diagram of dynamically adjusting a virtual character's material with the character dynamic adjustment subunit 1212 according to an embodiment of the present invention. Referring to FIG. 4, the color of a virtual character's skin and clothes is determined by its materials. Assigning a material to a model creates a reference to that material, so in the general case every model referencing it shares the one underlying material; modifying the material's attributes on one model is equivalent to modifying the material itself, and every place it is used changes. This is not what is wanted during animation production. The character dynamic adjustment subunit 1212 provided by the invention is therefore specifically configured so that, when one or more materials of the virtual character are adjusted, the original attributes are not changed; instead the changes are recorded, and the character is rendered according to them at render time. In this specific example the character dynamic adjustment subunit 1212 works on the same technical principle as the material change recording unit described above; the character's materials are encapsulated on top of that mechanism, so the user adjusts categories such as skin, clothes, and hair rather than individual materials. With the character dynamic adjustment subunit 1212, a material change affects only the current character in the current scene, and the material itself is unchanged.
The frame color correction subunit 1213 provides an interface for color-correcting the whole frame previewed in real time and/or individual objects within it, receives a third user instruction, and applies the correction in real time. FIG. 5 is a schematic diagram of adjusting a frame with the frame color correction subunit 1213 according to an embodiment of the present invention. Referring to FIG. 5, the frame color correction subunit 1213 provides a unified console (or operation panel) for the user, with interfaces for adjusting parameters such as the frame's color, saturation, and brightness, and for adjusting or adding lens parameters such as depth of field, blur, and filters. The subunit is not limited to these interfaces: they are hot-pluggable, modules are loaded only as needed, and new modules can be added. Because every interface is unified into one console and the adjusted parameters are previewed in real time, the director can see the desired look immediately and modify the frame without waiting for a render.
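One way to picture the hot-pluggable console (a sketch under our own assumptions; the registry and module names are invented) is a registry from which adjustment modules are loaded only on demand:

```python
class Console:
    """Unified grading console with hot-pluggable adjustment modules."""
    def __init__(self):
        self._factories = {}   # module name -> factory
        self._loaded = {}

    def register(self, name, factory):
        self._factories[name] = factory      # new modules can join at runtime

    def load(self, name):
        if name not in self._loaded:         # load only what is needed
            self._loaded[name] = self._factories[name]()
        return self._loaded[name]

console = Console()
console.register("color", lambda: {"saturation": 1.0, "brightness": 1.0})
console.register("lens", lambda: {"depth_of_field": 0.0, "blur": 0.0})
color = console.load("color")
color["saturation"] = 1.2    # previewed in real time by the engine
```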
The special effect subunit 1214 is configured to provide an interface for adding or adjusting a special effect in the virtual scene in real time, receive a fourth instruction from the user, and adjust the special effect in the virtual scene in real time according to the fourth instruction from the user.
The lighting subunit 1215 is configured to provide an interface for adjusting the lighting parameters in the virtual scene in real time, receive a fifth instruction from the user, and adjust the lighting effect in the virtual scene in real time according to the fifth instruction from the user. Optionally, the light parameter includes a motion parameter of the light, an irradiation direction of the light, and the like.
In some embodiments, the asset acquisition module 110 further comprises an audio acquisition unit (not shown) for acquiring audio material, and the automatic processing unit 122 includes an automatic lighting subunit 1221 that automatically generates or adjusts the lighting parameters of the virtual scene according to the audio material, so that the lights in the scene change with the rhythm of the audio. Optionally, the lighting parameters include the brightness, color, and so on of the lights. With the animation production system 100 provided by the invention, the lights in a virtual scene change their brightness, color, and other attributes with the rhythm of the music, so lighting effects are produced automatically at one click, improving efficiency and cutting costs.
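A minimal sketch of rhythm-driven lighting, assuming the audio material has already been decoded to mono samples; the per-frame RMS energy used here is a naive stand-in for whatever rhythm analysis the subunit actually performs:

```python
import numpy as np

def light_brightness_curve(samples: np.ndarray, sample_rate: int,
                           fps: int = 30) -> np.ndarray:
    """Return one light brightness value per animation frame."""
    hop = sample_rate // fps                        # audio samples per frame
    n_frames = len(samples) // hop
    energy = np.array([np.sqrt(np.mean(samples[i * hop:(i + 1) * hop] ** 2))
                       for i in range(n_frames)])   # RMS energy per frame
    peak = energy.max() if n_frames else 1.0
    return 0.2 + 0.8 * energy / max(peak, 1e-9)     # keep a base glow of 0.2
```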
In some embodiments, the composition rendering module 120 includes a coordinate matching unit (not shown) configured to match the coordinates of the real scene and the virtual scene according to the model data of the photographed real scene and of the virtual scene. In this embodiment, photographing the real scene with the scene shooting unit 113 serves to align the coordinates of the real and virtual scenes: when the actors perform in an empty real set, the director's monitor shows the virtual character in the virtual scene, which makes it possible to design and adjust camera placement.
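The patent does not say how the matching is computed; one standard possibility, sketched here as an assumption, is to measure N reference points in both the photographed real scene and the virtual scene and estimate the rigid transform between them (the Kabsch algorithm):

```python
import numpy as np

def match_coordinates(real_pts: np.ndarray, virt_pts: np.ndarray):
    """Estimate R, t such that virt ≈ R @ real + t, from (N, 3) point pairs."""
    rc, vc = real_pts.mean(axis=0), virt_pts.mean(axis=0)
    H = (real_pts - rc).T @ (virt_pts - vc)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = vc - R @ rc
    return R, t
```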
Note that in practice a finished animation consists of many shots, and the producer's work is to shoot the shots, process them, and finally cut them together. When animation is produced with the animation production system 100 of the invention, shooting a shot means using the units of the asset acquisition module 110 for actor capture, scene photography, and so on, and processing a shot means using the units of the composition rendering module 120 to work on the frame.
FIG. 6 is a schematic block diagram of another embodiment of the animation system 100 of the present invention. Referring to FIG. 6, in some embodiments the animation system 100 of this example further includes an asset management module 130, communicatively coupled to one or more asset acquisition modules 110 and one or more composition rendering modules 120. Optionally, the asset management module 130 communicates directly with the units of the asset acquisition module 110 and of the composition rendering module 120. Optionally, the asset management module 130 is a server.
The asset management module 130 is configured to manage the animation assets and one or more shots in a unified manner. The shots comprise the processed animation assets produced and transmitted by the composition rendering module 120, as shots to be rendered, and/or the rendering results rendered and transmitted by the composition rendering module 120, as rendered shots.
FIG. 7 is a schematic block diagram of the asset management module 130 according to one embodiment of the invention. Referring to FIG. 7, in some embodiments the asset management module 130 includes an asset access unit 131 that: provides interfaces for reading animation assets and shots and interfaces for receiving animation assets and shots; sends out the requested assets or shots in response to read requests entered by a user or generated automatically by the asset acquisition module 110 and its units or the composition rendering module 120 and its units; and receives and stores incoming animation assets and shots. Providing these access interfaces may mean giving one or more users an interface for accessing data through a web page or client, and may also mean giving the asset acquisition module 110 and the composition rendering module 120 interfaces for exchanging data with the asset management module 130 automatically. A producer can thus take the needed resources from the asset management module 130, work on them, and return them when done. The read requests generated automatically by the system's modules and units include, but are not limited to, prestored settings and the various animation assets used by the automatic processing unit 122.
Further, in some optional examples, storing the received animation assets and shots specifically includes storing their latest versions and their history. The history may consist of complete past versions of the assets and shots, or, if complete past versions are not kept, of their version information only.
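A sketch of this version bookkeeping under our own assumptions (the class and field names are invented): the latest version is always kept whole, while the history keeps either full copies or version records only, matching the two options above:

```python
import datetime

class AssetStore:
    def __init__(self, keep_full_history: bool = False):
        self.latest = {}      # asset/shot id -> latest payload
        self.history = {}     # asset/shot id -> list of version records
        self.keep_full_history = keep_full_history

    def put(self, asset_id: str, payload: bytes, author: str):
        record = {"time": datetime.datetime.now().isoformat(),
                  "author": author}
        if self.keep_full_history:
            record["payload"] = payload       # keep the complete past version
        self.history.setdefault(asset_id, []).append(record)
        self.latest[asset_id] = payload       # the latest version is always kept

    def get(self, asset_id: str) -> bytes:
        return self.latest[asset_id]
```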
In some embodiments, the asset management module 130 includes a process monitoring unit 132 for performing one or more of the following:
acquiring production process information, including the production progress of each shot and its current person in charge; optionally, this information is obtained directly from the asset acquisition module 110 or the composition rendering module 120, or derived by analyzing the animation assets and shots stored in the asset management module 130;
displaying the production process information, optionally online and in real time, and optionally showing the whole production process or just the shot production process the user asks to see;
providing an interface for setting task lists and priorities, so that users can set them online;
and showing the animation assets used by each shot and the position of each shot in the whole film, which makes searching easier.
In some embodiments, the animation assets are managed centrally by the server and appear in every link of the production process. In each link a producer can take the needed resources from the server, work on them, and put them back when finished, and shots whose production has been confirmed can be rendered into the film.
Optionally, the asset management module 130 also contains the same rendering engine as the composition rendering module 120. In some embodiments, the asset management module 130 includes an offline rendering unit 133 that receives an offline rendering request, sent by a user or generated automatically by the system, renders the shots to be rendered with the rendering engine according to the request, and stores the result on the server. The offline rendering unit 133 further supports displaying rendering progress, recording and displaying rendering history, optimizing, pausing, or cancelling a render, and selecting the output format; finished renders can be viewed and downloaded online. By providing the offline rendering unit 133, a user can hand a prepared shot to the server for rendering, which does not tie up the artist's machine or time and lets the artist move on to the next task.
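As an illustrative sketch (the queue, the status strings, and render_fn are assumptions; the patent does not expose a rendering-engine API), the offline rendering unit can be pictured as a background job queue on the server:

```python
import queue
import threading

class OfflineRenderer:
    """Server-side render queue: submitted shots render in the background."""
    def __init__(self, render_fn, store):
        self.jobs = queue.Queue()
        self.progress = {}                    # shot id -> status string
        self._render_fn, self._store = render_fn, store
        threading.Thread(target=self._worker, daemon=True).start()

    def submit(self, shot_id: str, shot_data):
        self.progress[shot_id] = "queued"
        self.jobs.put((shot_id, shot_data))

    def _worker(self):
        while True:
            shot_id, shot_data = self.jobs.get()
            self.progress[shot_id] = "rendering"
            self._store[shot_id] = self._render_fn(shot_data)
            self.progress[shot_id] = "done"   # now viewable/downloadable online
```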
Note that the asset management module 130 need not include the asset access unit 131, the process monitoring unit 132, and the offline rendering unit 133 all at once; it may include any one or more of them.
With the animation production system 100 provided by the invention, the asset management module 130 lets many people work in parallel, keeps the assets intact, displays the whole production process, avoids duplicated work, and meets the needs of large multi-person animation pipelines.
In some embodiments of the present invention, the animation system 100 includes a server device and one or more client devices communicatively coupled to the server device.
Each client device handles one or more shots. A client device includes at least one of the asset acquisition module 110 and the composition rendering module 120: the asset acquisition module 110 acquires the shot material to be processed, and the composition rendering module 120 processes that material to obtain a shot. The shots comprise the animation assets processed by the composition rendering module 120, as shots to be rendered, and/or the rendering results it produces, as rendered shots.
The server device includes the asset management module 130, which manages all shot materials and shots in a unified manner. Optionally, staff can log in to the server device through a web page. Optionally, in some examples, the server device also contains the same composition rendering module 120 as the client devices.
The animation production system 100 of the invention facilitates multi-person and multi-party collaboration and forms an industrialized production line.
In some embodiments, the animation system 100 of these examples further includes one or more editing modules or editing units for cutting the media data in the animation assets or the rendering results. In an optional embodiment, the composition rendering module 120 includes an asset editing unit (not shown) providing an interface for cutting the captured or processed animation assets, and the animation system 100 further comprises a film editing module (not shown) for cutting the rendering results into the finished film. In this example the first cut is done inside the rendering engine and the second cut is done on the rendered frames, i.e. all rendered segments are joined together in the editing software.
In other embodiments, rather than the animation system 100 containing a separate editing module, the composition rendering module 120 contains one or more editing units (not shown) that cut the captured animation assets for further processing and rendering, and cut the rendering results into the finished film. In this embodiment a real-time rendering engine is used, and apart from model making, every step, including editing, special effects, compositing, color correction, and rendering, is done on the same platform.
In a traditional animation pipeline, many software packages must cooperate: modeling software builds the model and renders an animation video, editing software cuts it, special-effects software creates the effects and renders them, the cut video and effects video are imported into compositing software and combined, the result is imported into color-correction software for grading, and finally the film is rendered and output. The compatibility and interoperability problems of passing information between all these programs are complicated and error-prone.
The production method of the animation system 100 of this example is divided into early asset accumulation, middle compositing and rendering, and late editing into the film. Early asset accumulation needs only one piece of modeling software plus motion-data capture, the middle compositing and rendering happens entirely inside one real-time rendering engine, and the late editing needs only one piece of editing software. This reduces the number of tools and the error rate of information passed between them.
In traditional film production, fixing a mistake discovered after rendering is complicated; with the scheme of this embodiment of the invention, mistakes can be corrected at any time, which gives the pipeline maneuverability and flexibility.
Shooting with the animation production system 100 of this embodiment is flexible: the actor performs once and the scene can be shot many times. The virtual scene is built in virtual space, and after the performer finishes, real-time motion capture transfers the motion data to the virtual character, which fully reproduces the performer's movements. Because the virtual camera sits in virtual space and is unconstrained by time and place, it can shoot from any angle and direction and can reshoot repeatedly, with no fear of a botched take or a forgotten shot.
FIG. 8 is a schematic flow chart of one embodiment of the animation production method of the present invention. Referring to FIG. 8, the animation production method of this embodiment mainly includes the following steps:
step S11, the animation asset is obtained. The method specifically comprises the following steps:
making model data and/or obtaining pre-made model data, wherein the model data comprises but is not limited to models of virtual objects such as virtual characters, virtual scenes, virtual props and the like;
capturing an actor with real-time motion capture to obtain character motion data and transfer it to a virtual character, so that the virtual character reproduces the actor's movements;
and photographing a real scene so that it can be used in the animation, for example to generate a virtual scene or to process one. Optionally, before obtaining the animation assets in step S11, the method further includes: determining the script.
Step S12: processing and rendering the animation assets with the real-time rendering engine to obtain a rendering result containing the virtual characters and virtual scenes.
In some embodiments, the aforementioned step S12 includes one or more of the following steps:
step one, providing interfaces for manually processing animation assets and controlling virtual characters and virtual scenes, receiving one or more user instructions, and processing the corresponding animation assets and/or controlling the virtual characters and virtual scenes according to those instructions, so that the user can steer and adjust the rendering result;
step two, automatically processing other animation assets, or automatically controlling the virtual characters and virtual scenes, according to a preset program and/or some of the animation assets;
step three, displaying the effect of the processing in real time while the animation assets are being processed, so that the user can adjust it on the fly;
step four, rendering the processed animation assets to obtain the rendering result.
It should be noted that the sequence of the first step to the fourth step is not limited.
In some embodiments, the aforementioned step S12 includes: recording the changes applied to a material on one virtual object and processing and rendering with the recorded changes, so that the original attributes of the material are not modified and its use on other virtual objects is unaffected. The virtual object is a virtual character or a virtual scene.
In some embodiments, the aforementioned step one comprises one or more of the following steps:
receiving a first instruction of the user through the provided weather adjustment interface, and adjusting the weather of the virtual scene in real time according to it;
receiving a second instruction of the user through the provided character adjustment interface, and adjusting the appearance of the virtual character in real time according to it;
receiving a third instruction of the user through the provided frame color correction interface, and color-correcting the whole frame previewed in real time and/or individual objects within it according to it;
receiving a fourth instruction of the user through the provided special effects interface, and adding or adjusting special effects in the virtual scene in real time according to it;
and receiving a fifth instruction of the user through the provided lighting parameter interface, and adjusting the lighting effects in the virtual scene in real time according to it.
It should be noted that the order of the above-mentioned steps included in the first step is not limited.
Further, in some embodiments, adjusting the appearance of the virtual character in real time through the character adjustment interface specifically includes: when a material of the virtual character is adjusted, not changing the original attributes of the material but recording the changes, so that at render time the virtual character is rendered according to the recorded changes.
In some embodiments, the aforementioned step S11 further includes obtaining audio material, and step two includes: automatically generating or adjusting the lighting parameters of the virtual scene according to the audio material, so that the lights in the scene change with the rhythm of the audio.
In some embodiments, the aforementioned step S12 includes: matching the coordinates of the real scene and the virtual scene according to the model data of the photographed real scene and of the virtual scene.
In some embodiments, the animation method of the present examples further comprises: the animation assets and one or more mirrors are managed in a unified manner. Wherein, this minute mirror includes: and the processed animation assets are used as the split mirror to be rendered, and/or rendering results obtained by rendering are used as rendered split mirrors.
In some embodiments, the aforementioned unified management of animation assets and shots specifically comprises: providing an interface for reading the animation assets and an interface for reading the shots; sending out the read animation assets/shots according to a read request input by the user or generated automatically by the system; providing an interface for receiving animation assets and an interface for receiving shots; and storing the received animation assets and the received shots.
Further, in some optional examples, storing the received animation asset comprises: storing both the latest version and the historical versions of the animation asset, as sketched below.
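The read/receive interfaces plus version retention suggest a small content store keyed by asset or shot id, where every write appends a new version. A minimal sketch (AssetStore and its methods are hypothetical):

```python
class AssetStore:
    """Unified store for animation assets and shots, keeping full history."""
    def __init__(self):
        self._history = {}   # asset_id -> [oldest, ..., latest]

    def receive(self, asset_id: str, payload) -> None:
        # Interface for receiving an asset/shot: append as the new latest version.
        self._history.setdefault(asset_id, []).append(payload)

    def read(self, asset_id: str, version: int = -1):
        # Interface for reading: latest by default, or any historical version.
        return self._history[asset_id][version]

# Usage: the latest version and the history are both retained.
store = AssetStore()
store.receive("shot_012", {"status": "to_render"})
store.receive("shot_012", {"status": "rendered"})
assert store.read("shot_012")["status"] == "rendered"
assert store.read("shot_012", 0)["status"] == "to_render"
```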
In some embodiments, the aforementioned unified management of animation assets comprises performing one or more of the following (see the record sketch after this list):
acquiring production workflow information, including the production progress of each shot and the person responsible for it;
displaying the production workflow information;
providing an interface for setting task lists and priorities;
and showing the animation assets used by each shot and the position of each shot in the whole film.
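These workflow items reduce to a per-shot record of progress, owner, priority, assets used, and position in the film. One possible shape for that record (all field names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class ShotStatus:
    shot_id: str
    owner: str                 # person responsible for the shot
    progress: float = 0.0      # production progress, 0.0 .. 1.0
    priority: int = 0          # set via the task-list interface
    assets_used: list = field(default_factory=list)
    film_position: int = 0     # index of the shot in the whole film

# Usage: display the board ordered by priority.
board = [ShotStatus("s01", "zhou", 0.8, priority=1, film_position=1),
         ShotStatus("s02", "tang", 0.2, priority=2, film_position=2)]
for s in sorted(board, key=lambda s: s.priority):
    print(f"{s.shot_id} ({s.owner}): {s.progress:.0%}")
```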
In some embodiments, the aforementioned unified management of animation assets comprises: receiving an offline rendering request, rendering the shot to be rendered according to that request, and storing the rendering result.
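Offline rendering is then a queue-and-store operation: accept a request, render the shot outside the interactive session, and persist the result. A sketch that reuses the hypothetical AssetStore above and a stand-in for the engine call:

```python
import queue

class OfflineRenderService:
    """Accepts offline render requests and stores the results."""
    def __init__(self, store):
        self.requests = queue.Queue()
        self.store = store                 # e.g. the AssetStore sketched above

    def submit(self, shot_id: str, assets: dict) -> None:
        # Receive an offline rendering request.
        self.requests.put((shot_id, assets))

    def drain(self) -> None:
        # Render each queued shot and store the rendering result.
        while not self.requests.empty():
            shot_id, assets = self.requests.get()
            frames = [f"hi-quality frame of {shot_id}"]  # stand-in for engine call
            self.store.receive(shot_id, {"status": "rendered", "frames": frames})
```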
In some embodiments, the animation method of the present embodiments further comprises: editing the media data. As a specific example, after the aforementioned step S11, the method further includes: editing the collected or processed animation assets so that the real-time rendering engine can process or render them; and after the aforementioned step S12, the method further includes: editing the rendering result to obtain the finished clips.
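Editing on either side of the render step can be sketched as in/out trimming over a frame sequence; the helper below is illustrative only, not the claimed editing module:

```python
def trim(frames: list, in_point: int, out_point: int) -> list:
    """Cut a clip to the frames between in_point (inclusive) and out_point (exclusive)."""
    if not 0 <= in_point < out_point <= len(frames):
        raise ValueError("in/out points outside the clip")
    return frames[in_point:out_point]

# Usage: keep a one-second clip (frames 30..59) from a rendered shot at 30 fps.
clip = trim([f"f{i}" for i in range(120)], 30, 60)
print(len(clip))  # 30
```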
It should be noted that, for all relevant details and beneficial effects of the functional modules in the foregoing system embodiment, reference may be made to the description of the corresponding steps in the method embodiment; they are not repeated here.
Embodiments of the present invention further provide a computer storage medium storing computer instructions which, when executed on a device, cause the device to perform the above related method steps so as to implement the animation method of the above embodiments.
Embodiments of the present invention also provide a computer program product which, when run on a computer, causes the computer to perform the above related steps so as to implement the animation method of the above embodiments.
In addition, embodiments of the present invention further provide an apparatus, which may specifically be a chip, a component, or a module, and which may include a processor and a memory connected to each other; the memory stores computer-executable instructions, and when the apparatus runs, the processor executes the instructions stored in the memory so that the chip performs the animation method of the above method embodiments.
The computer storage medium, the computer program product, and the chip provided by the present invention are all used to perform the corresponding methods provided above; their beneficial effects are therefore those of the corresponding methods and are not repeated here.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (25)

1. An animation system, comprising:
an asset acquisition module, configured to acquire animation assets; and
a composition rendering module, configured to process and render the animation assets using a real-time rendering engine to obtain a rendering result containing a virtual character and a virtual scene;
wherein the asset acquisition module comprises an actor shooting unit, a scene shooting unit, and a model making unit; the model making unit is configured to make model data of the virtual character and the virtual scene and/or acquire model data made in advance; the actor shooting unit is configured to shoot actors through real-time motion capture to obtain character motion data and transfer the character motion data to the virtual character; and the scene shooting unit is configured to shoot a real scene so as to generate or process the virtual scene using the real scene.
2. The animation system of claim 1, wherein the composition rendering module comprises one or more of the following:
a manual processing unit, configured to provide an interface for manually processing the animation assets and/or controlling the virtual character and the virtual scene, receive a user instruction, and process the animation assets and/or control the virtual character and the virtual scene according to the user instruction, so that the user can control and adjust the rendering result;
an automatic processing unit, configured to automatically process other animation assets or automatically control the virtual character and the virtual scene according to a preset program and/or some of the animation assets;
a preview unit, configured to display the processing effect in real time while the animation assets are being processed;
and a rendering unit, configured to render the processed animation assets to obtain the rendering result.
3. The animation system of claim 1, wherein the composition rendering module comprises:
a material change recording unit, configured to record the changes made to a material applied to one virtual object, so that processing and rendering use the recorded changes, the original attributes of the material are not altered, and the material's application to other virtual objects is not affected; wherein the virtual object comprises a virtual character or a virtual scene.
4. The animation system of claim 2, wherein the manual processing unit comprises one or more of the following sub-units:
a weather subunit, configured to provide an interface for adjusting the weather of the virtual scene in real time, receive a first instruction of a user, and adjust the weather of the virtual scene in real time according to the first instruction;
a character dynamic adjustment subunit, configured to provide an interface for adjusting the appearance of the virtual character in real time, receive a second instruction of the user, and adjust the appearance of the virtual character in real time according to the second instruction;
a picture color correction subunit, configured to provide an interface for color-correcting the whole real-time preview picture and/or individual objects in the picture, receive a third instruction of the user, and perform the color correction in real time according to the third instruction;
a special effects subunit, configured to provide an interface for adding or adjusting special effects in the virtual scene in real time, receive a fourth instruction of the user, and adjust the special effects in the virtual scene in real time according to the fourth instruction;
and a lighting subunit, configured to provide an interface for adjusting lighting parameters in the virtual scene in real time, receive a fifth instruction of the user, and adjust the lighting effects in the virtual scene in real time according to the fifth instruction.
5. The animation system of claims 3 and 4, wherein the character dynamic adjustment subunit is specifically configured to:
when the material of the virtual character is adjusted, leave the original attributes of the material unchanged and instead record the changes to the material, so that the virtual character is rendered according to the recorded changes at render time.
6. The animation system of claim 2, wherein:
the asset acquisition module comprises an audio acquisition unit configured to acquire audio material; and
the automatic processing unit comprises an automatic lighting subunit configured to automatically generate or automatically adjust the lighting parameters of the virtual scene according to the audio material, so that the lighting in the virtual scene changes with the rhythm of the audio material.
7. The animation system of claim 1, wherein the composition rendering module comprises a coordinate matching unit configured to: match the coordinates of the real scene and the virtual scene according to the model data of the virtual scene and the model data of the real scene obtained by shooting.
8. The animation system of claim 1, further comprising an asset management module communicatively coupled to one or more of the asset acquisition modules and one or more of the composition rendering modules, configured to:
manage the animation assets and one or more shots in a unified manner; wherein a shot comprises the animation assets processed by the composition rendering module, serving as a shot to be rendered, and/or a rendering result produced by the composition rendering module, serving as a rendered shot.
9. The animation system of claim 8, wherein the asset management module comprises an asset access unit configured to: provide an interface for reading the animation assets/shots; send out the read animation assets/shots according to a read request input by a user or generated automatically by the system; provide an interface for receiving the animation assets/shots; and store the received animation assets/shots.
10. The animation system of claim 8, wherein the asset management module comprises a process monitoring unit configured to perform one or more of:
acquiring production workflow information, including the production progress of each shot and the person responsible for it;
displaying the production workflow information;
providing an interface for setting task lists and priorities;
and showing the animation assets used by each shot and the position of each shot in the whole film.
11. The animation system of claim 8, wherein the asset management module comprises an offline rendering unit configured to:
receive an offline rendering request, render the shot to be rendered according to the offline rendering request, and store the rendering result.
12. An animation system, comprising a server device and one or more client devices, wherein the client devices are communicatively coupled to the server device;
each client device is used for processing a shot and comprises at least one of the asset acquisition module and the composition rendering module of any one of claims 1 to 11; wherein the asset acquisition module is used for acquiring the shot material to be processed, and the composition rendering module is used for processing the shot material to obtain a shot;
and the server device comprises the asset management module of any one of claims 8 to 11, and is used for managing each piece of shot material and each shot in a unified manner.
13. A method of animation, the method comprising the steps of:
obtaining animation assets, comprising: making model data of a virtual character and a virtual scene and/or acquiring model data made in advance; shooting actors through real-time motion capture to obtain character motion data and transfer the character motion data to the virtual character; and shooting a real scene so as to generate or process the virtual scene using the real scene;
and processing and rendering the animation assets using a real-time rendering engine to obtain a rendering result containing the virtual character and the virtual scene.
14. The animation method of claim 13, wherein processing and rendering the animation assets using a real-time rendering engine comprises one or more of:
providing an interface for manually processing the animation assets and controlling the virtual character and the virtual scene, receiving a user instruction, and processing the animation assets and/or controlling the virtual character and the virtual scene according to the user instruction, so that the user can control and adjust the rendering result;
automatically processing other of the animation assets or automatically controlling the virtual character and the virtual scene according to a preset program and/or according to some of the animation assets;
displaying the processing effect in real time while the animation assets are being processed;
and rendering the processed animation assets to obtain the rendering result.
15. The animation method of claim 13, wherein processing and rendering the animation assets using a real-time rendering engine comprises:
recording the changes made to a material applied to one virtual object and using the recorded changes for processing and rendering, so that the original attributes of the material are not altered and the material's application to other virtual objects is not affected; wherein the virtual object comprises a virtual character or a virtual scene.
16. The animation method of claim 14, wherein providing the interface for manually processing the animation assets and controlling the virtual character and the virtual scene, so that the user can control and adjust the rendering result, comprises one or more of the following steps:
receiving a first instruction of the user through a provided weather adjustment interface, and adjusting the weather of the virtual scene in real time according to the first instruction;
receiving a second instruction of the user through a provided character adjustment interface, and adjusting the appearance of the virtual character in real time according to the second instruction;
receiving a third instruction of the user through a provided picture color correction interface, and color-correcting the whole real-time preview picture and/or individual objects in the picture according to the third instruction;
receiving a fourth instruction of the user through a provided special effects interface, and adding or adjusting special effects in the virtual scene in real time according to the fourth instruction;
and receiving a fifth instruction of the user through a provided lighting parameter adjustment interface, and adjusting the lighting effects in the virtual scene in real time according to the fifth instruction.
17. The animation method of claims 15 and 16, wherein adjusting the appearance of the virtual character in real time through the provided character adjustment interface comprises:
when the material of the virtual character is adjusted, leaving the original attributes of the material unchanged and instead recording the changes to the material, so that the virtual character is rendered according to the recorded changes at render time.
18. The animation method according to claim 14, wherein:
the obtaining of animation assets further comprises obtaining audio material; and
the automatically processing the animation assets or controlling the virtual character and the virtual scene according to a preset program and/or some of the animation assets comprises: automatically generating or automatically adjusting the lighting parameters of the virtual scene according to the audio material, so that the lighting in the virtual scene changes with the rhythm of the audio material.
19. The animation method of claim 13, wherein processing and rendering the animation assets using a real-time rendering engine comprises: matching the coordinates of the real scene and the virtual scene according to the model data of the virtual scene and the model data of the real scene obtained by shooting.
20. The animation method of claim 13, further comprising: managing the animation assets and one or more shots in a unified manner; wherein a shot comprises the processed animation assets, serving as a shot to be rendered, and/or a rendering result obtained by rendering, serving as a rendered shot.
21. The animation method of claim 20, wherein the unified management of the animation assets and the one or more shots comprises:
providing an interface for reading the animation assets/shots; sending out the read animation assets/shots according to a read request input by a user or generated automatically by the system; providing an interface for receiving the animation assets/shots; and storing the received animation assets/shots.
22. The animation method of claim 20, wherein the unified management of the animation assets and the one or more shots comprises performing one or more of the following:
acquiring production workflow information, including the production progress of each shot and the person responsible for it;
displaying the production workflow information;
providing an interface for setting task lists and priorities;
and showing the animation assets used by each shot and the position of each shot in the whole film.
23. The animation method of claim 20, wherein the unified management of the animation assets and the one or more shots comprises:
receiving an offline rendering request, rendering the shot to be rendered according to the offline rendering request, and storing the rendering result.
24. A computer storage medium comprising computer instructions which, when run on a device, cause the device to perform the animation method of any one of claims 13 to 23.
25. A computer program product which, when run on a computer, causes the computer to perform the animation method of any one of claims 13 to 23.
CN202010419549.XA 2020-05-18 2020-05-18 Animation system, animation method, storage medium, and program product Pending CN111598983A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010419549.XA CN111598983A (en) 2020-05-18 2020-05-18 Animation system, animation method, storage medium, and program product


Publications (1)

Publication Number Publication Date
CN111598983A 2020-08-28

Family

ID=72182867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010419549.XA Pending CN111598983A (en) 2020-05-18 2020-05-18 Animation system, animation method, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN111598983A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780678A (en) * 2017-02-03 2017-05-31 北京华严世界影业有限公司 A kind of simulation animation film making method and system complete in real time
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN108200445A (en) * 2018-01-12 2018-06-22 北京蜜枝科技有限公司 The virtual studio system and method for virtual image
CN110070594A (en) * 2019-04-25 2019-07-30 深圳市金毛创意科技产品有限公司 The three-dimensional animation manufacturing method that real-time rendering exports when a kind of deduction
CN110225224A (en) * 2019-07-05 2019-09-10 北京乐元素文化发展有限公司 Director method, the apparatus and system of virtual image

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112565555A (en) * 2020-11-30 2021-03-26 魔珐(上海)信息科技有限公司 Virtual camera shooting method and device, electronic equipment and storage medium
CN112565555B (en) * 2020-11-30 2021-08-24 魔珐(上海)信息科技有限公司 Virtual camera shooting method and device, electronic equipment and storage medium
CN113538640A (en) * 2021-07-08 2021-10-22 潘宁馨 Cartoon making method
CN114915855A (en) * 2022-04-29 2022-08-16 完美世界(北京)软件科技发展有限公司 Virtual video program loading method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 100022 13/F, 1212, Building 16, 89 Jianguo Road, Chaoyang District, Beijing
Applicant after: Beijing xingludong Technology Co.,Ltd.
Address before: 100022 1507, 12/F, Building 8, Courtyard 88, Jianguo Road, Chaoyang District, Beijing
Applicant before: Beijing Le Element Culture Development Co.,Ltd.
RJ01 Rejection of invention patent application after publication
Application publication date: 20200828