CN113808237B - Animation generation method and device - Google Patents

Animation generation method and device

Info

Publication number
CN113808237B
Authority
CN
China
Prior art keywords
animation
event
sub
track
target
Prior art date
Legal status
Active
Application number
CN202111113011.7A
Other languages
Chinese (zh)
Other versions
CN113808237A (en
Inventor
黄锦寿
刘澈
唐磊
方泽华
张智
Current Assignee
Zhuhai Kingsoft Digital Network Technology Co Ltd
Original Assignee
Zhuhai Kingsoft Digital Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Kingsoft Digital Network Technology Co Ltd filed Critical Zhuhai Kingsoft Digital Network Technology Co Ltd
Priority to CN202111113011.7A
Publication of CN113808237A
Application granted
Publication of CN113808237B

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an animation generation method and device. The animation generation method includes: determining an animation object based on a received animation creation instruction; reading an animation event created in advance for an animation track corresponding to the animation object; and creating a target animation file based on the animation object, the animation track and the animation event. By dividing the animation into these levels, animation creation becomes more intuitive and more interpretable, and the efficiency of animation creation is improved.

Description

Animation generation method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to an animation generating method and apparatus.
Background
With the development of animation technology, the design of animation scenes has become more and more complex. At present, when a complex animation scene is edited, the animation tracks used in the scene are scattered, and the user has to manually arrange or group them to indicate the functional relationships between the tracks. This reduces both the efficiency of animation creation and the user's animation creation experience.
Disclosure of Invention
In view of the above, embodiments of the present application provide an animation generation method and apparatus, a computing device and a computer readable storage medium, so as to solve the technical drawbacks in the prior art.
According to a first aspect of an embodiment of the present application, there is provided an animation generation method, including:
determining an animation object based on the received animation creation instruction;
reading an animation event which is created in advance for an animation track corresponding to the animation object; and
creating a target animation file based on the animation object, the animation track and the animation event.
Optionally, the reading of the animation event created in advance for the animation track corresponding to the animation object includes:
reading a basic animation event created in advance for a main animation track corresponding to the animation object, and reading an extended animation event created in advance for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with the sub-animation track;
accordingly, the creating of the target animation file based on the animation object, the animation track and the animation event includes:
creating a target animation file based on the animation object, the main animation track, the basic animation event, the sub-animation track and the extended animation event.
Optionally, in a case that a preset type of animation resource is preconfigured, before the creating the target animation file based on the animation object, the animation track and the animation event, the method further includes:
taking the animation object, the animation track and the animation event as animation elements;
Determining a target animation resource of the preset type associated with at least one target animation element in the animation elements, and reading resource information of the target animation resource;
Updating the at least one target animation element through the resource information according to the association relationship between the target animation resource and the at least one target animation element;
updating the animation object, the animation track and the animation event based on the updated at least one target animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
Optionally, the at least one target animation element is determined by:
and selecting an animation element associated with the animation resource of the preset type as at least one target animation element in the animation object, the animation track and the animation event.
Optionally, in a case that a prefab resource is preconfigured, before the creating the target animation file based on the animation object, the animation track and the animation event, the method further includes:
taking the animation object, the animation track and the animation event as animation elements;
determining a target prefab resource associated with at least one prefab animation element in the animation elements, and determining a resource path of the target prefab resource;
updating the at least one prefab animation element through the resource path according to the association relationship between the target prefab resource and the at least one prefab animation element;
updating the animation object, the animation track and the animation event based on the updated at least one prefab animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
Optionally, any one of the animation sub-events is queried by the following method:
receiving an event inquiry instruction submitted by a user in an event operation area;
traversing the event quadtree corresponding to the event operation area, and comparing the position information in the event inquiry instruction with the event operation subarea corresponding to the node in the event quadtree;
wherein the event operation sub-region is the position, within the event operation area, of the operation sub-region corresponding to the animation sub-event in the animation event;
and determining and displaying a target animation sub-event in the animation sub-events according to the comparison result.
Optionally, the event quadtree is updated by:
After any one animation sub-event in the animation event is created, determining a target node corresponding to an event operation sub-region of the any one animation sub-event in the event quadtree;
And updating the target node according to the event operation sub-region of any animation sub-event to obtain an updated event quadtree.
Optionally, any one of the animation objects is created by:
in response to a received object creation instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type;
And creating an animation object corresponding to the target animation object type based on the target animation object type selected in the animation object type list.
Optionally, any one of the sub-animation tracks is created by:
receiving a sub-track creation instruction aiming at any one of the animation objects;
Responding to the sub-track creation instruction, and displaying a sub-track type list associated with the object attribute of any one animation object;
Creating a sub-animation track corresponding to the target sub-track type based on the target sub-track type selected in the sub-track type list.
Optionally, the target animation object is determined by:
and selecting, from among the animation objects, an animation object associated with a sub-animation track as the target animation object.
Optionally, any one of the basic animation sub-events is created by:
Receiving, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to the animation object;
responding to the basic sub-event creation instruction, and displaying a basic sub-event type list corresponding to the main animation track corresponding to the animation object;
And creating a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected in the basic sub-event type list.
Optionally, any one of the extended animation sub-events is created by:
aiming at any one target animation object in the target animation objects, receiving an expansion sub-event creation instruction aiming at a sub-animation track associated with the any one target animation object;
Responding to the extended sub-event creation instruction, and displaying a corresponding extended sub-event type list of a sub-animation track associated with any one target animation object;
Creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected in the extended sub-event type list.
According to a second aspect of an embodiment of the present application, there is provided an animation generation device including:
a determining module configured to determine an animation object based on the received animation creation instruction;
the reading module is configured to read an animation event which is created in advance for an animation track corresponding to the animation object;
A creation module configured to create a target animation file based on the animation object, the animation track, and the animation event.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the animation generation method when executing the computer instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the animation generation method.
The animation generation method provided by the embodiments of the present application determines the created animation object based on the received animation creation instruction, reads the animation event created in advance for the animation track corresponding to the animation object, and creates the target animation file based on the animation object, the animation track and the animation event according to their hierarchical structure. By dividing the animation into these levels, the animation creation process becomes more intuitive and more interpretable, and the efficiency of animation creation is improved.
Drawings
FIG. 1 is a block diagram of a computing device provided by an embodiment of the present application;
FIG. 2 is a flow chart of an animation generation method provided by an embodiment of the present application;
FIG. 3 is an interface schematic of an animation editor in an animation generation method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an event handling area in an animation generation method according to an embodiment of the present application;
FIG. 5 is a process flow diagram of an animation generation method applied to an animation editor provided by an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an animation generating device according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than those herein described, and those skilled in the art will readily appreciate that the present application may be similarly embodied without departing from the spirit or essential characteristics thereof, and therefore the present application is not limited to the specific embodiments disclosed below.
The terminology used in the one or more embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the application. As used in one or more embodiments of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, without departing from the scope of one or more embodiments of the application, "first" may also be referred to as "second", and similarly, "second" may also be referred to as "first". Depending on the context, the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".
In the present application, an animation generation method and apparatus, a computing device, and a computer-readable storage medium are provided, and detailed description is given one by one in the following embodiments.
FIG. 1 illustrates a block diagram of a computing device 100, according to an embodiment of the application. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120. Processor 120 is coupled to memory 110 via bus 130 and database 150 is used to store data.
Computing device 100 also includes access device 140, access device 140 enabling computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. The access device 140 may include one or more of any type of network interface, wired or wireless (e.g., a Network Interface Card (NIC)), such as an IEEE802.11 Wireless Local Area Network (WLAN) wireless interface, a worldwide interoperability for microwave access (Wi-MAX) interface, an ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the application, the above-described components of computing device 100, as well as other components not shown in FIG. 1, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device shown in FIG. 1 is for exemplary purposes only and is not intended to limit the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
Wherein the processor 120 may perform the steps of the animation generation method shown in fig. 2. Fig. 2 shows a flowchart of an animation generation method according to an embodiment of the present application, which specifically includes the following steps:
Step 202, determining an animation object based on the received animation creation instruction.
The animation creation instruction is an instruction for creating a playable animation file, and an animation object can be understood as an animation resource such as an animation character, a camera or an animation prop. In practical applications, when the animation editor receives the animation creation instruction, it needs to determine the animation data used to create the animation file and then create the animation file based on that data. The animation editor may be a standalone animation editor, or may be a plug-in editor embedded in another animation editor, which is not limited herein.
In particular, in order to optimize the process of creating the animation by the user and facilitate the user to manage the animation elements (including the animation objects) in the animation, the animation objects may be created first, and then other animation elements that affect the animation objects may be created based on the created animation objects, so as to perform classification management on various animation elements based on the animation objects.
Therefore, how the animation objects are created has a great influence on the animation creation process: if the created animation objects classify the animation elements well and meet the creation requirements, the efficiency with which a user creates an animation can be greatly improved. In a specific implementation, the animation resources that are mainly or frequently used during animation creation can first be classified by function or category, and an animation object can then be created based on the resulting animation object types, specifically:
in response to a received object creation instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type;
and creating an animation object corresponding to the target animation object type based on the target animation object type selected in the animation object type list.
In practical applications, after the user submits the object creation instruction, the animation object types that can be created are presented to the user in the form of an animation object type list; specifically, the animation object type list includes animation object types such as a director type, a camera type, an actor type, a special effect type and a light type. Further, a director-type animation object is used to control the animation as a whole, for example shot switching and playback speed; a camera-type animation object determines the viewing angle and display range of the played animation; an actor-type animation object is a virtual object in the animation; a special-effect-type animation object determines the special effects in the animation; and a light-type animation object determines the lighting effects in the animation. In addition, the animation object type list may also include a scene type used for setting the scene, and the like, which is not limited herein.
Based on this, the user can select the animation object type that he wants to create (i.e., the target animation object type) in the list of animation object types, and then can create an animation object based on the target animation object type.
In the implementation, the main animation object types are abstracted in an animation editor, and the animation objects required by the requirements of the animation scene can be created according to the animation object types in the animation editing process.
For example, an object creation instruction submitted by a user U in the animation editor is received, and in response to the object creation instruction, the animation object type list preset in the animation editor is displayed; the animation object type list includes a director type, a camera type, an actor type, a special effect type and a light type. The actor type selected by user U in the animation object type list is received as the target animation object type, and an actor-type animation object, actor 1, is created.
In summary, in response to the received object creation instruction, the animation object type list is displayed, and the animation object corresponding to the target animation object type is created based on the target animation object type selected in the animation object type list. Animation objects are thus created from pre-divided animation object types, the required type can be flexibly selected from among multiple animation object types, and the flexibility of creating animation objects is increased.
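To make this type-list-driven creation flow concrete, the following is a minimal Python sketch. The names AnimationObjectType, AnimationObject and create_animation_object are illustrative assumptions, not identifiers from the patent or any specific editor.

```python
# Illustrative sketch only: object creation driven by a preset type list (assumed names).
from dataclasses import dataclass
from enum import Enum


class AnimationObjectType(Enum):
    DIRECTOR = "director"   # controls the whole animation: shot switching, playback speed
    CAMERA = "camera"       # viewing angle and display range
    ACTOR = "actor"         # a virtual object in the animation
    EFFECT = "effect"       # special effects
    LIGHT = "light"         # lighting effects


@dataclass
class AnimationObject:
    name: str
    obj_type: AnimationObjectType


def create_animation_object(selected: AnimationObjectType, name: str) -> AnimationObject:
    """Create the animation object for the type the user picked from the list."""
    return AnimationObject(name=name, obj_type=selected)


# e.g. user U picks the actor type and "actor 1" is created
actor1 = create_animation_object(AnimationObjectType.ACTOR, "actor 1")
```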
Step 204, reading an animation event which is created in advance for the animation track corresponding to the animation object.
Specifically, on the basis of the animation objects determined above, the animation editor manages the other animation elements hierarchically, with the animation objects as the basis, so that the animation objects do not affect one another during editing and improper management of the animation elements does not increase the difficulty of animation creation. When creating the animation file, the animation editor can therefore read the corresponding animation elements according to these hierarchical relations and then create the animation file.
The animation track is a timeline set for the animation object, and can be used to manage the animation events of the animation object. An animation event refers to a preconfigured set of animation functions for causing the animation object to execute a specific animation function at a specific moment.
In the embodiment of the application, this is realized by reading a basic animation event which is created in advance for the main animation track corresponding to the animation object, and reading an extended animation event which is created in advance for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with a sub-animation track. Specifically, the animation tracks are divided into two track types, main animation tracks and sub-animation tracks, and the animation events are divided into two sets of event types, basic animation events and extended animation events; any basic animation sub-event in a basic animation event is an animation sub-event, and any extended animation sub-event in an extended animation event is likewise an animation sub-event.
In practical applications, each animation object corresponds to a main animation track, on which a plurality of basic animation sub-events can be created or configured; these basic animation sub-events form a basic animation event. In addition, when the basic animation event cannot meet the configuration requirements of the animation object, at least one associated sub-animation track may also be created or added for the animation object, so that at least one corresponding extended animation sub-event is created or configured on each sub-animation track; these extended animation sub-events form the extended animation event.
The main animation track is a basic timeline set for the animation object for managing basic animation events, and is used for time management of the basic animation events of the animation object during animation playback. Correspondingly, the basic animation sub-events in a basic animation event are used to cause the animation object to perform basic animation functions such as binding an object, binding an actor and following. A sub-animation track is an extended timeline set for the animation object for managing extended animation events, and is used for time management of the extended animation events of the animation object during animation playback. There may be one or more sub-animation tracks, which is not limited herein. Correspondingly, the extended animation sub-events in an extended animation event are used to cause the animation object to perform extended animation functions such as actions, skeletal actions and facial animations. In practical applications, which animation sub-events are basic animation sub-events and which are extended animation sub-events is set in the animation editor in advance according to the types of the animation sub-events.
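The hierarchy described above can be pictured with a small data model. The sketch below is under assumed names (none of the classes come from the patent itself): each object owns one main animation track and any number of sub-animation tracks, and each track holds sub-events with a start and an end time.

```python
# Minimal sketch of the object -> track -> sub-event hierarchy (assumed names).
from dataclasses import dataclass, field
from typing import List


@dataclass
class AnimationSubEvent:
    event_type: str   # e.g. "bind_object" (basic) or "kick" (extended)
    start: float      # event start time on the track's timeline
    end: float        # event end time


@dataclass
class AnimationTrack:
    track_type: str                                            # "main", "action", ...
    sub_events: List[AnimationSubEvent] = field(default_factory=list)


@dataclass
class AnimationObject:
    name: str
    main_track: AnimationTrack = field(default_factory=lambda: AnimationTrack("main"))
    sub_tracks: List[AnimationTrack] = field(default_factory=list)


# actor 1 with a basic sub-event on the main track and an extended sub-event on an action track
actor = AnimationObject("actor 1")
actor.main_track.sub_events.append(AnimationSubEvent("bind_object", 0.0, 2.0))
action_track = AnimationTrack("action")
action_track.sub_events.append(AnimationSubEvent("kick", 1.0, 1.5))
actor.sub_tracks.append(action_track)
```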
In a specific implementation, considering that the extended animation sub-events also come in many types, and in order to avoid managing extended animation sub-events of all types on a single sub-animation track, which would reduce the editing efficiency of the animation, the sub-animation track types can be set in advance according to the types of extended animation sub-events corresponding to each animation object type, and the sub-animation tracks can then be created according to those sub-animation track types:
Receiving a sub-track creation instruction aiming at any one of the animation objects;
Responding to the sub-track creation instruction, and displaying a sub-track type list associated with the object attribute of any one animation object;
Creating a sub-animation track corresponding to the target sub-track type based on the target sub-track type selected in the sub-track type list.
The sub-track creation instruction is an instruction for creating a sub-animation track. In practical applications, since the sub-animation track exists depending on the animation object, it is used to manage the extended animation event of the animation object. Thus, sub-track creation instructions may be submitted for an animation object that configures an extended animation sub-event. The purpose of creating the sub-animation track is to create the extended animation sub-event thereon and manage the extended animation sub-event through the sub-animation track.
Further, since each type of animation object may correspond to multiple types of animation implementations, and these different implementations are typically related to the object attributes of the animation object (for example, its functions or parts), a plurality of sub-animation track types can be divided in advance according to the object attributes of each animation object and formed into a sub-track type list, so that extended animation events of the corresponding type are managed on sub-animation tracks of different types. When a sub-animation track is created for any one animation object, the sub-track type list associated with the object attributes of that animation object can be displayed, a sub-animation track corresponding to the target sub-track type selected by the user in the sub-track type list is created, and the created sub-animation track is taken as a sub-animation track associated with that animation object.
Following the above example, on the basis of the created animation object actor 1, a sub-track creation instruction submitted by user U in the animation editor for actor 1 is received, and in response to the sub-track creation instruction, the sub-animation track type list preset in the animation editor and associated with the functions and parts of actor 1 is displayed; the list includes an action type, a facial action type and the like. The action type selected by user U in the sub-animation track type list is received as the target sub-track type, a sub-animation track of the action type, namely an action track, is created, and the created action track is taken as a sub-animation track associated with the animation object actor 1.
In summary, a sub-animation track is created based on a target sub-track type selected in a sub-track type list by receiving a sub-track creation instruction for any one of a plurality of animation objects and exposing a sub-track type list associated with an object attribute of the any one of the animation objects in response to the sub-track creation instruction. The method realizes the type division of the sub-animation track, flexibly selects the required sub-animation track type from a plurality of sub-animation track types to create the sub-animation track, and increases the flexibility of creating the sub-animation track.
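As an illustrative sketch of the type-list lookup described above, the mapping below associates an object's type with the sub-track types offered to the user. The keys and type names are assumptions for the example; only the action and facial-action tracks are mentioned in the actor 1 example above.

```python
# Illustrative lookup: which sub-track types the list offers for a given object (assumed values).
SUB_TRACK_TYPES_BY_OBJECT_TYPE = {
    "actor": ["action", "facial_action"],   # per the actor 1 example above
    "camera": ["camera_path"],              # assumption for illustration
    "light": ["light_intensity"],           # assumption for illustration
}


def sub_track_type_list(object_type: str) -> list:
    """Return the sub-track types to display for an object of this type."""
    return SUB_TRACK_TYPES_BY_OBJECT_TYPE.get(object_type, [])


# The user then picks one entry, e.g. "action", and an action track is created for actor 1.
print(sub_track_type_list("actor"))   # ['action', 'facial_action']
```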
In practical application, since some animation objects may have associated sub-animation tracks, and some animation objects do not have associated sub-animation tracks created, in order to read an extended animation event created in advance on the sub-animation tracks, it is necessary to determine the animation object associated with the sub-animation track (i.e., a target animation object) first.
In the implementation, on the basis of creating the animation objects, since different basic functions can be implemented for each animation object, in order to avoid the mutual influence between different basic functions, different basic functions of the animation objects can be divided into different basic sub-event types in advance, and then the basic animation sub-events are created according to the basic sub-event types.
Receiving, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to the animation object;
responding to the basic sub-event creation instruction, and displaying a basic sub-event type list corresponding to the main animation track corresponding to the animation object;
And creating a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected in the basic sub-event type list.
The basic sub-event creation instruction refers to an instruction for adding a basic animation sub-event. In practical applications, any basic animation sub-event has a time interval for event execution, composed of an event start time and an event end time, and this time interval can be represented on the animation timeline. Since a basic animation sub-event is used to realize a basic function of the animation object, it is created on the basic timeline (the main animation track) corresponding to the animation object; accordingly, when creating a basic animation sub-event, the basic sub-event creation instruction needs to be submitted for the basic timeline (the main animation track).
Further, since the basic animation sub-events corresponding to each animation object come in many event types, and different types of basic animation sub-events may require different attributes to be set, the basic animation sub-events need to be divided into at least one basic sub-event type according to the animation object, and these basic sub-event types are formed into a basic sub-event type list, so as to avoid interaction between basic animation sub-events of different types. When a basic animation sub-event is created on the main animation track corresponding to any one animation object, the basic sub-event type list corresponding to that main animation track can be displayed, and a basic animation sub-event corresponding to the target basic sub-event type selected by the user in the basic sub-event type list is created. It should be noted that at least one basic animation sub-event may be created on the main animation track, and the set of these basic animation sub-events is referred to as a basic animation event.
Following the above example, on the basis of the created animation object actor 1, a basic sub-event creation instruction submitted by user U in the animation editor for the main animation track corresponding to actor 1 is received, and in response to the basic sub-event creation instruction, the basic sub-event type list preset in the animation editor for the main animation track of actor 1 is displayed; the list includes a binding-object type, a binding-actor type and a following type. The binding-object type selected by user U in the basic sub-event type list is received as the target basic sub-event type, and a basic animation sub-event of the binding-object type, binding object A, is created; concretely, object A is displayed in the scene during the time interval corresponding to this event.
In summary, by receiving, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to that animation object, and creating, in response to the basic sub-event creation instruction, the basic animation sub-event corresponding to the target basic sub-event type selected in the basic sub-event type list, the basic animation sub-events of an animation object are divided by type, the required basic sub-event type can be flexibly selected from among multiple basic sub-event types, and the flexibility of creating basic animation sub-events is increased.
Similarly, on the basis of creating the animation objects, since different expansion functions can be implemented for each animation object, in order to avoid the mutual influence between different expansion functions, different expansion functions of the animation objects can be divided into different expansion sub-event types in advance, and then the expansion animation sub-events are created according to the expansion sub-event types.
Aiming at any one target animation object in the target animation objects, receiving an expansion sub-event creation instruction aiming at a sub-animation track associated with the any one target animation object;
Responding to the extended sub-event creation instruction, and displaying a corresponding extended sub-event type list of a sub-animation track associated with any one target animation object;
Creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected in the extended sub-event type list.
The extended sub-event creation instruction refers to an instruction for adding an extended animation sub-event. In practical application, since any one of the extended animation sub-events has a time interval for event execution, the time interval is also composed of an event start time and an event end time. The time interval also needs to be represented on the animation time axis. And the extended animation sub-event is to implement an extended function of the animation object, so the extended animation sub-event is created on an extended time axis (i.e., sub-animation track) associated with the animation object. Also, therefore, when creating an extended animation sub-event, it is necessary to submit an extended sub-event creation instruction for a sub-animation track.
Further, since event types of the extended animation sub-event are diverse, different types of extended animation sub-events may require setting different attributes. To avoid interaction between different types of extended animation sub-events, it is necessary to divide the extended animation sub-events into at least one extended sub-event type according to the animation object and form the extended sub-event types into an extended sub-event type list. And when the extended animation sub-event is created on the sub-animation track associated with any one animation object, the extended sub-event type list corresponding to the sub-animation track associated with the animation object can be displayed. And creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected by the user in the extended sub-event type list. It should be noted that at least one extended animation sub-event may also be created for the sub-animation track, and the set of extended animation sub-events is referred to as an extended animation event.
Following the above example, on the basis of the created animation object actor 1, an extended sub-event creation instruction submitted by user U in the animation editor for the action track associated with actor 1 is received, and in response to the extended sub-event creation instruction, the extended sub-event type list preset in the animation editor for the action track of actor 1 is displayed; the list includes a leg action type, a hand action type and a head action type. The leg action type selected by user U in the extended sub-event type list is received as the target extended sub-event type, and an extended animation sub-event of the leg action type, namely a kicking sub-event, is created. The kicking sub-event controls actor 1 to perform a kicking action during the time interval corresponding to the event.
In a specific implementation, as shown in FIG. 3, an animation object, actor 1, has been created in the animation editor, and the animated character of actor 1 is shown in the animation display region of the animation editor. In addition, actor 1 has a corresponding main animation track and a sub-animation track; a basic animation sub-event, binding object A, has been created on the main animation track corresponding to actor 1, and an extended animation sub-event, the kicking sub-event, has been created on the sub-animation track associated with actor 1.
In summary, by aiming at any one of a plurality of target animation objects, an expansion sub-event creation instruction of a sub-animation track associated with the any one target animation object is received, and in response to the expansion sub-event creation instruction, a corresponding expansion animation sub-event is created by a target expansion sub-event type selected in the expansion sub-event type list. The method realizes the type division of the extension animation sub-event of the animation object, flexibly selects the required extension sub-event type from a plurality of extension sub-event types to create the extension animation sub-event, and increases the flexibility of creating the extension animation sub-event.
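The sketch below illustrates the point of this flow under assumed identifiers: the extended sub-event type list depends on the sub-track being targeted (here an action track offering leg, hand and head actions, as in the example), and the created sub-event is attached to that sub-track rather than to the main animation track.

```python
# Illustrative sketch: extended sub-event types are keyed by sub-track type (assumed names).
EXTENDED_SUB_EVENT_TYPES = {
    "action": ["leg_action", "hand_action", "head_action"],
    "facial_action": ["expression"],   # assumption for illustration
}


def create_extended_sub_event(sub_track: dict, selected_type: str, start: float, end: float) -> dict:
    """Create an extended sub-event of the selected type on the given sub-track."""
    allowed = EXTENDED_SUB_EVENT_TYPES.get(sub_track["type"], [])
    if selected_type not in allowed:
        raise ValueError(f"{selected_type} is not offered for a {sub_track['type']} track")
    event = {"type": selected_type, "start": start, "end": end}
    sub_track["events"].append(event)   # attached to the sub-track, not the main track
    return event


action_track = {"type": "action", "events": []}
create_extended_sub_event(action_track, "leg_action", start=1.0, end=1.5)   # the kicking sub-event
```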
In practice, it may be desirable to view the created animation sub-event (base animation sub-event or extended animation sub-event). When editing the animation sub-events in the animation editor, if the event operation sub-region of each animation sub-event is traversed linearly, so as to determine the animation sub-event matched with the user click, the time consumption is long. Thus, the event operation sub-area of each animation sub-event can be traversed in the form of a quadtree, and the embodiment of the application further comprises:
receiving an event inquiry instruction submitted by a user in an event operation area;
traversing the event quadtree corresponding to the event operation area, and comparing the position information in the event inquiry instruction with the event operation subarea corresponding to the node in the event quadtree;
wherein the event operation sub-region is the position, within the event operation area, of the operation sub-region corresponding to the animation sub-event in the animation event;
and determining and displaying a target animation sub-event in the animation sub-events according to the comparison result.
The event operation area refers to an interface area of the animation editor in which at least one animation sub-event can be operated on. The event query instruction refers to an instruction for querying any one of the animation sub-events (a basic animation sub-event or an extended animation sub-event). Each event operation sub-region corresponds to one created animation sub-event and is an operation sub-region within the event operation area; it can be understood as the event's bounding box.
In practical applications, in order to query an already created animation sub-event, the created animation sub-events are generally mapped into the editing interface of the animation editor as event operation sub-regions, so that the user submits an event query instruction for any one animation sub-event by operating within the event operation sub-region corresponding to that sub-event. Because the animation sub-event that the user wants to view is determined by the position of the user's operation, the position information carried in the user's event query instruction needs to be compared with the event operation sub-regions of the animation sub-events contained in the event operation area to determine the event operation sub-region matching the position information, and the animation sub-event corresponding to the matched event operation sub-region is taken as the animation sub-event corresponding to the event query instruction.
In the implementation, the event operation sub-area corresponding to the position information can be determined by traversing an event quadtree formed based on the event operation sub-area corresponding to the created animation sub-event, that is, by traversing the event quadtree corresponding to the event operation area, comparing the position information carried in the event query instruction (for example, the click position information carried in the click submitted event query instruction) with the event operation sub-area corresponding to each leaf node in the event quadtree. And taking the animation sub-event corresponding to the corresponding event operation sub-region as a target animation sub-event, and displaying the target animation sub-event (a target basic animation sub-event or a target extension animation sub-event). Specifically, the target animation sub-event can be displayed through an event panel for displaying the target animation sub-event.
Further, the user can edit the queried animation sub-event in the displayed event panel. After editing is completed, the animation editor updates the queried animation sub-events based on the event information in the event panel.
Along the above example, assume that two basic animation sub-events, namely, a binding object A and a following object B, and three extended animation sub-events, namely, a kicking sub-event, a nodding sub-event and a waving sub-event, have been created for actor 1. Specifically, as shown in fig. 4, in the event operation region in the animation editor, the binding object a corresponds to the event operation sub-region cr1, the following object B corresponds to the event operation sub-region cr2, the kicking event corresponds to the event operation sub-region cr3, the nod event corresponds to the event operation sub-region cr4, and the hand waving event corresponds to the event operation sub-region cr5. These animation sub-events are presented within the event manipulation area of the animation editor in the manner of event manipulation sub-areas. Receiving an event query instruction submitted by a user U in an event operation area, traversing an event quadtree created in advance for the event operation area and event operation sub-areas of 5 animation sub-events, comparing the position information d1 carried in the event query instruction with the 5 event operation sub-areas corresponding to the nodes of the event quadtree, determining that the position information d1 is matched with the event operation sub-area cr4, and displaying an event panel corresponding to a nod sub-event corresponding to the event operation sub-area cr 4.
In conclusion, querying the animation sub-event to be viewed by traversing a quadtree improves the efficiency of finding the animation sub-event corresponding to an event query instruction.
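A simplified sketch of such a quadtree lookup follows. It assumes each node covers a rectangular part of the event operation area, leaf nodes store the event operation sub-regions (bounding boxes) that fall inside them, and a click is resolved by descending to the node area containing the click and comparing only that node's sub-regions. The Rect, EventRegion and QuadNode classes are illustrative, not the patent's data structures.

```python
# Illustrative quadtree point query over event operation sub-regions (assumed structures).
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class EventRegion:
    event_name: str   # e.g. "nodding sub-event"
    rect: Rect        # the event operation sub-region (bounding box)


@dataclass
class QuadNode:
    bounds: Rect                                               # node area inside the operation area
    regions: List[EventRegion] = field(default_factory=list)   # populated on leaf nodes
    children: List["QuadNode"] = field(default_factory=list)   # empty, or exactly four child nodes


def query_event(node: QuadNode, px: float, py: float) -> Optional[EventRegion]:
    """Descend to the node area containing the click, then compare its sub-regions."""
    if not node.bounds.contains(px, py):
        return None
    for child in node.children:
        hit = query_event(child, px, py)
        if hit is not None:
            return hit
    for region in node.regions:
        if region.rect.contains(px, py):
            return region   # the target animation sub-event to display
    return None
```

With four leaf quadrants, a click position such as d1 is only compared against the sub-regions stored in the quadrant that contains it, instead of against every event in the operation area.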
In specific implementation, each created animation sub-event needs to be included in the event quad-tree, in order to avoid that the event quad-tree needs to be reconstructed after a new animation sub-event is created each time, a new animation sub-event can be created each time, and the created event quad-tree is updated based on the event operation sub-region of the newly created animation sub-event, and in the embodiment of the present application, the event quad-tree is updated by the following ways:
After any one animation sub-event in the animation event is created, determining a target node corresponding to an event operation sub-region of the any one animation sub-event in the event quadtree;
And updating the target node according to the event operation sub-region of any animation sub-event to obtain an updated event quadtree.
In practical applications, each leaf node in the event quadtree corresponds to a node area in the event operation area, and each leaf node records the event operation sub-regions contained in that node area. Therefore, it is necessary to determine the node area in which the event operation sub-region of the newly created animation sub-event (a basic animation sub-event or an extended animation sub-event) is located, and then determine the target node, in the event quadtree, corresponding to that node area. The event quadtree is updated by adding the position of the newly created animation sub-event's event operation sub-region to the target node, so as to obtain the updated event quadtree. For example, if the target node originally corresponds to 1 event operation sub-region, then after the target node is updated with the event operation sub-region of the newly created animation sub-event, it corresponds to 2 event operation sub-regions.
Following the above example, it is assumed that there are 4 leaf nodes in the event quadtree, namely leaf node A, leaf node B, leaf node C and leaf node D, where leaf node A corresponds to event operation sub-region cr1, leaf node B corresponds to cr3, leaf node C corresponds to cr2, and leaf node D corresponds to cr4. After the extended animation sub-event, the hand-waving sub-event, is created, its event operation sub-region cr5 is determined to lie within the node area corresponding to leaf node C, and leaf node C is taken as the target node. Leaf node C originally corresponds to event operation sub-region cr2 of following object B, and is updated by adding event operation sub-region cr5; the updated leaf node C corresponds to event operation sub-region cr2 of following object B and event operation sub-region cr5 of the hand-waving sub-event. Further, by updating leaf node C in the event quadtree, the updated event quadtree is obtained.
In sum, after the event quadtree is created for the first time, the event quadtree is only updated through the event operation sub-region corresponding to the newly created animation sub-event after the animation sub-event is created each time, so that the efficiency of viewing the event is indirectly improved.
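Continuing the query sketch above (and reusing its Rect, EventRegion and QuadNode classes), an incremental update could look like the following: only the leaf whose node area contains the new event operation sub-region is touched, and the rest of the tree is left as is. This is an illustrative sketch, not the patent's implementation.

```python
# Illustrative incremental update, reusing Rect/EventRegion/QuadNode from the query sketch.
def insert_region(node: "QuadNode", region: "EventRegion") -> bool:
    """Add a newly created sub-event's operation sub-region to the matching leaf node."""
    cx = region.rect.x + region.rect.w / 2    # locate the leaf by the region's centre point
    cy = region.rect.y + region.rect.h / 2
    if not node.bounds.contains(cx, cy):
        return False
    if node.children:                         # internal node: descend to the covering quadrant
        return any(insert_region(child, region) for child in node.children)
    node.regions.append(region)               # leaf node (the target node): record the new sub-region
    return True
```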
Step 206, creating a target animation file based on the animation object, the animation track and the animation event.
Specifically, on the basis of the above-described read animation event, in order to include all animation elements in the created target animation file, a playable animation (i.e., target animation file) is created based on animation elements such as animation objects, animation tracks, and animation events.
When the method is implemented, a blank initial animation file can be created first, then animation data corresponding to an animation object, an animation track and an animation event are read from a memory, and the animation data are added to the initial animation file according to a preset format to generate a target animation file. Furthermore, the playing interface can be called to play the target animation file so as to play the created target animation.
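As a rough sketch of that assembly step, the snippet below writes the read animation data (objects together with their main and sub-animation tracks and events) into a single JSON file. The layout and field names are assumptions for illustration; the patent does not specify the on-disk format.

```python
# Illustrative sketch: serialising the animation hierarchy into one target animation file.
import json


def create_target_animation_file(path: str, animation_objects: list) -> None:
    """Write the objects, their tracks and their events into a single playable file."""
    data = {"objects": animation_objects}   # objects already carry their tracks and events
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)


create_target_animation_file("target_animation.json", [
    {
        "name": "actor 1",
        "type": "actor",
        "main_track": [{"event": "bind_object", "start": 0.0, "end": 2.0}],
        "sub_tracks": [
            {"type": "action", "events": [{"event": "kick", "start": 1.0, "end": 1.5}]},
        ],
    },
])
```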
In order to make the created target animation file include all the animation elements created during the animation editing process, on the basis of dividing the animation tracks into main animation tracks and sub-animation tracks and dividing the animation events into basic animation events and extended animation events, the embodiment of the application creates the target animation file based on the animation object, the main animation track, the basic animation event, the sub-animation track and the extended animation event. The created target animation file has a clear hierarchy and strong interpretability.
In a specific implementation, in the process of editing and creating the animation through the animation editor, animation resources of preset types in the animation editor may be referenced, such as material resources, texture map resources and animation editing resources that are imported or configured in the animation editor in advance. If only the reference relationships to these resources were saved when creating the animation file, migrating the animation file would be inconvenient; therefore, the resource information of the referenced original resources can be saved directly in the animation file, which is realized in the embodiment of the application as follows:
taking the animation object, the animation track and the animation event as animation elements;
Determining a target animation resource of the preset type associated with at least one target animation element in the animation elements, and reading resource information of the target animation resource;
Updating the at least one target animation element through the resource information according to the association relationship between the target animation resource and the at least one target animation element;
updating the animation object, the animation track and the animation event based on the updated at least one target animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
The preset type refers to a preconfigured resource type of animation resource. In practical applications, a resource type that is independent of the animation editor can be set as a preset type; in this case, the resource information of such resources is easy to read within the animation editor, so the resource information of the animation resources can be read directly.
In practice, the created animation objects, animation tracks and animation events are taken as animation elements, and each of the animation elements possibly refers to animation resources, so that the animation elements associated with the animation resources of the preset type are selected as at least one target animation element in the animation objects, the animation tracks and the animation events. So as to read animation resources of a preset type (i.e., target animation resources) associated with (referenced by or dependent on) the target animation elements and to read resource information of the target animation resources.
And updating the target animation elements with the association relationship by the read resource information according to the association relationship (the reference relationship or the dependency relationship) between the target animation resources and at least one target animation element. And updating the animation object, the animation track and the animation element corresponding to the animation object, the animation track and the animation event based on the updated at least one target animation element to obtain updated animation elements so as to create a target animation file based on the updated animation elements.
Following the above example, an animation creation instruction submitted by user U is received, and in response to the animation creation instruction, all the animation objects created in the animation editor are determined, including actor 1, director 1 and camera 1, and the basic animation events created in advance for the main animation tracks corresponding to the three animation objects are read, the basic animation events including binding object A (a basic animation sub-event). Among the three animation objects, the animation object associated with a sub-animation track, namely actor 1, is selected and taken as the target animation object, and the extended animation event created in advance for the action track (a sub-animation track) associated with the target animation object actor 1 is read, the extended animation event including the kicking sub-event (an extended animation sub-event).
And taking three animation objects, an active drawing track corresponding to the three animation objects, an action track associated with the actor 1, a binding object A and a kicking event as animation elements, and determining that a target animation element associated with animation resources of a preset type in the animation elements is the binding object A under the condition that animation resources of a preset type are preset in the animation elements, wherein the preset type is a map type and a material type. And determining the (associated) target animation resource of the preset type configured in the binding object A as a file A, and reading file information contained in the file A. And updating the binding object A based on the read file information to obtain the updated binding object A.
In summary, the target animation file is convenient to migrate by reading the resource information of the target animation resource associated with the target animation element and updating the target animation element based on the resource information so that the created target animation file directly contains the resource information of the animation resource.
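As an illustration only, the following is a minimal Python sketch of this resource-embedding step, assuming a simple dictionary representation of animation elements; the preset-type set, the element fields and the read_resource_info helper are hypothetical names for illustration, not the editor's actual API.

    # Hypothetical sketch: embed the resource information of preset-type resources
    # (e.g. material, map and animation clip resources) into the animation elements
    # so that the exported target animation file can be migrated without the editor.
    PRESET_TYPES = {"material", "map", "animation_clip"}   # assumed preset types

    def embed_preset_resources(animation_elements, read_resource_info):
        # animation_elements: list of dicts that may reference resources via "resources";
        # read_resource_info: callable loading the raw resource information from a path.
        for element in animation_elements:
            for ref in element.get("resources", []):
                if ref.get("type") in PRESET_TYPES:
                    # Replace the bare reference with the resource information itself,
                    # keeping the association between the element and the resource.
                    ref["info"] = read_resource_info(ref["path"])
        return animation_elements

    # Usage analogue of the example above: binding object A references a map resource
    # file A; after updating, the element carries the file information directly.
    elements = [{"name": "binding_object_A",
                 "resources": [{"type": "map", "path": "file_A"}]}]
    elements = embed_preset_resources(elements, lambda p: {"source": p})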
In addition, the animation elements may also include animation elements that reference prefab resources. A prefab resource is created in advance through the animation editor, and the information it stores has to be parsed by the animation editor; to avoid parsing the prefab resource, the resource path of the prefab resource can be saved instead. The embodiment of the application is realized in the following manner:
taking the animation object, the animation track and the animation event as animation elements;
determining a target prefab resource associated with at least one prefab animation element in the animation elements, and determining a resource path of the target prefab resource;
updating the at least one prefab animation element through the resource path according to the association relationship between the target prefab resource and the at least one prefab animation element;
updating the animation object, the animation track and the animation event based on the updated at least one prefab animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
Because prefab resources depend heavily on the animation editor, reading their resource information is difficult. In particular, when the animation editor is a plug-in editor and the original editor corresponding to the plug-in editor has been created in advance, the resource information inside a prefab resource is hard to parse, so the resource path of the prefab resource can be determined directly instead.
In practice, the created animation object, active picture track, basic animation event, sub-animation track and extended animation event are taken as animation elements. Each of these animation elements may reference a prefab resource, so the animation elements associated with prefab resources are selected from the animation object, the active picture track, the basic animation event, the sub-animation track and the extended animation event as the at least one prefab animation element, in order to determine the resource path of the target prefab resource that these prefab animation elements reference or depend on.
The prefab animation elements having the association relationship are then updated with the determined resource path according to the association relationship (reference or dependency relationship) between the target prefab resource and the at least one prefab animation element. The animation object, the animation track and the animation event to which the updated prefab animation elements correspond are then updated based on the updated at least one prefab animation element, so as to obtain updated animation elements and create the target animation file based on them.
Following the above example, the three animation objects, the active picture tracks corresponding to the three animation objects, the action track associated with actor 1, binding object A and the kick sub-event are taken as animation elements, and the prefab animation element associated with a prefab resource among these animation elements is determined to be actor 1. The target prefab resource configured in (associated with) actor 1 is determined to be file B, and the resource path p1 of the target prefab resource file B is determined. Actor 1 is then updated based on the resource path p1 to obtain the updated actor 1.
In summary, by determining the resource path of the target prefab resource associated with the prefab animation element and updating the prefab animation element based on that resource path, the created target animation file directly contains the resource path of the prefab resource, which keeps the generation of the target animation file efficient.
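A companion sketch for the prefab case, under the same assumed dictionary representation: because a prefab resource has to be parsed by the animation editor itself, only its resource path is recorded in the element. The resolve_path helper is hypothetical.

    # Hypothetical sketch: for prefab resources that only the animation editor can parse,
    # store the resource path instead of the resource content.
    def attach_prefab_paths(animation_elements, resolve_path):
        # resolve_path: callable mapping a prefab reference to its resource path.
        for element in animation_elements:
            for ref in element.get("resources", []):
                if ref.get("type") == "prefab":
                    # Keep only the path; the editor resolves the prefab when the
                    # target animation file is loaded again.
                    ref["path"] = resolve_path(ref)
        return animation_elements

    # Usage analogue of the example above: actor 1 references prefab file B and
    # only its resource path p1 is recorded.
    elements = [{"name": "actor_1",
                 "resources": [{"type": "prefab", "name": "file_B"}]}]
    elements = attach_prefab_paths(elements, lambda ref: "p1/" + ref["name"])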
In summary, the animation generation method provided by the embodiment of the application determines the created animation object based on the received animation creation instruction, reads the animation event created in advance for the animation track corresponding to the animation object, and creates the target animation file based on the animation object, the animation track and the animation event according to their hierarchical structure. Creating the target animation file according to the hierarchical structure of the animation track and the animation event corresponding to the animation object makes the animation hierarchy more intuitive and easier to understand, improves the efficiency with which the user creates animations, and improves the user's creation experience.
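As an illustration of this hierarchy, the following is a minimal Python sketch of how the object, track and event levels of a target animation file could be laid out and serialized; the class and field names are assumptions for illustration and are not the data model used by the application.

    # Hypothetical layout of the target animation file: each animation object owns an
    # active picture track carrying basic animation events and optional sub-animation
    # tracks carrying extended animation events.
    import json
    from dataclasses import dataclass, field, asdict
    from typing import List

    @dataclass
    class AnimationEvent:
        event_type: str                        # e.g. "lens_parameter" or "turning"
        sub_events: List[dict] = field(default_factory=list)

    @dataclass
    class AnimationTrack:
        track_type: str                        # "active_picture" or a sub-track type such as "moving"
        events: List[AnimationEvent] = field(default_factory=list)

    @dataclass
    class AnimationObject:
        name: str
        object_type: str                       # director / camera / actor / special effect / lamplight
        active_picture_track: AnimationTrack = field(default_factory=lambda: AnimationTrack("active_picture"))
        sub_tracks: List[AnimationTrack] = field(default_factory=list)

    def build_target_animation_file(objects: List[AnimationObject]) -> str:
        # Serialize the object -> track -> event hierarchy; each object keeps its active
        # picture track (basic animation events) and its sub-animation tracks (extended
        # animation events), so the file mirrors the hierarchy described above.
        return json.dumps([asdict(o) for o in objects], ensure_ascii=False, indent=2)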
Fig. 5 is a process flow chart of an animation generation method according to an embodiment of the application, described here as applied to an animation editor, and specifically includes the following steps:
Step 502, receiving an object creation instruction submitted by a user.
Specifically, the object creation instruction refers to an object creation instruction submitted by a user through operation on an object creation control in an animation editor.
Step 504, responding to the object creation instruction, and displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a lamplight type.
Specifically, the director type, the camera type, the actor type, the special effect type and the lamplight type are all animation types in the animation object type list.
Step 506, creating a camera 01 corresponding to the camera type based on the camera type selected by the user in the animation object type list.
Specifically, the camera 01 is an animation object corresponding to the created camera type. In practical application, the created camera 01 directly corresponds to an active picture track.
Step 508, receiving a user sub-track creation instruction for camera 01.
Specifically, the sub-track creation instruction refers to a sub-track creation instruction submitted by a user through an operation on a sub-track creation control for the camera 01 in the animation editor.
Step 510, in response to the sub-track creation instruction, presents a list of sub-track types associated with the function of camera 01.
Specifically, the sub-track type list may include sub-track types such as a moving track type, a display track type, and the like.
Step 512, creating the moving track 1 corresponding to the moving track type based on the moving track type selected by the user in the sub-track type list.
Step 514, receiving a basic sub-event creation instruction of the user for the active picture track corresponding to the camera 01.
In practical application, when the camera 01 is created, the corresponding active picture track can be created automatically.
Step 516, in response to the basic sub-event creation instruction, displaying a basic sub-event type list corresponding to the active picture track corresponding to the camera 01.
Specifically, the basic sub-event type list includes basic sub-event types such as a lens parameter type, a depth of field type, and the like.
Step 518, creating a lens parameter sub-event corresponding to the lens parameter type based on the lens parameter type selected by the user in the basic sub-event type list.
Step 520, receiving an extended sub-event creation instruction of the user for the moving track 1 associated with the camera 01.
Step 522, in response to the extended sub-event creation instruction, displaying an extended sub-event type list corresponding to the moving track 1 associated with the camera 01.
Specifically, the extended sub-event type list can include extended sub-event types such as a displacement type, a straight-line type, a turning type, a variable-speed displacement type and the like.
Step 524, creating a turning sub-event corresponding to the turning type based on the turning type selected by the user in the extended sub-event type list.
Specifically, the turning sub-event is an extended animation sub-event.
Step 526, determining three animation objects of the camera 01, the director 01 and the actor 01 based on the received animation creation instruction.
In particular, in addition to the camera 01 created in the animation editor described above, two animation objects, director 01 and actor 01, are created in the animation editor before receiving an animation creation instruction.
Step 528, reading the lens parameter sub-event created in advance for the active picture track corresponding to each animation object, and reading the turning sub-event created in advance for the moving track 1 associated with the camera 01 among the three animation objects.
Step 530, taking the three animation objects camera 01, director 01 and actor 01, the active picture tracks corresponding to the three animation objects, the moving track 1, the lens parameter sub-event and the turning sub-event as animation elements, determining the animation resource of the preset type associated with the turning sub-event, and reading the resource information of the target animation resource associated with the turning sub-event.
Specifically, the preset types include, but are not limited to, resource types such as a material type, a map type and an animation clip type. The target animation resource is an animation resource of a preset type associated with the turning sub-event.
Step 532, updating the turning sub-event with the resource information according to the association relationship between the target animation resource and the turning sub-event.
Step 534, creating a target animation file based on the three animation objects camera 01, director 01 and actor 01, the active picture tracks corresponding to the three animation objects, the moving track 1, the lens parameter sub-event and the updated turning sub-event.
In summary, in the animation generation method provided by the embodiment of the application, on the basis of creating the animation object, a basic animation sub-event is created for the active picture track corresponding to the animation object, and an extended animation sub-event is created for the sub-animation track associated with the animation object. Based on the received animation creation instruction, the created animation object, active picture track, basic animation event, sub-animation track and extended animation event are read according to the hierarchical structure of the animation object, the active picture track and the basic animation event, and of the sub-animation track and the extended animation event, and the target animation file is created from them. Dividing the animation hierarchy in this way makes the animation creation process more intuitive and more interpretable, and also improves the creation efficiency of the animation.
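Reading the flow of Fig. 5 as code, a minimal Python sketch could look like the following; the dictionary layout and helper names are hypothetical stand-ins for the interactive steps 502 to 534, not the editor's implementation.

    # Hypothetical sketch mirroring steps 502-534: create camera 01 with its active
    # picture track, add moving track 1, a lens parameter sub-event and a turning
    # sub-event, then assemble the target animation file.
    import json

    def create_object(name, object_type):
        # Creating an object also creates its active picture track (steps 502-506 analogue).
        return {"name": name, "type": object_type,
                "active_picture_track": {"events": []}, "sub_tracks": []}

    camera_01 = create_object("camera_01", "camera")
    director_01 = create_object("director_01", "director")
    actor_01 = create_object("actor_01", "actor")

    # Steps 508-512 analogue: a moving sub-track associated with camera 01.
    moving_track_1 = {"type": "moving", "events": []}
    camera_01["sub_tracks"].append(moving_track_1)

    # Steps 514-518 analogue: a basic sub-event on the active picture track.
    camera_01["active_picture_track"]["events"].append({"type": "lens_parameter"})

    # Steps 520-524 analogue: an extended sub-event on moving track 1 that references
    # a preset-type animation resource.
    moving_track_1["events"].append({"type": "turning",
                                     "resources": [{"type": "map", "path": "file_A"}]})

    # Steps 530-532 analogue: update the turning sub-event with the resource information
    # of its preset-type animation resource so the file is self-contained.
    moving_track_1["events"][-1]["resources"][0]["info"] = {"source": "file_A"}

    # Steps 526, 528 and 534 analogue: collect all objects and write the target animation file.
    with open("target_animation.json", "w", encoding="utf-8") as f:
        json.dump([camera_01, director_01, actor_01], f, ensure_ascii=False, indent=2)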
Corresponding to the method embodiment, the application also provides an embodiment of the animation generation device, and fig. 6 shows a schematic structural diagram of the animation generation device according to one embodiment of the application. As shown in fig. 6, the apparatus 600 includes:
a determining module 602 configured to determine an animation object based on the received animation creation instruction;
A reading module 604 configured to read an animation event created in advance for an animation track corresponding to the animation object;
a creation module 606 configured to create a target animation file based on the animation object, the animation track, and the animation event.
Optionally, the reading module 604 is further configured to:
Reading a basic animation event which is pre-established for an active picture track corresponding to the animation object, and reading an extended animation event which is pre-established for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object associated with the sub-animation track in the animation object;
accordingly, the creation module 606 is further configured to:
a target animation file is created based on the animation object, the active picture track, the basic animation event, the sub-animation track, and the extended animation event.
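A minimal Python sketch of how the determining, reading and creation modules of Fig. 6 could be composed is given below; the class layout and the dictionary-based editor state are assumptions for illustration, not the device's actual implementation.

    # Hypothetical composition of the animation generation device: a determining module,
    # a reading module and a creation module wired together over the editor state.
    class AnimationGenerationDevice:
        def __init__(self, editor_state, file_writer):
            self.editor_state = editor_state   # created objects, tracks and events
            self.file_writer = file_writer     # callable writing the target animation file

        def determine_objects(self, creation_instruction):
            # Determining module: all animation objects created in the editor.
            return list(self.editor_state["objects"])

        def read_events(self, animation_objects):
            # Reading module: basic animation events on the active picture tracks plus
            # extended animation events on the sub-animation tracks.
            base = [e for o in animation_objects
                    for e in o["active_picture_track"]["events"]]
            extended = [e for o in animation_objects
                        for t in o["sub_tracks"] for e in t["events"]]
            return base, extended

        def create_file(self, animation_objects, base_events, extended_events):
            # Creation module: assemble the hierarchy and hand it to the writer.
            self.file_writer({"objects": animation_objects,
                              "base_events": base_events,
                              "extended_events": extended_events})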
Optionally, in a case where a preset type of animation resource is preconfigured, the animation generating device includes:
the information reading module is configured to take the animation object, the animation track and the animation event as animation elements, determine the target animation resources of the preset type associated with at least one target animation element in the animation elements, and read the resource information of the target animation resources;
a first updating module configured to update the at least one target animation element through the resource information according to an association relationship between the target animation resource and the at least one target animation element;
And the second updating module is configured to update the animation object, the animation track and the animation event based on the updated at least one target animation element, and take the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
Optionally, the at least one target animation element is determined by running the following module:
A selection module configured to select an animation element associated with the animation resource of the preset type as at least one target animation element in the animation object, the animation track, and the animation event.
Optionally, in the case where a prefab resource is preconfigured, the animation generating device includes:
a path determining module configured to take the animation object, the animation track and the animation event as animation elements, determine a target prefab resource associated with at least one prefab animation element in the animation elements, and determine a resource path of the target prefab resource;
a third updating module configured to update the at least one prefab animation element through the resource path according to the association relationship between the target prefab resource and the at least one prefab animation element;
and a fourth updating module configured to update the animation object, the animation track and the animation event based on the updated at least one prefab animation element, and take the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
Optionally, any one of the animation sub-events is queried by running the following modules:
the first receiving module is configured to receive an event query instruction submitted by a user in an event operation area;
The comparison module is configured to traverse the event quadtree corresponding to the event operation area and compare the position information in the event query instruction with the event operation sub-regions corresponding to the nodes in the event quadtree, wherein an event operation sub-region is the position of the operation sub-region corresponding to an animation sub-event in the animation event;
and the determining sub-event module is configured to determine and display a target animation sub-event in the animation sub-events according to the comparison result.
Optionally, the event quadtree is updated by running the following modules:
A determining node module configured to determine a target node corresponding to an event operation sub-region of any one of the animation sub-events in the event quadtree after any one of the animation sub-events is created;
And a fifth updating module, configured to update the target node according to the event operation sub-area of any one animation sub-event, so as to obtain an updated event quadtree.
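The event quadtree described by these modules could be sketched as follows; the node layout, the fixed subdivision depth and the rectangle representation of event operation sub-regions are assumptions for illustration, not the structure defined by the application.

    # Hypothetical sketch of the event quadtree used to hit-test an event query position
    # against the operation sub-regions of animation sub-events in the event operation area.
    class QuadNode:
        def __init__(self, x, y, w, h):
            self.rect = (x, y, w, h)     # region of the event operation area covered by this node
            self.children = []           # four child nodes once subdivided
            self.sub_events = []         # (event_rect, animation_sub_event) pairs stored here

        @staticmethod
        def contains(rect, px, py):
            x, y, w, h = rect
            return x <= px < x + w and y <= py < y + h

        def insert(self, event_rect, sub_event, depth=0, max_depth=4):
            # Updating the quadtree after a sub-event is created: descend to the deepest
            # child whose region fully contains the sub-event's operation sub-region.
            if depth < max_depth:
                if not self.children:
                    x, y, w, h = self.rect
                    hw, hh = w / 2, h / 2
                    self.children = [QuadNode(x, y, hw, hh), QuadNode(x + hw, y, hw, hh),
                                     QuadNode(x, y + hh, hw, hh), QuadNode(x + hw, y + hh, hw, hh)]
                ex, ey, ew, eh = event_rect
                for child in self.children:
                    cx, cy, cw, ch = child.rect
                    if cx <= ex and cy <= ey and ex + ew <= cx + cw and ey + eh <= cy + ch:
                        child.insert(event_rect, sub_event, depth + 1, max_depth)
                        return
            self.sub_events.append((event_rect, sub_event))

        def query(self, px, py):
            # Traversal: compare the query position with the sub-regions stored on this
            # node, then recurse into the child whose region contains the position.
            hits = [e for rect, e in self.sub_events if self.contains(rect, px, py)]
            for child in self.children:
                if self.contains(child.rect, px, py):
                    hits.extend(child.query(px, py))
            return hits

    # Usage: insert a turning sub-event whose operation sub-region is a small rectangle,
    # then query a position inside it.
    root = QuadNode(0, 0, 1024, 1024)
    root.insert((100, 40, 60, 20), "turning_sub_event")
    print(root.query(120, 50))   # -> ['turning_sub_event']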
Optionally, any one of the animation objects is created by running the following modules:
A first display module configured to display an animation object type list in response to a received object creation instruction, wherein the animation object type list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type and a lamplight type;
And the object creating module is configured to create an animation object corresponding to the target animation object type based on the target animation object type selected in the animation object type list.
Optionally, any one of the sub-animation tracks is created by:
A second receiving module configured to receive a sub-track creation instruction for any one of the animation objects;
A second presentation module configured to present a list of sub-track types associated with object attributes of the arbitrary one of the animation objects in response to the sub-track creation instruction;
A creation track module configured to create a sub-animation track corresponding to a target sub-track type based on the target sub-track type selected in the sub-track type list.
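As a small illustration of dividing sub-track types according to object attributes, a hypothetical mapping is sketched below; apart from the moving, display and action track types mentioned above, any further entries would depend on the editor's configuration.

    # Hypothetical mapping from animation object type to the sub-track types shown in
    # the sub-track type list; the concrete division is up to the animation editor.
    SUB_TRACK_TYPES = {
        "camera": ["moving", "display"],            # sub-track types mentioned in the example
        "actor": ["action", "moving", "display"],   # the action track of actor 1 above
    }

    def sub_track_type_list(object_type):
        # Analogue of the presenting module: the list depends on the object's attributes.
        return SUB_TRACK_TYPES.get(object_type, [])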
Optionally, the target animation object is determined by running the following modules:
and the object selection module is configured to select an animation object of the associated sub-animation track from the animation objects to determine as a target animation object.
Optionally, any one of the base animation sub-events is created by running the following modules:
The third receiving module is configured to receive a basic sub-event creation instruction for an active picture track corresponding to any one of the animation objects;
The third display module is configured to respond to the basic sub-event creation instruction and display a basic sub-event type list corresponding to the active picture track corresponding to the any one animation object;
The first creating sub-event module is configured to create a basic animation sub-event corresponding to a target basic sub-event type based on the target basic sub-event type selected in the basic sub-event type list.
Optionally, any one of the extended animation events is created by running the following modules:
A fourth receiving module configured to receive, for any one of the target animation objects, an extended sub-event creation instruction for a sub-animation track associated with the any one of the target animation objects;
A fourth display module configured to display a corresponding extended sub-event type list of the sub-animation track associated with the arbitrary one target animation object in response to the extended sub-event creation instruction;
And the second creation sub-event module is configured to create an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected in the extended sub-event type list.
In summary, the animation generating device provided by the embodiment of the application determines the created animation object based on the received animation creation instruction, reads the animation event created in advance for the animation track corresponding to the animation object, and creates the target animation file based on the animation object, the animation track and the animation event according to their hierarchical structure. Dividing the animation hierarchy in this way makes the animation creation process more intuitive and more interpretable, and improves the creation efficiency of the animation.
The above is a schematic scheme of an animation generation device of the present embodiment. It should be noted that, the technical solution of the animation generating device and the technical solution of the animation generating method belong to the same conception, and the details of the technical solution of the animation generating device, which are not described in detail, can be referred to the description of the technical solution of the animation generating method.
In one embodiment, the application also provides a computing device, including a memory, a processor, and computer instructions stored on the memory and executable on the processor, where the processor implements the steps of the animation generation method when executing the computer instructions.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that, the technical solution of the computing device and the technical solution of the animation generation method belong to the same concept, and details of the technical solution of the computing device, which are not described in detail, can be referred to the description of the technical solution of the animation generation method.
An embodiment of the application also provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the animation generation method as described above.
The above is an exemplary version of a computer-readable storage medium of the present embodiment. It should be noted that, the technical solution of the storage medium and the technical solution of the animation generation method belong to the same concept, and details of the technical solution of the storage medium, which are not described in detail, can be referred to the description of the technical solution of the animation generation method.
The foregoing describes certain embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content carried by the computer readable medium may be appropriately added or deleted according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that, for the sake of simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily all required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the application disclosed above are intended only to assist in the explanation of the application. Alternative embodiments are not intended to be exhaustive or to limit the application to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and the practical application, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and the full scope and equivalents thereof.

Claims (14)

1. An animation generation method, comprising:
determining an animation object based on the received animation creation instruction;
Reading an animation event pre-created for an animation track corresponding to the animation object, including:
Reading a basic animation event which is pre-established for an active picture track corresponding to the animation object, and reading an extended animation event which is pre-established for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object associated with the sub-animation track in the animation object, the animation track is a time axis set for the animation object, the animation track comprises the active picture track and the sub-animation track, the sub-animation track is established according to a selected sub-animation track type, and the sub-animation track type is divided according to object attributes of the animation object;
creating a target animation file based on the animation object, the animation track, and the animation event includes creating a target animation file based on the animation object, the active picture track, the base animation event, the sub-animation track, and the extended animation event.
2. The animation generation method according to claim 1, wherein, in a case where a preset type of animation resource is preconfigured, before the creating of the target animation file based on the animation object, the animation track, and the animation event, further comprising:
taking the animation object, the animation track and the animation event as animation elements;
Determining a target animation resource of the preset type associated with at least one target animation element in the animation elements, and reading resource information of the target animation resource;
Updating the at least one target animation element through the resource information according to the association relationship between the target animation resource and the at least one target animation element;
updating the animation object, the animation track and the animation event based on the updated at least one target animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
3. The animation generation method of claim 2, wherein the at least one target animation element is determined by:
and selecting an animation element associated with the animation resource of the preset type as at least one target animation element in the animation object, the animation track and the animation event.
4. The animation generation method according to claim 1, wherein, in a case where a prefab resource is preconfigured, before the creating of the target animation file based on the animation object, the animation track, and the animation event, further comprising:
taking the animation object, the animation track and the animation event as animation elements;
determining a target prefab resource associated with at least one prefab animation element in the animation elements, and determining a resource path of the target prefab resource;
updating the at least one prefab animation element through the resource path according to the association relationship between the target prefab resource and the at least one prefab animation element;
updating the animation object, the animation track and the animation event based on the updated at least one prefab animation element;
and taking the updated animation object, the updated animation track and the updated animation event as the animation object, the animation track and the animation event.
5. The animation generation method of claim 1, wherein any one of the animation events is queried by:
receiving an event inquiry instruction submitted by a user in an event operation area;
traversing the event quadtree corresponding to the event operation area, and comparing the position information in the event inquiry instruction with the event operation subarea corresponding to the node in the event quadtree;
The event operation sub-region is the position of the operation sub-region corresponding to the animation sub-event in the animation event;
and determining and displaying a target animation sub-event in the animation sub-events according to the comparison result.
6. The animation generation method of claim 5, wherein the event quadtree is updated by:
After any one animation sub-event in the animation event is created, determining a target node corresponding to an event operation sub-region of the any one animation sub-event in the event quadtree;
And updating the target node according to the event operation sub-region of any animation sub-event to obtain an updated event quadtree.
7. The animation generation method of claim 1, wherein any one of the animation objects is created by:
in response to a received object creation instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a lamplight type;
And creating an animation object corresponding to the target animation object type based on the target animation object type selected in the animation object type list.
8. The animation generation method of claim 1, wherein any one of the sub-animation tracks is created by:
receiving a sub-track creation instruction aiming at any one of the animation objects;
Responding to the sub-track creation instruction, and displaying a sub-track type list associated with the object attribute of any one animation object;
Creating a sub-animation track corresponding to the target sub-track type based on the target sub-track type selected in the sub-track type list.
9. The animation generation method of claim 1, wherein the target animation object is determined by:
and selecting the animation object of the associated sub-animation track from the animation objects to determine as a target animation object.
10. The animation generation method of claim 1, wherein any one of the base animation sub-events is created by:
receiving, for any one of the animation objects, a basic sub-event creation instruction for an active picture track corresponding to the any one animation object;
in response to the basic sub-event creation instruction, displaying a basic sub-event type list corresponding to the active picture track corresponding to the any one animation object;
And creating a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected in the basic sub-event type list.
11. The animation generation method of claim 1, wherein any one of the extended animation events is created by:
aiming at any one target animation object in the target animation objects, receiving an expansion sub-event creation instruction aiming at a sub-animation track associated with the any one target animation object;
Responding to the extended sub-event creation instruction, and displaying a corresponding extended sub-event type list of a sub-animation track associated with any one target animation object;
Creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected in the extended sub-event type list.
12. An animation generation device, comprising:
a determining module configured to determine an animation object based on the received animation creation instruction;
A reading module configured to read an animation event created in advance for an animation track corresponding to the animation object, including:
Reading a basic animation event which is pre-established for an active picture track corresponding to the animation object, and reading an extended animation event which is pre-established for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object associated with the sub-animation track in the animation object, the animation track is a time axis set for the animation object, the animation track comprises the active picture track and the sub-animation track, the sub-animation track is established according to a selected sub-animation track type, and the sub-animation track type is divided according to object attributes of the animation object;
A creation module configured to create a target animation file based on the animation object, the animation track, and the animation event includes creating a target animation file based on the animation object, the active picture track, the base animation event, the sub-animation track, and the extended animation event.
13. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method of any one of claims 1-11.
14. A computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1-11.
CN202111113011.7A 2021-09-18 2021-09-18 Animation generation method and device Active CN113808237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111113011.7A CN113808237B (en) 2021-09-18 2021-09-18 Animation generation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111113011.7A CN113808237B (en) 2021-09-18 2021-09-18 Animation generation method and device

Publications (2)

Publication Number Publication Date
CN113808237A CN113808237A (en) 2021-12-17
CN113808237B true CN113808237B (en) 2024-12-27

Family

ID=78896333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111113011.7A Active CN113808237B (en) 2021-09-18 2021-09-18 Animation generation method and device

Country Status (1)

Country Link
CN (1) CN113808237B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112667942A (en) * 2019-10-16 2021-04-16 腾讯科技(深圳)有限公司 Animation generation method, device and medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7173623B2 (en) * 2003-05-09 2007-02-06 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
US20100134499A1 (en) * 2008-12-03 2010-06-03 Nokia Corporation Stroke-based animation creation
CN105469438A (en) * 2015-11-11 2016-04-06 广州大学 Animation button device and method thereof for controlling image conversion into animation
CN106815882B (en) * 2015-12-01 2020-12-29 厦门雅基软件有限公司 Method capable of infinitely extending animation attributes
CN106887029A (en) * 2016-06-14 2017-06-23 阿里巴巴集团控股有限公司 Animation control methodses, device and terminal
CN109242935A (en) * 2018-08-21 2019-01-18 北京奔流网络信息技术有限公司 A kind of Masking animation implementation method and device based on android system
CN112927331B (en) * 2021-03-31 2023-09-22 腾讯科技(深圳)有限公司 Character model animation generation method and device, storage medium and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112667942A (en) * 2019-10-16 2021-04-16 腾讯科技(深圳)有限公司 Animation generation method, device and medium

Also Published As

Publication number Publication date
CN113808237A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
TWI808393B (en) Page processing method, device, apparatus and storage medium
US20250341935A1 (en) System and method for interface display screen manipulation
WO2025168004A1 (en) Method and apparatus for creating virtual object, device and storage medium
CN113535165A (en) Interface generation method, apparatus, electronic device, and computer-readable storage medium
CN112138380B (en) Method and device for editing data in game
CN109471580B (en) A visual 3D courseware editor and courseware editing method
WO2025092766A1 (en) Method and apparatus for displaying work, and device and storage medium
EP2911049B1 (en) Method and system for generating crowd animation and computer-readable recording medium
CN115544311A (en) Data analysis method and device
CN114995699B (en) Interface interaction method and device
CN113808237B (en) Animation generation method and device
US11714691B2 (en) Extensible command pattern
WO2025195387A1 (en) Work publishing method and apparatus, work viewing method and apparatus, device, and storage medium
WO2025157193A1 (en) Method and apparatus for creating media content, device, and storage medium
CN117611711A (en) Methods, apparatus, equipment and media for generating comics
WO2025020826A1 (en) Data processing method and apparatus applied to recommendation scenario, and device and storage medium
KR102385381B1 (en) Method and system for generating script forcamera effect
CN115756692A (en) Method for automatically combining and displaying pages based on style attributes and related equipment thereof
CN114797095A (en) Scene switching method and device
CN119680186B (en) Model resource acceptance methods, devices, equipment, storage media, and program products
CN119443276B (en) Document generation methods, apparatus, devices and storage media
CN118642809B (en) Method, apparatus, storage medium, and program product for dynamically rendering pages
CN120540565A (en) Media resource input method, device, equipment and storage medium
CN116206016A (en) Method and device for processing special effect event in animation
CN120723222A (en) Method, device, equipment and storage medium for creating application

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329

Applicant after: Zhuhai Jinshan Digital Network Technology Co.,Ltd.

Address before: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329

Applicant before: ZHUHAI KINGSOFT ONLINE GAME TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant