Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit of the present application; the present application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application is intended to encompass any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein in one or more embodiments of the present application to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first aspect may be termed a second aspect, and, similarly, a second aspect may be termed a first aspect, without departing from the scope of one or more embodiments of the present application. The word "if" as used herein may be interpreted as "upon," "when," or "in response to a determination," depending on the context.
In the present application, an animation generation method and apparatus, a computing device, and a computer-readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
Computing device 100 also includes an access device 140 that enables computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 140 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)), whether wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) wireless interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100 and other components not shown in FIG. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
The processor 120 may perform the steps of the animation generation method shown in FIG. 2. FIG. 2 shows a flowchart of an animation generation method according to an embodiment of the present application, which specifically includes the following steps:
step 202: based on the received animation creation instruction, an animation object is determined.
The animation creation instruction is an instruction for creating a playable animation file; an animation object can be understood as an animation resource such as an animation character, a camera, or an animation prop. In practical applications, when the animation editor receives the animation creation instruction, it needs to determine the animation data for creating an animation file and then create the animation file based on that animation data. The animation editor may be a stand-alone animation editor, or a plug-in editor within another animation editor, which is not limited herein.
In a specific implementation, in order to streamline the user's animation creation process and make it easier to manage the animation elements (including animation objects) in an animation, the animation objects may be created first, and the other animation elements that affect each animation object may then be created based on the created animation objects, so that the various animation elements are managed in categories organized around the animation objects.
Therefore, how animation objects are created has a significant influence on the animation creation process: if the created animation objects classify animation elements well and meet the creation requirements, the user's animation creation efficiency can be greatly improved. In a specific implementation, the animation resources that are mainly or frequently used during animation creation can first be classified by function or category, and animation objects can then be created based on the resulting animation object types, as follows:
In response to a received object creating instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type;
and creating an animation object corresponding to the target animation object type based on the target animation object type selected from the animation object type list.
In practical application, after a user submits an object creation instruction, the animation object types that can be created are displayed to the user in an animation object type list. Specifically, the animation object type list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type, a light type, and the like. Further, a director-type animation object is used to control the animation as a whole, such as shot switching and animation playing speed; a camera-type animation object determines the viewing angle and display range of the played animation; an actor-type animation object refers to a virtual object in the animation; a special-effect-type animation object determines the special effects in the animation; a light-type animation object determines the lighting effects in the animation. In addition, the animation object type list may further include a scene type for setting a scene, and the like, which is not limited herein.
Based on this, the user can select the animation object type (i.e. the target animation object type) that he wants to create from the list of animation object types, and then the animation object can be created based on the target animation object type.
In a specific implementation, the main animation object types are abstracted in advance in the animation editor, so that during animation editing, the animation objects required by an animation scene can be created according to these animation object types.
For example: an object creation instruction submitted by a user U is received in the animation editor, and an animation object type list preset in the animation editor is displayed in response to the object creation instruction, the animation object type list comprising: a director type, a camera type, an actor type, a special effect type, and a light type. The actor type selected by the user U in the animation object type list is received as the target animation object type, and an animation object of the actor type is created: actor 1.
In summary, an animation object type list is displayed in response to the received object creation instruction, and an animation object corresponding to the target animation object type is created based on the target animation object type selected from the list. A corresponding animation object is thus created based on animation object types divided in advance, realizing type division of animation objects; the required animation object type can be flexibly selected from multiple animation object types to create an animation object, which increases the flexibility of creating animation objects.
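As a rough sketch (not the claimed implementation), the type-list display and object creation of step 202 could look like the following; the type names, function names, and the `AnimationObject` class are illustrative assumptions, not part of any real editor API:

```python
from dataclasses import dataclass

# Hypothetical preset type list; the names mirror the types in the text.
ANIMATION_OBJECT_TYPES = ["director", "camera", "actor", "special_effect", "light"]

@dataclass
class AnimationObject:
    name: str
    object_type: str

def show_object_type_list() -> list:
    """Displayed to the user in response to an object creation instruction."""
    return ANIMATION_OBJECT_TYPES

def create_animation_object(target_type: str, name: str) -> AnimationObject:
    """Create an animation object for the target type chosen from the list."""
    if target_type not in ANIMATION_OBJECT_TYPES:
        raise ValueError(f"unknown animation object type: {target_type}")
    return AnimationObject(name=name, object_type=target_type)

# e.g. user U selects the actor type and the editor creates "actor 1"
actor1 = create_animation_object("actor", "actor 1")
```

Keeping the type list as data rather than hard-coded branches is what allows new object types (e.g. a scene type) to be added without changing the creation flow.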
Step 204: reading animation events created in advance for the animation tracks corresponding to the animation objects. Specifically, once the animation objects are determined, the animation file can be created by reading the corresponding animation elements according to their hierarchical relationships, because the animation editor manages the other animation elements hierarchically on the basis of the animation objects; this avoids the situation in which mismanagement of animation elements during editing causes animation objects to interfere with one another and increases the difficulty of animation creation.
The animation track refers to a time axis set for an animation object, and can be used for managing animation events of the animation object. The animation event refers to a set configured in advance to cause an animation object to execute a specific animation function at a specific time.
In a specific implementation, because the animation sub-events in an animation event set are of various types, interleaving them together is not conducive to animation creation or to the management of animation elements. To further improve the efficiency of animation creation and the manageability of animation elements, the animation sub-events can be further classified and managed. The embodiment of the present application specifically adopts the following method: reading a basic animation event created in advance for the main animation track corresponding to each animation object, and reading an extended animation event created in advance for the sub animation track associated with a target animation object, where a target animation object is an animation object, among the animation objects, that is associated with a sub animation track. Specifically, the animation tracks are divided into two track types, namely main animation tracks and sub animation tracks, and the animation events are divided into two event type sets, namely basic animation events and extended animation events, where any basic animation sub-event in a basic animation event is an animation sub-event, and any extended animation sub-event in an extended animation event is an animation sub-event.
In practical application, each animation object corresponds to a main animation track on which a plurality of basic animation sub-events can be created or configured; these basic animation sub-events form a basic animation event. In addition, when the basic animation event cannot meet the configuration requirements of the animation object, at least one associated sub animation track can be created or added for the animation object, and at least one corresponding extended animation sub-event can then be created or configured on each sub animation track; these extended animation sub-events form an extended animation event.
The main animation track is a basic time axis set for an animation object for managing basic animation events; it performs time management of the animation object's basic animation events during animation playing. Accordingly, a basic animation sub-event in a basic animation event is used to cause the animation object to perform a basic animation function, such as binding an object, binding an actor, or following. A sub animation track is an extended time axis set for an animation object for managing extended animation events; it performs time management of the animation object's extended animation events during animation playing. The number of sub animation tracks may be one or more, which is not limited herein. Accordingly, an extended animation sub-event in an extended animation event is used to cause the animation object to perform an extended animation function, such as an action, a skeletal action, or a facial animation. In practical applications, which animation sub-events are basic animation sub-events and which are extended animation sub-events is set in advance in the animation editor according to the types of the animation sub-events.
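The track and event hierarchy described above can be sketched as a simple data model; all class and field names here are hypothetical, chosen only to mirror the terms in the text:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AnimationSubEvent:
    event_type: str   # e.g. "bind_object" (basic) or "kick" (extended)
    start: float      # event start time on the track's time axis
    end: float        # event end time

@dataclass
class AnimationTrack:
    sub_events: List[AnimationSubEvent] = field(default_factory=list)

@dataclass
class TrackedAnimationObject:
    name: str
    # one main animation track managing the basic animation event
    main_track: AnimationTrack = field(default_factory=AnimationTrack)
    # zero or more sub animation tracks managing extended animation events
    sub_tracks: Dict[str, AnimationTrack] = field(default_factory=dict)

# e.g. actor 1 with one basic sub-event and one extended sub-event
actor1 = TrackedAnimationObject("actor 1")
actor1.main_track.sub_events.append(AnimationSubEvent("bind_object", 0.0, 3.0))
actor1.sub_tracks["action"] = AnimationTrack(
    sub_events=[AnimationSubEvent("kick", 1.0, 2.0)]
)
```

The point of the split is visible in the model: basic functions live on exactly one main track per object, while extended functions can grow into any number of typed sub tracks without touching the main track.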
In a specific implementation, considering that extended animation sub-events are also of various types, managing multiple types of extended animation sub-events on one sub animation track would affect the editing efficiency of the animation. To avoid this, the sub animation track types can be set in advance according to the extended animation sub-event types corresponding to each animation object type, and sub animation tracks can then be created according to these sub animation track types. In the embodiment of the present application, any sub animation track is created in the following manner:
receiving a sub-track creating instruction aiming at any one of the animation objects;
in response to the sub-track creating instruction, displaying a sub-track type list associated with the object attribute of the any one animation object;
and creating a sub animation track corresponding to the target sub track type based on the target sub track type selected from the sub track type list.
The sub-track creation instruction refers to an instruction for creating a sub animation track. In practical application, a sub animation track exists in dependence on an animation object and manages the extended animation events of that animation object. Thus, a sub-track creation instruction may be submitted for an animation object for which an extended animation sub-event is to be configured. The purpose of creating a sub animation track is to create extended animation sub-events on it and to manage those extended animation sub-events through the sub animation track.
Further, each animation object can correspond to multiple types of animation implementations, and these different types of animation implementations are generally related to the object attributes (such as functions, object parts, etc.) of the animation object. Therefore, a plurality of sub animation track types can be divided in advance according to the object attributes of each animation object, and these sub animation track types can be formed into a sub-track type list, so that extended animation events of the corresponding types are managed on different types of sub animation tracks. When a sub animation track is created for any animation object, the sub-track type list associated with the object attributes of that animation object can be displayed. A sub animation track corresponding to the target sub-track type selected by the user in the sub-track type list is then created and used as a sub animation track associated with that animation object.
Following the above example: on the basis of the created animation object actor 1, a sub-track creation instruction submitted by the user U for actor 1 is received in the animation editor, and a sub-track type list preset in the animation editor and related to the functions and parts of actor 1 is displayed in response to the sub-track creation instruction, the list comprising: an action type, a facial action type, and the like. The action type selected by the user U in the sub-track type list is received as the target sub-track type, and a sub animation track of the action type is created: an action track. The created action track is used as a sub animation track associated with the animation object actor 1.
In summary, a sub-track creation instruction for any one of the animation objects is received; in response to the sub-track creation instruction, a sub-track type list associated with the object attributes of that animation object is presented; and a sub animation track is then created based on the target sub-track type selected in the sub-track type list. This realizes type division of sub animation tracks; the required sub-track type can be flexibly selected from multiple sub-track types to create a sub animation track, which increases the flexibility of creating sub animation tracks.
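A minimal sketch of this attribute-driven sub-track creation; the mapping from object types to sub-track types below is invented purely for illustration:

```python
# Hypothetical mapping from an animation object's attributes (here reduced
# to its object type) to the sub animation track types it can have.
SUB_TRACK_TYPES = {
    "actor": ["action", "facial_action"],
    "camera": ["camera_movement"],
    "light": ["light_change"],
}

def show_sub_track_type_list(object_type: str) -> list:
    """Displayed in response to a sub-track creation instruction."""
    return SUB_TRACK_TYPES.get(object_type, [])

def create_sub_track(sub_tracks: dict, object_type: str, target_type: str) -> None:
    """Create a sub animation track of the selected target sub-track type."""
    if target_type not in show_sub_track_type_list(object_type):
        raise ValueError(f"{target_type!r} is not valid for {object_type!r}")
    sub_tracks[target_type] = []  # the new track starts with no sub-events

# e.g. user U selects the action type for the actor-type object actor 1
tracks = {}
create_sub_track(tracks, "actor", "action")
```

Validating the selection against the per-attribute list is what keeps, say, a facial-action track from being attached to a light object.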
In practical applications, some animation objects may be associated with sub animation tracks while others have no associated sub animation tracks. Therefore, in order to read the extended animation events created in advance on the sub animation tracks, the animation objects associated with sub animation tracks (i.e., the target animation objects) need to be determined.
In a specific implementation, on the basis of the created animation objects, different basic functions can be implemented for each animation object. To avoid interference between different basic functions, the different basic functions of an animation object can be divided in advance into different basic sub-event types, and basic animation sub-events can then be created according to these basic sub-event types, in the following manner:
receiving, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to that animation object;
in response to the basic sub-event creation instruction, displaying a basic sub-event type list corresponding to the main animation track corresponding to that animation object;
and creating a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected from the basic sub-event type list.
The basic sub-event creation instruction refers to an instruction for creating a basic animation sub-event. In practical applications, each basic animation sub-event has a time interval for event execution, composed of an event start time and an event end time; the time interval is embodied on the animation timeline. Because a basic animation sub-event implements a basic function of the animation object, it is created on the basic time axis (the main animation track) corresponding to the animation object, and the basic sub-event creation instruction therefore needs to be submitted for that main animation track.
Further, the basic animation sub-events corresponding to each animation object are of various event types, and different types of basic animation sub-events may need different properties to be set. To avoid interference between different types of basic animation sub-events, the basic animation sub-events are divided into at least one basic sub-event type according to the animation object, and these basic sub-event types form a basic sub-event type list. When a basic animation sub-event is created on the main animation track corresponding to any animation object, the basic sub-event type list corresponding to that main animation track can be displayed. A basic animation sub-event corresponding to the target basic sub-event type selected by the user in the basic sub-event type list is then created. It should be noted that at least one basic animation sub-event may be created on a main animation track, and the set of these basic animation sub-events is referred to as a basic animation event.
Following the above example: on the basis of the created animation object actor 1, a basic sub-event creation instruction submitted by the user U for the main animation track corresponding to actor 1 is received in the animation editor, and the basic sub-event type list corresponding to the main animation track of actor 1 is displayed in response to the instruction, the list comprising: a bind-object type, a bind-actor type, and a follow type. The bind-object type selected by the user U in the basic sub-event type list is received as the target basic sub-event type, and a basic animation sub-event of the bind-object type is created: binding object A, which specifically indicates that object A is displayed in the scene during the time interval corresponding to the event.
In summary, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to that animation object is received, and in response to the instruction, a corresponding basic animation sub-event is created from the target basic sub-event type selected in the basic sub-event type list. This realizes type division of the basic animation sub-events of animation objects; the required basic sub-event type can be flexibly selected from multiple basic sub-event types to create a basic animation sub-event, which increases the flexibility of creating basic animation sub-events.
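A minimal sketch of basic sub-event creation, including the execution time interval; the per-object type list and the event record below are illustrative assumptions, not the claimed implementation:

```python
# Hypothetical basic sub-event types available on an actor's main track.
BASE_SUB_EVENT_TYPES = {"actor": ["bind_object", "bind_actor", "follow"]}

def create_base_sub_event(main_track: list, object_type: str,
                          target_type: str, start: float, end: float) -> dict:
    """Create a basic animation sub-event of the selected type on the main
    animation track; the (start, end) pair is its execution time interval."""
    if target_type not in BASE_SUB_EVENT_TYPES.get(object_type, []):
        raise ValueError(f"{target_type!r} not available for {object_type!r}")
    if end <= start:
        raise ValueError("event end time must follow its start time")
    event = {"type": target_type, "start": start, "end": end}
    main_track.append(event)  # the set of these events forms the basic animation event
    return event

# e.g. bind object A so it is shown in the scene during seconds 0-3
track = []
create_base_sub_event(track, "actor", "bind_object", 0.0, 3.0)
```

Extended sub-event creation on a sub animation track follows the same shape, with the type list keyed by the sub-track type instead of the object type.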
Similarly, on the basis of the created animation objects, different extended functions can be implemented for each animation object. To avoid interference between different extended functions, the different extended functions of an animation object can be divided in advance into different extended sub-event types, and extended animation sub-events can then be created according to these extended sub-event types, in the following manner:
receiving, for any one of the target animation objects, an extended sub-event creation instruction for the sub animation track associated with that target animation object;
in response to the extended sub-event creation instruction, displaying an extended sub-event type list corresponding to the sub animation track associated with that target animation object;
and creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected from the extended sub-event type list.
The extended sub-event creation instruction refers to an instruction for creating an extended animation sub-event. In practical applications, each extended animation sub-event likewise has a time interval for event execution, composed of an event start time and an event end time, which is also embodied on the animation timeline. Because an extended animation sub-event implements an extended function of the animation object, it is created on an extended time axis (i.e., a sub animation track) associated with the animation object, and the extended sub-event creation instruction therefore needs to be submitted for the sub animation track.
Further, since the event types of the extended animation sub-event are various, different types of extended animation sub-events may need to set different properties. In order to avoid the mutual influence among different types of extended animation sub-events, the extended animation sub-events are divided into at least one extended sub-event type according to the animation object, and the extended sub-event types form an extended sub-event type list. When the extended animation sub-event is created on the sub-animation track associated with any animation object, the extended sub-event type list corresponding to the sub-animation track associated with the animation object can be displayed. And then based on the target extension sub-event type selected by the user in the extension sub-event type list, creating an extension animation sub-event corresponding to the target extension sub-event type. It should be noted that at least one extended animation sub-event may also be created for a sub-animation track, and a set of these extended animation sub-events is referred to as an extended animation event.
Following the above example: on the basis of the created animation object actor 1, an extended sub-event creation instruction submitted by the user U for the action track associated with actor 1 is received in the animation editor, and the extended sub-event type list corresponding to the action track of actor 1 is displayed in response to the instruction, the list comprising: a leg action type, a hand action type, and a head action type. The leg action type selected by the user U in the extended sub-event type list is received as the target extended sub-event type, and an extended animation sub-event of the leg action type is created: a kick event. The kick event indicates that actor 1 is controlled to perform a kicking action during the time interval corresponding to the event.
FIG. 3 shows the animation editor in which the animation object actor 1 has been created; the animated image of actor 1 is displayed in the animation display area of the animation editor. In addition, actor 1 has a corresponding main animation track and a sub animation track; a basic animation sub-event, binding object A, has been created on the main animation track corresponding to actor 1, and an extended animation sub-event, a kick event, has been created on the sub animation track associated with actor 1.
In summary, for any one of the target animation objects, an extended sub-event creation instruction for the sub animation track associated with that target animation object is received, and in response to the instruction, a corresponding extended animation sub-event is created from the target extended sub-event type selected in the extended sub-event type list. This realizes type division of the extended animation sub-events of animation objects; the required extended sub-event type can be flexibly selected from multiple extended sub-event types to create an extended animation sub-event, which increases the flexibility of creating extended animation sub-events.
In practical applications, it may be necessary to view a created animation sub-event (a basic animation sub-event or an extended animation sub-event). When editing animation sub-events in the animation editor, linearly traversing the event operation sub-region of every animation sub-event to determine which one matches the user's click takes a long time. Therefore, the event operation sub-regions of the animation sub-events can be organized and traversed in the form of a quadtree. The embodiment of the present application further includes:
receiving an event query instruction submitted by a user in an event operation area;
traversing an event quad tree corresponding to the event operation area, and comparing the position information in the event query instruction with an event operation sub-area corresponding to a node in the event quad tree;
the event operation sub-region is the position of an operation sub-region corresponding to an animation sub-event in the animation event;
and determining and displaying the target animation sub-event in the animation sub-events according to the comparison result.
The event operation area refers to an interface area, included in the animation editor, in which at least one animation sub-event can be operated on. An event query instruction is an instruction for querying any animation sub-event (a basic animation sub-event or an extended animation sub-event). Each event operation sub-region is the operation sub-region, within the event operation area, corresponding to one created animation sub-event; an event operation sub-region may also be understood as an event bounding box.
In practical applications, in order to query a created animation sub-event, each created animation sub-event is usually mapped into the editing interface of the animation editor in the form of an event operation sub-region, so that a user can view any animation sub-event by submitting an event query instruction through an operation in the event operation sub-region corresponding to that animation sub-event. Because the animation sub-event the user wants to view is determined by the user's operation position, the position information carried in the user's event query instruction needs to be compared with the event operation sub-regions of the animation sub-events included in the event operation area to determine the event operation sub-region matching the position information; the animation sub-event corresponding to the matched event operation sub-region is then taken as the animation sub-event corresponding to the event query instruction.
In a specific implementation, the event quadtree, which is built from the event operation sub-regions corresponding to the created animation sub-events, can be traversed. That is, the event quadtree corresponding to the event operation area is traversed, and the position information carried in the event query instruction (for example, click position information carried in an event query instruction submitted by clicking) is compared with the event operation sub-regions corresponding to the leaf nodes in the event quadtree to determine the event operation sub-region corresponding to the position information. The animation sub-event corresponding to that event operation sub-region is taken as the target animation sub-event, and the target animation sub-event (a target basic animation sub-event or a target extended animation sub-event) is displayed. Specifically, the target animation sub-event can be displayed through an event panel that shows the target animation sub-event.
Further, the user can edit the queried animation sub-event in the displayed event panel. After the editing is completed, the animation editor updates the queried animation sub-events based on the event information in the event panel.
Following the above example, assume that two basic animation sub-events and three extended animation sub-events have been created for actor 1. The two basic animation sub-events are: binding object A and following object B; the three extended animation sub-events are: a kick event, a nod event, and a hand swing event. Specifically, as shown in FIG. 4, in the event operation area of the animation editor, binding object A corresponds to the event operation sub-region cr1 and following object B corresponds to the event operation sub-region cr2; the kick sub-event corresponds to the event operation sub-region cr3, the nod sub-event corresponds to the event operation sub-region cr4, and the hand swing sub-event corresponds to the event operation sub-region cr5. The animation sub-events are displayed in the event operation area of the animation editor in the form of these event operation sub-regions. An event query instruction submitted by the user U in the event operation area is received; the event quadtree created in advance for the event operation area and the event operation sub-regions of the 5 animation sub-events is traversed; the position information d1 carried in the event query instruction is compared with the 5 event operation sub-regions corresponding to nodes of the event quadtree; it is determined that the position information d1 matches the event operation sub-region cr4; and the event panel corresponding to the nod sub-event corresponding to the event operation sub-region cr4 is displayed.
In conclusion, by querying the animation sub-event to be viewed through traversal of the quadtree, the efficiency of querying the animation sub-event corresponding to the event query instruction is improved.
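As a concrete illustration, the point query described above can be sketched as a descent over a quadtree whose leaf nodes hold event operation sub-regions. The class and function names, and the rectangle representation `(x, y, width, height)`, are illustrative assumptions rather than the application's actual implementation:

```python
# Minimal sketch of the event-quadtree point query described above.
# A region is (x, y, width, height); all names are illustrative assumptions.

def contains(region, px, py):
    """Return True if point (px, py) lies inside the rectangular region."""
    x, y, w, h = region
    return x <= px < x + w and y <= py < y + h

class QuadNode:
    def __init__(self, bounds, children=None, sub_regions=None):
        self.bounds = bounds                  # node region inside the event operation region
        self.children = children or []        # up to four child nodes
        self.sub_regions = sub_regions or {}  # sub-event name -> event operation sub-region

def query(node, px, py):
    """Descend the quadtree and return the sub-event whose sub-region holds the point."""
    if not contains(node.bounds, px, py):
        return None
    for child in node.children:
        hit = query(child, px, py)
        if hit is not None:
            return hit
    for name, region in node.sub_regions.items():
        if contains(region, px, py):
            return name
    return None
```

For example, a click at position d1 inside the rectangle stored for cr4 would be resolved to the nod sub-event by returning the name attached to that sub-region.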
In specific implementation, each created animation sub-event needs to be included in the event quadtree. To avoid rebuilding the event quadtree after each new animation sub-event is created, the created event quadtree can instead be updated, each time a new animation sub-event is created, based on the event operation sub-region of the newly created animation sub-event. In the embodiment of the present application, the event quadtree is updated in the following manner:
after any animation sub-event in the animation events is created, determining a corresponding target node of an event operation sub-region of the any animation sub-event in the event quadtree;
and updating the target node according to the event operation sub-region of any animation sub-event to obtain an updated event quadtree.
In practical application, each leaf node in the event quadtree corresponds to a node region in the event operation region, and each leaf node corresponds to the event operation sub-regions included in that node region. Therefore, it is necessary to determine the node region where the event operation sub-region of the newly created animation sub-event (a basic animation sub-event or an extended animation sub-event) is located, and then determine the corresponding target node of that node region in the event quadtree. The event quadtree is updated by adding the event operation sub-region of the newly created animation sub-event to the target node, so as to obtain the updated event quadtree. For example, if the target node originally corresponds to 1 event operation sub-region, then after the target node is updated with the event operation sub-region of the newly created animation sub-event, the target node corresponds to 2 event operation sub-regions.
Following the above example, it is assumed that there are 4 leaf nodes in the event quadtree: leaf node A, leaf node B, leaf node C, and leaf node D, where leaf node A corresponds to event operation sub-region cr1, leaf node B corresponds to event operation sub-region cr3, leaf node C corresponds to event operation sub-region cr2, and leaf node D corresponds to event operation sub-region cr4. After the extended animation sub-event, the hand swing sub-event, is created, if it is determined that the event operation sub-region cr5 of the hand swing sub-event lies in the node region corresponding to leaf node C, leaf node C is taken as the target node. Leaf node C originally corresponds to the event operation sub-region cr2 of following object B; leaf node C is updated by adding event operation sub-region cr5 to it, so that the updated leaf node C corresponds to both the event operation sub-region cr2 of following object B and the event operation sub-region cr5 of the hand swing sub-event. Further, the updated event quadtree is obtained by updating leaf node C in the event quadtree.
In summary, after the event quadtree is created for the first time, each time an animation sub-event is subsequently created, the event quadtree only needs to be updated with the event operation sub-region corresponding to the newly created animation sub-event, which indirectly improves the efficiency of querying events.
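The incremental update above can be sketched as locating the leaf whose node region contains the new sub-region's position and attaching the sub-region there, instead of rebuilding the whole tree. The dictionary-based node layout and all names below are illustrative assumptions:

```python
# Sketch of the incremental quadtree update: descend to the target leaf and
# attach the newly created sub-event's operation sub-region to it.
# Node layout and names are illustrative assumptions.

def contains(bounds, x, y):
    """Return True if point (x, y) lies inside the rectangular bounds."""
    bx, by, bw, bh = bounds
    return bx <= x < bx + bw and by <= y < by + bh

def find_target_leaf(node, x, y):
    """Descend to the leaf whose node region contains (x, y)."""
    while node["children"]:
        node = next(c for c in node["children"] if contains(c["bounds"], x, y))
    return node

def add_sub_event(root, name, region):
    """Attach a new sub-event's operation sub-region to its target leaf node."""
    x, y = region[0], region[1]
    leaf = find_target_leaf(root, x, y)
    leaf["sub_regions"][name] = region
    return leaf
```

In the running example, adding cr5 would locate leaf node C and leave it holding both cr2 and cr5, with every other node untouched.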
Step 206: and creating a target animation file based on the animation object, the animation track and the animation event.
Specifically, in order for the created target animation file to include all animation elements, a playable animation (i.e., the target animation file) is created based on animation elements such as the animation object, the animation track, and the animation event.
During specific implementation, a blank initial animation file can be created, the animation data corresponding to the animation object, the animation track, and the animation event is read from memory, and the animation data is added to the initial animation file in a preset format to generate the target animation file. Furthermore, the target animation file can be played by calling a playing interface so as to play the created target animation.
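The file-creation step above can be sketched as gathering the animation data of each element and writing it into a blank file in a preset format. The JSON layout and field names below are illustrative assumptions, not the application's actual format:

```python
import json

# Sketch of step 206: create a blank animation file, add the animation data
# of each element in a preset (here JSON) format, and produce the target
# animation file. Field names are illustrative assumptions.

def create_target_animation_file(animation_object, animation_track, animation_event):
    target = {"version": 1, "elements": []}  # blank initial animation file
    for kind, data in (("object", animation_object),
                       ("track", animation_track),
                       ("event", animation_event)):
        target["elements"].append({"type": kind, "data": data})
    return json.dumps(target)

doc = create_target_animation_file(
    {"name": "actor 1"},
    {"name": "main animation track"},
    {"name": "binding object A"},
)
```

A playing interface would then parse this serialized document back into elements before playback.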
On the basis that the animation track is divided into a main animation track and a sub animation track, and the animation event is divided into a basic animation event and an extended animation event, in order for the created target animation file to include all animation elements created in the animation editing process, according to the embodiment of the application, the target animation file is created based on the animation object, the main animation track, the basic animation event, the sub animation track, and the extended animation event. The created target animation file therefore has a well-defined hierarchy and is more interpretable. In particular, in the process of creating an animation by editing in the animation editor, animation resources of a preset type in the animation editor may be referenced, for example, material resources, map resources, and animation clip resources that are imported or configured in advance in the animation editor. If only the reference relationships to these resources are saved in the process of creating the animation file, migration of the animation file is hindered; therefore, the resource information of the referenced original resources can be saved directly in the animation file, which is specifically realized by the following steps:
taking the animation object, the animation track and the animation event as animation elements;
determining the preset type of target animation resources associated with at least one target animation element in the animation elements, and reading resource information of the target animation resources;
updating the at least one target animation element through the resource information according to the association relation between the target animation resource and the at least one target animation element;
updating the animation object, the animation track and the animation event based on the updated at least one target animation element;
and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
The preset type refers to a preset resource type of the animation resource. In practical application, a resource type that is relatively independent of the animation editor can be set as the preset type; in this case, reading the resource information in the animation editor is relatively simple, so the resource information of such animation resources can be read directly.
In specific implementation, the created animation object, animation track, and animation event are taken as animation elements. Each of the animation elements may reference animation resources; therefore, an animation element associated with an animation resource of the preset type is selected from the animation object, the animation track, and the animation event as the at least one target animation element, so as to read the preset-type animation resources (i.e., target animation resources) associated with (referenced or depended on by) the target animation elements, and to read the resource information of the target animation resources.
The target animation elements having the association relationship are updated through the read resource information according to the association relationship (reference relationship or dependency relationship) between the target animation resource and the at least one target animation element. The animation elements corresponding to the animation object, the animation track, and the animation event are then updated based on the updated at least one target animation element to obtain the updated animation elements, so that the target animation file can be created based on the updated animation elements.
Following the above example, an animation creation instruction submitted by user U is received, and all animation objects created in the animation editor are determined in response to the animation creation instruction, the animation objects including actor 1, director 1, and camera 1. The basic animation events created in advance for the main animation tracks corresponding to the three animation objects are read, the basic animation events including binding object A (a basic animation sub-event). The animation object associated with a sub animation track, actor 1, is selected among the three animation objects as the target animation object, and the extended animation event created in advance for the action track (sub animation track) associated with the target animation object actor 1 is read, the extended animation event including the kicking sub-event (an extended animation sub-event).
The three animation objects, the main animation tracks corresponding to the three animation objects, the action track associated with actor 1, binding object A, and the kicking sub-event are taken as animation elements. With the preset types set to map-type and material-type animation resources, it is determined that the target animation element associated with a preset-type animation resource among the animation elements is binding object A. It is determined that the preset-type (associated) target animation resource configured in binding object A is file A, and the file information contained in file A is read. Binding object A is updated based on the read file information to obtain the updated binding object A.
In summary, by reading the resource information of the target animation resource associated with the target animation element and updating the target animation element based on the resource information, the created target animation file directly includes the resource information of the animation resource, which is convenient for transplanting the target animation file.
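The in-lining step above can be sketched as replacing each preset-type resource reference on an element with the resource information itself, so the generated file can be moved between machines without the editor's resource library. All names, the `PRESET_TYPES` set, and the element layout below are illustrative assumptions:

```python
# Sketch of in-lining preset-type resource information into the animation
# elements that reference it. Names and layout are illustrative assumptions.

PRESET_TYPES = {"material", "map", "animation_clip"}

def read_resource_info(resource):
    """Stand-in for reading the raw resource content from memory."""
    return {"name": resource["name"], "bytes": resource["data"]}

def inline_resources(elements, resources):
    """Replace preset-type resource references with the resource info itself."""
    for element in elements:
        kept_refs, inlined = [], []
        for ref in element.get("resource_refs", []):
            resource = resources[ref]
            if resource["type"] in PRESET_TYPES:
                inlined.append(read_resource_info(resource))  # embed the data
            else:
                kept_refs.append(ref)                         # keep other refs
        element["resource_refs"] = kept_refs
        element["resources"] = inlined
    return elements
```

In the running example, binding object A's reference to the map-type file A would be replaced by file A's information, while non-preset-type references would remain as references.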
In addition, there are also animation elements among the animation elements that reference prefab resources. Since prefab resources are themselves created in advance by an animation editor, and the information they store needs to be parsed by the animation editor, in order to avoid spending too much time and too many resources parsing the prefab resources, the resource paths of the prefab resources can be saved instead, which is specifically implemented in the following manner in the embodiment of the present application:
taking the animation object, the animation track and the animation event as animation elements;
determining a target prefab resource associated with at least one prefab animation element in the animation elements, and determining a resource path of the target prefab resource;
updating the at least one prefab animation element through the resource path according to the association relation between the target prefab resource and the at least one prefab animation element;
updating the animation object, the animation track, and the animation event based on the updated at least one prefab animation element;
and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
Because prefab resources depend more heavily on the animation editor, reading the resource information in prefab resources is more difficult; in particular, when the animation editor serves as a plug-in editor and the prefab resources are pre-created by the original editor corresponding to the plug-in editor, the resource information in the prefab resources is difficult to parse, so the resource paths of the prefab resources can be determined directly instead.
In specific implementation, the created animation object, main animation track, basic animation event, sub animation track, and extended animation event are taken as animation elements. Each of the animation elements may also reference a prefab resource; therefore, an animation element associated with a prefab resource is selected among the animation object, the main animation track, the basic animation event, the sub animation track, and the extended animation event as the at least one prefab animation element, in order to determine the resource path of the target prefab resource that these prefab animation elements are associated with (reference or depend on).
The prefab animation elements having the association relationship are updated through the determined resource path according to the association relationship (reference relationship or dependency relationship) between the target prefab resource and the at least one prefab animation element. The animation elements corresponding to the animation object, the animation track, and the animation event are then updated based on the updated at least one prefab animation element to obtain the updated animation elements, so that the target animation file can be created based on the updated animation elements.
Following the above example, the three animation objects, the main animation tracks corresponding to the three animation objects, the action track associated with actor 1, binding object A, and the kicking sub-event are taken as animation elements, and the prefab animation element associated with a prefab resource among the animation elements is determined to be actor 1. The target prefab resource, file B, configured in (associated with) actor 1 is determined, and the resource path p1 of the target prefab resource file B is determined. Actor 1 is updated based on resource path p1 to obtain the updated actor 1.
In conclusion, by determining the resource path of the target prefab resource associated with the prefab animation element and updating the prefab animation element based on the resource path, the created target animation file directly contains the resource path of the prefab resource, which guarantees the generation efficiency of the target animation file.
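The prefab handling above can be sketched as recording only the path of each referenced prefab on the element, leaving the parsing of the prefab's contents to the editor that created it. Field names and the element layout below are illustrative assumptions:

```python
# Sketch of the prefab handling: because prefab resources must be parsed by
# the editor that created them, only their resource path is recorded on the
# elements that reference them. Names are illustrative assumptions.

def record_prefab_paths(elements, prefabs):
    """Store each referenced prefab's resource path instead of its parsed contents."""
    for element in elements:
        ref = element.get("prefab_ref")
        if ref is not None and ref in prefabs:
            element["prefab_path"] = prefabs[ref]["path"]
    return elements
```

In the running example, actor 1's reference to prefab file B would be recorded as the path p1, while elements without prefab references are left unchanged.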
To sum up, in the animation generation method provided in the embodiment of the present application, the created animation object is determined based on the received animation creation instruction, and the animation event created in advance for the animation track corresponding to the animation object is read; the target animation file is then created based on the animation object, the animation track, and the animation event according to their hierarchical structure. Creating the target animation file according to the hierarchical structure of the animation track and the animation event corresponding to the animation object makes the animation hierarchy more intuitive and easier to understand, improves the user's animation creation efficiency, and improves the user's creation experience.
FIG. 5 shows a processing flowchart of an animation generation method applied to an animation editor according to an embodiment of the present application, which is described by taking the animation editor as an example and specifically includes the following steps:
step 502: and receiving an object creating instruction submitted by a user.
Specifically, the object creation instruction refers to an object creation instruction submitted by a user through operating an object creation control in the animation editor.
Step 504: and responding to the object creating instruction, and displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type.
Specifically, the director type, the camera type, the actor type, the special effect type, and the light type are animation types in the animation object type list.
Step 506: and creating a camera 01 corresponding to the camera type based on the camera type selected by the user in the animation object type list.
Specifically, the camera 01 is the animation object created corresponding to the camera type. In practical applications, the created camera 01 directly corresponds to a main animation track.
Step 508: a sub-track creation instruction of the user for the camera 01 is received.
Specifically, the sub-track creation instruction refers to a sub-track creation instruction submitted by a user through an operation performed on a sub-track creation control for the camera 01 in the animation editor.
Step 510: in response to the sub-track creation instruction, a list of sub-track types associated with the functions of the camera 01 is presented.
Specifically, the sub-track type list may include a moving track type, a display track type, and other sub-track types.
Step 512: and creating a moving track 1 corresponding to the moving track type based on the moving track type selected by the user in the sub-track type list.
Step 514: and receiving a basic sub-event creating instruction of the user for the main animation track corresponding to the camera 01.
In practical application, when the camera 01 is created, the corresponding main animation track can be automatically created.
Step 516: and responding to the basic sub-event creating instruction, and showing a basic sub-event type list corresponding to the main animation track of the camera 01.
Specifically, the basic sub-event type list includes basic sub-event types such as a lens parameter type and a depth of field type.
Step 518: and creating a lens parameter sub-event corresponding to the lens parameter type based on the lens parameter type selected by the user in the basic sub-event type list.
Step 520: an extended sub-event creation instruction of the user for the movement track 1 associated with the camera 01 is received.
Step 522: and responding to the extended sub-event creating instruction, and showing an extended sub-event type list corresponding to the moving track 1 associated with the camera 01.
Specifically, the extended sub-event type list may include: displacement type, straight type, turning type, shift displacement and other extended sub-event types.
Step 524: and creating a turning sub-event corresponding to the turning type based on the turning type selected by the user in the extended sub-event type list.
Specifically, the turning sub-event is an extended animation sub-event.
Step 526: based on the received animation creation instruction, three animation objects of the camera 01, the director 01, and the actor 01 are determined.
In particular, in addition to the above-described camera 01 created in the animation editor, two animation objects of a director 01 and an actor 01 are created in the animation editor before receiving an animation creation instruction.
Step 528: and reading the lens parameter sub-event pre-created for the main animation track corresponding to each animation object, and reading the turning sub-event pre-created for the moving track 1 associated with the camera 01 among the three animation objects.
Step 530: determining, among the three animation objects camera 01, director 01, and actor 01, the main animation tracks corresponding to the three animation objects, the moving track 1, the lens parameter sub-event, and the turning sub-event, that the turning sub-event is the element associated with an animation resource of the preset type, and reading the resource information of the target animation resource associated with the turning sub-event.
Specifically, the preset types include: the resource types, such as the material type, the map type, and/or the animation clip type, are not limited herein. The target animation resource is an animation resource of a preset type associated with the turning sub-event.
Step 532: and updating the turning sub-event through the resource information according to the association relation between the target animation resource and the turning sub-event.
Step 534: and creating a target animation file based on the three animation objects camera 01, director 01, and actor 01, the main animation tracks corresponding to the three animation objects, the moving track 1, the lens parameter sub-event, and the updated turning sub-event.
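The flow above assembles a two-level hierarchy: each animation object owns a main track carrying basic sub-events and zero or more sub-tracks carrying extended sub-events. The data model below is an illustrative sketch of that hierarchy; all field names are assumptions:

```python
# Illustrative sketch of the hierarchy built by the flow above:
# object -> main track (basic sub-events) + sub-tracks (extended sub-events).
# Field names are assumptions, not the application's actual format.

def build_hierarchy(objects):
    """Assemble the object -> track -> event hierarchy into one structure."""
    return [
        {
            "object": o["name"],
            "main_track": {"basic_events": o.get("basic_events", [])},
            "sub_tracks": [
                {"name": t["name"], "extended_events": t.get("extended_events", [])}
                for t in o.get("sub_tracks", [])
            ],
        }
        for o in objects
    ]

hierarchy = build_hierarchy([
    {"name": "camera 01",
     "basic_events": ["lens parameter sub-event"],
     "sub_tracks": [{"name": "moving track 1",
                     "extended_events": ["turning sub-event"]}]},
    {"name": "director 01"},
    {"name": "actor 01"},
])
```

This makes the layering of the target animation file explicit: the camera's turning sub-event hangs off moving track 1, while the lens parameter sub-event hangs off the main track.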
To sum up, the animation generation method provided in the embodiment of the present application creates, on the basis of creating the animation object, a basic animation sub-event for the main animation track corresponding to the animation object and an extended animation sub-event for the sub animation track corresponding to the animation object. Based on the received animation creation instruction, the created animation object, main animation track, basic animation event, sub animation track, and extended animation event are read, and the target animation file is created according to the hierarchical structure of the animation object, the main animation track, and the basic animation event, and the hierarchical structure of the animation object, the sub animation track, and the extended animation event. By dividing the animation into hierarchies, the animation creation process is more intuitive and more interpretable, and the animation creation efficiency is improved.
Corresponding to the above method embodiment, the present application further provides an animation generation apparatus embodiment, and fig. 6 shows a schematic structural diagram of the animation generation apparatus provided in an embodiment of the present application. As shown in fig. 6, the apparatus 600 includes:
a determination module 602 configured to determine an animation object based on the received animation creation instruction;
a reading module 604 configured to read an animation event created in advance for an animation track corresponding to the animation object;
a creation module 606 configured to create a target animation file based on the animation object, the animation track, and the animation event.
Optionally, the reading module 604 is further configured to:
reading a basic animation event which is pre-created for the main animation track corresponding to the animation object, and reading an extended animation event which is pre-created for a sub animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with the sub animation track;
accordingly, the creating module 606 is further configured to:
creating a target animation file based on the animation object, the main animation track, the base animation event, the sub animation track, and the extended animation event.
Optionally, in a case where a preset type of animation resource is configured in advance, the animation generation apparatus includes:
a read information module configured to take the animation object, the animation track, and the animation event as animation elements; determine target animation resources of the preset type associated with at least one target animation element in the animation elements, and read resource information of the target animation resources;
a first updating module configured to update the at least one target animation element through the resource information according to the association relation between the target animation resource and the at least one target animation element;
a second update module configured to update the animation object, the animation track, and the animation event based on the updated at least one target animation element; and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
Optionally, the at least one target animation element is determined by running the following modules:
a selection module configured to select an animation element associated with the preset type of animation resource as at least one target animation element among the animation object, the animation track, and the animation event.
Optionally, in a case where a prefab resource is configured in advance, the animation generation apparatus includes:
a determine path module configured to take the animation object, the animation track, and the animation event as animation elements; determine a target prefab resource associated with at least one prefab animation element in the animation elements, and determine a resource path of the target prefab resource;
a third updating module configured to update the at least one prefab animation element through the resource path according to the association relation between the target prefab resource and the at least one prefab animation element;
a fourth update module configured to update the animation object, the animation track, and the animation event based on the updated at least one prefab animation element; and take the updated animation object, animation track, and animation event as the animation object, the animation track, and the animation event.
Optionally, any one of the animation sub-events is queried by running the following modules:
the first receiving module is configured to receive an event query instruction submitted by a user in an event operation area;
the comparison module is configured to compare the position information in the event query instruction with the event operation sub-region corresponding to the node in the event quadtree by traversing the event quadtree corresponding to the event operation region; the event operation sub-region is the position of an operation sub-region corresponding to an animation sub-event in the animation event;
and the determining sub-event module is configured to determine and display a target animation sub-event in the animation sub-events according to the comparison result.
Optionally, the event quadtree is updated by running the following modules:
the node determining module is configured to determine a target node corresponding to an event operation sub-region of any animation sub-event in the event quadtree after the animation sub-event is created;
and the fifth updating module is configured to update the target node according to the event operation sub-region of any one animation sub-event to obtain an updated event quadtree.
Optionally, any one of the animation objects is created by running the following modules:
a first presentation module configured to present an animation object type list in response to a received object creation instruction, wherein the animation object type list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type, and a light type;
and the object creating module is configured to create an animation object corresponding to the target animation object type based on the target animation object type selected from the animation object type list.
Optionally, any one of the sub animation tracks is created by:
a second receiving module configured to receive a sub-track creation instruction for any one of the animation objects;
a second presentation module configured to present a sub-track type list associated with an object property of the arbitrary one of the animation objects in response to the sub-track creation instruction;
and the track creating module is configured to create a sub animation track corresponding to the target sub track type based on the target sub track type selected from the sub track type list.
Optionally, the target animation object is determined by running the following modules:
and the selection object module is configured to select the animation object of the associated sub-animation track from the animation objects and determine the animation object as the target animation object.
Optionally, any one of the basic animation sub-events is created by running the following modules:
a third receiving module, configured to receive, for any one of the animation objects, a basic sub-event creation instruction for the main animation track corresponding to the any one animation object;
a third presentation module, configured to present, in response to the basic sub-event creating instruction, a basic sub-event type list corresponding to the main animation track corresponding to the any one animation object;
and the first creating sub-event module is configured to create a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected from the basic sub-event type list.
Optionally, any one of the extended animation sub-events is created by running the following modules:
a fourth receiving module, configured to receive, for any one of the target animation objects, an extended sub-event creation instruction for a sub-animation track associated with the any one target animation object;
a fourth presentation module, configured to present, in response to the extended sub-event creating instruction, a corresponding extended sub-event type list of a sub-animation track associated with the arbitrary one target animation object;
and the second creating sub-event module is configured to create an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected from the extended sub-event type list.
To sum up, the animation generation apparatus provided in the embodiment of the present application determines, based on the received animation creation instruction, the created animation object, and reads the animation event created in advance for the animation track corresponding to the animation object; the target animation file is then created based on the animation object, the animation track, and the animation event according to their hierarchical structure, so that by dividing the animation into hierarchies, the animation creation process is more intuitive and more interpretable, and the animation creation efficiency is improved.
The above is a schematic configuration of an animation generation apparatus of the present embodiment. It should be noted that the technical solution of the animation generation apparatus is the same concept as the technical solution of the animation generation method described above, and for details not described in detail in the technical solution of the animation generation apparatus, reference may be made to the description of the technical solution of the animation generation method described above.
There is also provided in an embodiment of the present application a computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the animation generation method when executing the computer instructions.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the animation generation method belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the animation generation method.
An embodiment of the present application further provides a computer readable storage medium, which stores computer instructions, and the computer instructions, when executed by a processor, implement the steps of the animation generation method as described above.
The above is an illustrative scheme of the computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the animation generation method described above, and for details not described in the technical solution of the storage medium, reference may be made to the description of the technical solution of the animation generation method.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for the sake of simplicity, the foregoing method embodiments are described as a series of combined acts, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules involved are not necessarily required by the present application.
In the above embodiments, each embodiment is described with its own emphasis, and for parts not described in detail in a given embodiment, reference may be made to the related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in explaining the application. The alternative embodiments are not described exhaustively and do not limit the application to the precise forms disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, thereby enabling others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.