CN113808237A - Animation generation method and device - Google Patents


Info

Publication number
CN113808237A
CN113808237A
Authority
CN
China
Prior art keywords
animation, event, sub-track, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111113011.7A
Other languages
Chinese (zh)
Other versions
CN113808237B (en)
Inventor
黄锦寿
刘澈
唐磊
方泽华
张智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Kingsoft Online Game Technology Co Ltd
Original Assignee
Zhuhai Kingsoft Online Game Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Kingsoft Online Game Technology Co Ltd filed Critical Zhuhai Kingsoft Online Game Technology Co Ltd
Priority to CN202111113011.7A
Publication of CN113808237A
Application granted
Publication of CN113808237B
Legal status: Active (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract



The present application provides an animation generation method and device. The animation generation method includes: determining an animation object based on a received animation creation instruction; reading animation events pre-created for the animation track corresponding to the animation object; and creating a target animation file based on the animation object, the animation track and the animation events. Dividing the animation into these levels makes animation creation more intuitive and more interpretable, and improves animation creation efficiency.


Description

Animation generation method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to an animation generation method and apparatus.
Background
With the development of animation technology, the design of animation scenes has become increasingly complex. At present, when a complex animation scene is processed, the animation tracks used in the scene are scattered, and the user must actively arrange or group them to reveal the functional relationships between them. This hurts both the efficiency of animation creation and the user's animation creation experience.
Disclosure of Invention
In view of this, embodiments of the present application provide an animation generation method and apparatus, a computing device, and a computer-readable storage medium, so as to solve technical defects in the prior art.
According to a first aspect of embodiments of the present application, there is provided an animation generation method, including:
determining an animation object based on the received animation creating instruction;
reading animation events created in advance for the animation track corresponding to the animation object;
creating a target animation file based on the animation object, the animation track, and the animation event.
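The three steps above describe an object/track/event hierarchy that is serialized into one target animation file. The following Python sketch is purely illustrative; the class names, fields and file layout are hypothetical and not part of the application:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnimationEvent:
    """Makes an animation object perform a function at a given time."""
    name: str
    start_time: float

@dataclass
class AnimationTrack:
    """A timeline attached to an animation object, managing its events."""
    name: str
    events: List[AnimationEvent] = field(default_factory=list)

@dataclass
class AnimationObject:
    """Top of the hierarchy: object -> tracks -> events."""
    name: str
    tracks: List[AnimationTrack] = field(default_factory=list)

def create_target_animation_file(objects: List[AnimationObject]) -> dict:
    """Serialize the object/track/event hierarchy into one target file."""
    return {
        "objects": [
            {
                "name": obj.name,
                "tracks": [
                    {
                        "name": track.name,
                        "events": [
                            {"name": ev.name, "start": ev.start_time}
                            for ev in track.events
                        ],
                    }
                    for track in obj.tracks
                ],
            }
            for obj in objects
        ]
    }
```

Because the file mirrors the hierarchy directly, each level of the animation remains visible and interpretable in the output.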
Optionally, the reading an animation event pre-created for an animation track corresponding to the animation object includes:
reading a basic animation event which is pre-created for the main animation track corresponding to the animation object, and reading an extended animation event which is pre-created for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with a sub-animation track;
accordingly, the creating a target animation file based on the animation object, the animation track, and the animation event comprises:
creating a target animation file based on the animation object, the main animation track, the base animation event, the sub animation track, and the extended animation event.
Optionally, in a case that a preset type of animation resource is configured in advance, before creating a target animation file based on the animation object, the animation track, and the animation event, the method further includes:
taking the animation object, the animation track and the animation event as animation elements;
determining the preset type of target animation resources associated with at least one target animation element in the animation elements, and reading resource information of the target animation resources;
updating the at least one target animation element through the resource information according to the association between the target animation resource and the at least one target animation element;
updating the animation object, the animation track and the animation event based on the updated at least one target animation element;
and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
Optionally, the at least one target animation element is determined by:
and selecting an animation element associated with the preset type of animation resource from the animation object, the animation track and the animation event as at least one target animation element.
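The resource-update steps above amount to patching each associated animation element with the information read from its target animation resource, while leaving unassociated elements unchanged. A minimal illustrative sketch (the dictionary layout and function name are assumptions, not part of the application):

```python
def update_elements_with_resources(elements, resources, associations):
    """Update each target animation element with the info of its associated
    animation resource.

    elements:     list of dicts, each with a unique "name"
    resources:    resource name -> resource info (dict)
    associations: element name -> associated resource name
    Elements without an association are returned unchanged.
    """
    updated = []
    for element in elements:
        resource_name = associations.get(element["name"])
        if resource_name is not None:
            # attach the resource information read for the target resource
            element = {**element, "resource_info": resources[resource_name]}
        updated.append(element)
    return updated
```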
Optionally, in a case where a prefab resource is configured in advance, before creating a target animation file based on the animation object, the animation track, and the animation event, the method further includes:
taking the animation object, the animation track and the animation event as animation elements;
determining a target prefab resource associated with at least one target animation element among the animation elements, and determining a resource path of the target prefab resource;
updating the at least one target animation element through the resource path according to the association between the target prefab resource and the at least one target animation element;
updating the animation object, the animation track, and the animation event based on the updated at least one target animation element;
and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
Optionally, any animation sub-event in the animation events is queried by:
receiving an event query instruction submitted by a user in an event operation area;
traversing the event quadtree corresponding to the event operation area, and comparing the position information in the event query instruction with the event operation sub-region corresponding to each node in the event quadtree;
the event operation sub-region is the position of an operation sub-region corresponding to an animation sub-event in the animation event;
and determining and displaying the target animation sub-event in the animation sub-events according to the comparison result.
Optionally, the event quadtree is updated as follows:
after any animation sub-event in the animation events is created, determining a corresponding target node of an event operation sub-region of the any animation sub-event in the event quadtree;
and updating the target node according to the event operation sub-region of any animation sub-event to obtain an updated event quadtree.
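The query and update steps above describe a standard point lookup and insertion in a quadtree built over the event operation area. The sketch below is one possible illustrative implementation (the class layout, capacity limit and depth limit are assumptions, not details from the application):

```python
class EventQuadtree:
    """Quadtree over the event operation area. Each node covers a rectangle;
    leaves store the event operation sub-regions (rect, sub-event name) of
    animation sub-events, and a point query returns the sub-events whose
    sub-region contains the queried position."""

    MAX_ITEMS = 4   # split a leaf once it holds more than this
    MAX_DEPTH = 8   # stop splitting past this depth

    def __init__(self, x, y, w, h, depth=0):
        self.bounds = (x, y, w, h)
        self.depth = depth
        self.items = []       # list of ((x, y, w, h), sub_event_name)
        self.children = None  # four child quadrants, or None for a leaf

    @staticmethod
    def _contains(rect, px, py):
        x, y, w, h = rect
        return x <= px < x + w and y <= py < y + h

    def insert(self, rect, name):
        """Register a sub-event's operation sub-region after it is created."""
        if self.children is not None:
            rx, ry, rw, rh = rect
            for child in self.children:
                cx, cy, cw, ch = child.bounds
                # push the rect down into every overlapping quadrant
                if rx < cx + cw and rx + rw > cx and ry < cy + ch and ry + rh > cy:
                    child.insert(rect, name)
            return
        self.items.append((rect, name))
        if len(self.items) > self.MAX_ITEMS and self.depth < self.MAX_DEPTH:
            self._split()

    def _split(self):
        x, y, w, h = self.bounds
        hw, hh = w / 2, h / 2
        d = self.depth + 1
        self.children = [
            EventQuadtree(x, y, hw, hh, d),
            EventQuadtree(x + hw, y, hw, hh, d),
            EventQuadtree(x, y + hh, hw, hh, d),
            EventQuadtree(x + hw, y + hh, hw, hh, d),
        ]
        items, self.items = self.items, []
        for rect, name in items:
            self.insert(rect, name)

    def query(self, px, py):
        """Return names of sub-events whose sub-region contains (px, py)."""
        node = self
        while node.children is not None:
            for child in node.children:
                if self._contains(child.bounds, px, py):
                    node = child
                    break
            else:
                return []  # point lies outside the operation area
        return [name for rect, name in node.items if self._contains(rect, px, py)]
```

Descending the tree narrows the comparison to one quadrant per level, so a click in the event operation area only has to be compared against the handful of sub-regions stored in one leaf.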
Optionally, any one of the animation objects is created by:
in response to a received object creating instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type;
and creating an animation object corresponding to the target animation object type based on the target animation object type selected from the animation object type list.
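The creation flow above is a selection against a preset type list. An illustrative sketch only; the type strings and function name are assumptions:

```python
# Preset type list shown to the user in response to an object creation
# instruction (the string values are illustrative).
ANIMATION_OBJECT_TYPES = ["director", "camera", "actor", "special effect", "light"]

def create_animation_object(target_type: str, name: str) -> dict:
    """Create an animation object of the type selected from the list."""
    if target_type not in ANIMATION_OBJECT_TYPES:
        raise ValueError(f"unknown animation object type: {target_type!r}")
    return {"type": target_type, "name": name, "tracks": []}
```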
Optionally, any one of the sub animation tracks is created by:
receiving a sub-track creating instruction aiming at any one of the animation objects;
in response to the sub-track creating instruction, displaying a sub-track type list associated with the object attribute of the any one animation object;
and creating a sub animation track corresponding to the target sub track type based on the target sub track type selected from the sub track type list.
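The sub-track creation flow can be sketched the same way, with the candidate type list keyed by the object's attributes. In the sketch below, only the actor entries (action, skeletal action, facial animation) are named in the application text; the camera entry is a made-up placeholder:

```python
# Hypothetical mapping from object type to its allowed sub-track types.
SUB_TRACK_TYPES = {
    "actor": ["action", "skeletal action", "facial animation"],
    "camera": ["view"],  # placeholder entry, not from the application
}

def create_sub_track(obj: dict, target_type: str) -> dict:
    """Create a sub-animation track of the selected type for an object,
    allowing only the types associated with the object's attributes."""
    allowed = SUB_TRACK_TYPES.get(obj["type"], [])
    if target_type not in allowed:
        raise ValueError(f"type {target_type!r} not valid for {obj['type']!r}")
    track = {"type": target_type, "events": []}
    obj.setdefault("sub_tracks", []).append(track)
    return track
```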
Optionally, the target animation object is determined by:
and selecting, from among the animation objects, an animation object associated with a sub-animation track and determining it as a target animation object.
Optionally, any one of the basic animation sub-events is created by:
for any one animation object among the animation objects, receiving a basic sub-event creation instruction for the main animation track corresponding to that animation object;
in response to the basic sub-event creation instruction, displaying a basic sub-event type list corresponding to the main animation track of that animation object;
and creating a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected from the basic sub-event type list.
Optionally, any one of the extended animation sub-events is created by:
aiming at any one of the target animation objects, receiving an extended sub-event creating instruction aiming at a sub-animation track associated with the any one target animation object;
responding to the extended sub-event creating instruction, and displaying a corresponding extended sub-event type list of a sub-animation track associated with any one target animation object;
and creating an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected from the extended sub-event type list.
According to a second aspect of embodiments of the present application, there is provided an animation generation apparatus including:
a determination module configured to determine an animation object based on the received animation creation instruction;
the reading module is configured to read animation events which are created in advance aiming at the animation tracks corresponding to the animation objects;
a creation module configured to create a target animation file based on the animation object, the animation track, and the animation event.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the animation generation method when executing the computer instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the animation generation method.
The animation generation method provided in the embodiments of the present application determines the animation object to be created based on the received animation creation instruction, and reads the animation events created in advance for the animation track corresponding to the animation object; the target animation file is then created from the animation object, the animation track and the animation event according to their hierarchical structure. Dividing the animation into these levels makes the creation process more intuitive and more interpretable, and improves animation creation efficiency.
Drawings
FIG. 1 is a block diagram of a computing device provided by an embodiment of the present application;
FIG. 2 is a flow chart of an animation generation method provided by an embodiment of the present application;
FIG. 3 is a schematic interface diagram of an animation editor in an animation generation method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an event operation area in an animation generation method according to an embodiment of the present application;
FIG. 5 is a flowchart of an animation generation method applied to an animation editor according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an animation generation apparatus according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from its spirit; the application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application is intended to encompass any and all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the present application to describe various information, the information should not be limited by these terms, which serve only to distinguish one type of information from another. For example, without departing from the scope of one or more embodiments of the present application, a first aspect may be termed a second aspect and, similarly, a second aspect may be termed a first aspect. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
In the present application, an animation generation method and apparatus, a computing device, and a computer-readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
Computing device 100 also includes an access device 140 that enables computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 140 may include one or more network interfaces of any type (e.g., a Network Interface Card (NIC)), wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100 and other components not shown in fig. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
Wherein the processor 120 may perform the steps of the animation generation method shown in fig. 2. Fig. 2 shows a flowchart of an animation generation method according to an embodiment of the present application, which specifically includes the following steps:
step 202: based on the received animation creation instruction, an animation object is determined.
An animation creation instruction is an instruction for creating a playable animation file; an animation object can be understood as an animation resource such as an animation character, a camera, or an animation prop. In practical applications, when the animation editor receives an animation creation instruction, it needs to determine the animation data used to create the animation file and then create the file based on that data. The animation editor may be a standalone animation editor, or a plug-in editor hosted inside another animation editor; this is not limited herein.
In specific implementation, in order to optimize the process of creating the animation by the user and facilitate the user to manage animation elements (including animation objects) in the animation, the animation object may be created first, and then other animation elements that affect the animation object may be created based on the created animation object, so as to perform classification management on various animation elements based on the animation object.
Therefore, how animation objects are created has a significant influence on the animation creation process: if the created animation objects classify the animation elements well and meet the creation requirements, the user's animation creation efficiency can be greatly improved. In a specific implementation, the animation resources that are central to, or most frequently used in, animation creation can first be classified by function or category, and animation objects can then be created based on the resulting animation object types:
In response to a received object creating instruction, displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type;
and creating an animation object corresponding to the target animation object type based on the target animation object type selected from the animation object type list.
In practical applications, after a user submits an object creation instruction, the animation object types that can be created are presented to the user in an animation object type list. Specifically, the list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type and a light type. Further, a director-type animation object controls the animation as a whole, such as shot switching and playback speed; a camera-type animation object determines the viewing angle and display range of the played animation; an actor-type animation object is a virtual object in the animation; a special-effect-type animation object determines the special effects in the animation; and a light-type animation object determines the lighting effects. In addition, the animation object type list may further include a scene type for setting the scene, and so on, which is not limited herein.
Based on this, the user can select the animation object type (i.e. the target animation object type) that he wants to create from the list of animation object types, and then the animation object can be created based on the target animation object type.
In a specific implementation, the main animation object types are abstracted in the animation editor, so that during editing, the animation objects required by a given animation scene can be created from those types.
For example: the animation editor receives an object creation instruction submitted by user U and, in response, displays the animation object type list preset in the editor, which comprises a director type, a camera type, an actor type, a special effect type and a light type. The editor then receives the actor type selected by user U in the animation object type list as the target animation object type and creates an actor-type animation object: actor 1.
In summary, in response to the received object creation instruction, the animation object type list is displayed, and the animation object corresponding to the target animation object type is created based on the type selected from the list. Animation objects are thus created from pre-divided animation object types, the required type can be chosen flexibly from among multiple types, and the flexibility of creating animation objects is increased.
Step 204: reading animation events created in advance for the animation track corresponding to the animation object. Specifically, once the animation objects are determined, and because the animation editor manages the other animation elements hierarchically under the animation objects (both to keep animation objects from affecting one another during editing and to prevent mismanaged animation elements from making creation harder), the animation file can be created by reading the corresponding animation elements according to these hierarchical relationships.
The animation track refers to a time axis set for an animation object, and can be used for managing animation events of the animation object. The animation event refers to a set configured in advance to cause an animation object to execute a specific animation function at a specific time.
In a specific implementation, because the animation sub-events in an animation event set are of many types, interleaving them is not conducive to animation creation or to managing the animation elements. To further improve creation efficiency and ease of management, the animation sub-events can be classified further. The embodiments of the present application do this as follows: reading a basic animation event created in advance for the main animation track corresponding to the animation object, and reading an extended animation event created in advance for a sub-animation track associated with a target animation object, where the target animation object is an animation object, among the animation objects, associated with a sub-animation track. Specifically, the animation track is divided into two track types, a main animation track and a sub-animation track, and the animation event is divided into two event sets, basic animation events and extended animation events, where each basic animation sub-event and each extended animation sub-event is an animation sub-event.
In practical applications, each animation object corresponds to a main animation track on which multiple basic animation sub-events can be created or configured; these basic animation sub-events form the basic animation event. In addition, when the basic animation event cannot meet the configuration requirements of the animation object, at least one associated sub-animation track can be created or added for the animation object, so that at least one corresponding extended animation sub-event is created or configured on each sub-animation track; these extended animation sub-events form the extended animation event.
The main animation track is the basic timeline set for an animation object to manage its basic animation events, i.e., to manage when the basic animation events of the animation object occur during playback. Accordingly, a basic animation sub-event in a basic animation event causes the animation object to perform a basic animation function such as binding an object, binding an actor, or following. A sub-animation track is an extended timeline set for the animation object to manage its extended animation events during playback. There may be one or more sub-animation tracks, which is not limited herein. Accordingly, an extended animation sub-event in an extended animation event causes the animation object to perform an extended animation function such as an action, a skeletal action, or a facial animation. In practical applications, which animation sub-events are basic and which are extended is set in advance in the animation editor according to the types of the animation sub-events.
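The division described above (one main animation track carrying basic sub-events per object, plus optional sub-animation tracks carrying extended sub-events) can be sketched as a read step that gathers both event sets for an object. Illustrative only; the dictionary layout is assumed:

```python
def read_object_events(obj: dict):
    """Gather the events to read for one animation object: basic animation
    sub-events from its main track, and extended sub-events from each of
    its associated sub-animation tracks (if any)."""
    basic_events = list(obj["main_track"]["events"])
    extended_events = [
        ev for sub in obj.get("sub_tracks", []) for ev in sub["events"]
    ]
    return basic_events, extended_events
```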
In a specific implementation, considering that extended animation sub-events are also of many types, and to avoid managing multiple types of extended animation sub-events on one sub-animation track (which would affect editing efficiency), the sub-animation track types can be set in advance according to the extended animation sub-event types corresponding to each animation object type, and sub-animation tracks can then be created according to those types. In the embodiments of the present application, any sub-animation track is created as follows:
receiving a sub-track creating instruction aiming at any one of the animation objects;
in response to the sub-track creating instruction, displaying a sub-track type list associated with the object attribute of the any one animation object;
and creating a sub animation track corresponding to the target sub track type based on the target sub track type selected from the sub track type list.
A sub-track creation instruction is an instruction for creating a sub-animation track. In practical applications, a sub-animation track exists in dependence on an animation object and manages that object's extended animation events. Thus, a sub-track creation instruction can be submitted for the animation object on which an extended animation sub-event is to be configured. The purpose of creating the sub-animation track is to create extended animation sub-events on it and manage them through the sub-animation track.
Further, each animation object can correspond to multiple types of animation implementations, and these different types are generally related to the object attributes of the animation object (such as its functions or its parts). Therefore, a plurality of sub-animation track types can be divided in advance according to the object attributes of each animation object and formed into a sub-track type list, so that extended animation events of the corresponding type are managed on the matching type of sub-animation track. When a sub-animation track is created for any animation object, the sub-track type list associated with that object's attributes can be displayed; a sub-animation track corresponding to the target sub-track type selected by the user in the list is then created and taken as a sub-animation track associated with that animation object.
Continuing the example above, on the basis of the created animation object actor 1: the editor receives a sub-track creation instruction submitted by user U for actor 1 and, in response, displays the preset sub-track type list associated with the functions and parts of actor 1, including an action type, a facial action type, and so on. The editor then receives the action type selected by user U in the list as the target sub-track type and creates an action-type sub-animation track; the created action track serves as a sub-animation track associated with the animation object actor 1.
In summary, a sub-track creation instruction for any one of the plurality of animation objects is received; in response, a sub-track type list associated with the object properties of that animation object is presented, and a sub-animation track is created based on the target sub-track type selected from the list. This divides sub-animation tracks by type, allows the required type to be selected flexibly from multiple sub-track types, and increases the flexibility of sub-track creation.
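The type-driven sub-track creation flow described above can be sketched as follows. This is a minimal, illustrative Python sketch; the class names, the type table, and the `create_sub_track` helper are assumptions for illustration, not part of the patent:

```python
# Hypothetical sketch: sub-track types are divided in advance per object kind,
# and a sub-track is created from the type selected in the presented list.
SUB_TRACK_TYPES = {
    "actor": ["action", "facial_action"],   # derived from object properties
    "camera": ["movement", "display"],
}

class SubTrack:
    def __init__(self, track_type):
        self.track_type = track_type
        self.extended_events = []           # extended animation sub-events live here

class AnimationObject:
    def __init__(self, name, kind):
        self.name, self.kind = name, kind
        self.sub_tracks = []

def list_sub_track_types(obj):
    """Show the sub-track type list associated with the object's properties."""
    return SUB_TRACK_TYPES[obj.kind]

def create_sub_track(obj, target_type):
    """Create a sub-track of the selected type and associate it with the object."""
    if target_type not in list_sub_track_types(obj):
        raise ValueError(f"{target_type!r} is not a valid sub-track type for {obj.kind}")
    track = SubTrack(target_type)
    obj.sub_tracks.append(track)
    return track

actor1 = AnimationObject("actor 1", "actor")
action_track = create_sub_track(actor1, "action")
```

The lookup table stands in for the "sub-track type list associated with the object properties"; a real editor would populate it from the object's configured functions and parts.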
In practical applications, some animation objects may have associated sub-animation tracks while others have none. To read the extended animation events created in advance on sub-animation tracks, it is therefore necessary to first determine which animation objects are associated with sub-animation tracks (i.e., the target animation objects).
In a specific implementation, different basic functions can be implemented for each created animation object. To prevent different basic functions from interfering with one another, the basic functions of an animation object can be divided in advance into different basic sub-event types, and basic animation sub-events are then created according to those types.
For any one of the plurality of animation objects, a basic sub-event creation instruction for the main animation track corresponding to that animation object is received;

in response to the basic sub-event creation instruction, a basic sub-event type list corresponding to the main animation track of that animation object is displayed;

and a basic animation sub-event corresponding to the target basic sub-event type is created based on the target basic sub-event type selected from the basic sub-event type list.
The basic sub-event creation instruction is an instruction for adding a basic animation sub-event. In practice, every basic animation sub-event has a time interval during which it executes, composed of an event start time and an event end time; this interval is represented on the animation timeline. Because a basic animation sub-event implements a basic function of an animation object, it is created on the basic timeline (the main animation track) corresponding to that object, and the basic sub-event creation instruction must therefore be submitted for that main animation track.
Further, since the basic animation sub-events corresponding to each animation object come in various event types, and different types may require different properties to be set, the basic animation sub-events are divided into at least one basic sub-event type per animation object to avoid mutual interference, and these types form a basic sub-event type list. When a basic animation sub-event is created on the main animation track of any animation object, the basic sub-event type list corresponding to that track can be displayed, and a basic animation sub-event corresponding to the target basic sub-event type selected by the user in the list is then created. It should be noted that at least one basic animation sub-event may be created on a main animation track; the set of these basic animation sub-events is referred to as a basic animation event.
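A sub-event with its execution interval can be modeled minimally as below. This is an illustrative Python sketch; the field and method names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class AnimationSubEvent:
    """One sub-event on a track: a typed action over [start, end] on the timeline."""
    event_type: str        # e.g. "bind_object", selected from the type list
    start: float           # event start time on the animation timeline
    end: float             # event end time
    properties: dict = field(default_factory=dict)

    def active_at(self, t):
        """True while the event's time interval covers timeline position t."""
        return self.start <= t <= self.end

# The set of sub-events created on a main animation track forms the basic animation event.
bind_a = AnimationSubEvent("bind_object", start=1.0, end=4.0,
                           properties={"object": "A"})
```

The `active_at` check mirrors the statement that each sub-event acts only during the interval between its event start time and event end time.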
Following the above example: creating an animation object as described above: on the basis of the actor 1, receiving a basic sub-event creating instruction submitted by a user U in an animation editor aiming at an active drawing track corresponding to the actor 1, and displaying a basic sub-event type list corresponding to the active drawing track of the actor 1 in the animation editor in response to the basic sub-event creating instruction, wherein the basic sub-event type list comprises: bound object type, bound actor type, follow type. Receiving a binding object type selected by a user U in the basic sub-event type list as a target basic sub-event type, and creating a basic animation sub-event of the binding object type: the binding object a specifically indicates that the object a is displayed in a scene in a time interval corresponding to the event.
In summary, for any one of the plurality of animation objects, a basic sub-event creation instruction for the main animation track corresponding to that object is received, and in response, the corresponding basic animation sub-event is created from the target basic sub-event type selected in the basic sub-event type list. This divides an object's basic animation sub-events by type, allows the required basic sub-event type to be selected flexibly from multiple types, and increases the flexibility of basic sub-event creation.
Similarly, different extended functions can be implemented for each created animation object. To prevent different extended functions from interfering with one another, the extended functions of an animation object can be divided in advance into different extended sub-event types, and extended animation sub-events are then created according to those types.
For any one of the target animation objects, an extended sub-event creation instruction for the sub-animation track associated with that target animation object is received;

in response to the extended sub-event creation instruction, the extended sub-event type list corresponding to the sub-animation track associated with that target animation object is displayed;

and an extended animation sub-event corresponding to the target extended sub-event type is created based on the target extended sub-event type selected from the extended sub-event type list.
The extended sub-event creation instruction is an instruction for adding an extended animation sub-event. In practice, every extended animation sub-event likewise has an execution time interval composed of an event start time and an event end time, also represented on the animation timeline. Because an extended animation sub-event implements an extended function of an animation object, it is created on the extended timeline (i.e., the sub-animation track) associated with that object, and the extended sub-event creation instruction must therefore be submitted for that sub-animation track.
Further, since extended animation sub-events come in various event types, and different types may require different properties to be set, the extended animation sub-events are divided into at least one extended sub-event type per animation object to avoid mutual interference, and these types form an extended sub-event type list. When an extended animation sub-event is created on the sub-animation track associated with any animation object, the extended sub-event type list corresponding to that track can be displayed, and an extended animation sub-event corresponding to the target extended sub-event type selected by the user in the list is then created. It should be noted that at least one extended animation sub-event may also be created on a sub-animation track; the set of these extended animation sub-events is referred to as an extended animation event.
Following the above example: creating an animation object as described above: on the basis of the actor 1, receiving an extension sub-event creating instruction submitted by a user U in an animation editor aiming at an action track associated with the actor 1, and displaying an extension sub-event type list corresponding to the action track of the actor 1 in the animation editor in response to the extension sub-event creating instruction, wherein the extension sub-event type list comprises: leg action type, hand action type, head action type. Receiving a leg action type selected by a user U in the extended sub-event type list as a target extended sub-event type, and creating an extended animation sub-event of the leg action type: a kicking event. The kicking sub-event indicates that the actor 1 is controlled to perform a kicking action in a time interval corresponding to the event.
As specifically shown in fig. 3, the animation object actor 1 has been created in the animation editor, and the animated image of actor 1 is displayed in the animation display area of the editor. Actor 1 has a corresponding main animation track and a sub-animation track; a basic animation sub-event, binding object A, has been created on the main animation track of actor 1, and an extended animation sub-event, the kick sub-event, has been created on the sub-animation track associated with actor 1.
In summary, for any one of the plurality of target animation objects, an extended sub-event creation instruction for the sub-animation track associated with that object is received, and in response, the corresponding extended animation sub-event is created from the target extended sub-event type selected in the extended sub-event type list. This divides an object's extended animation sub-events by type, allows the required extended sub-event type to be selected flexibly from multiple types, and increases the flexibility of extended sub-event creation.
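Putting the pieces together, the object/track/event hierarchy built so far might look like the structure below. This is an illustrative Python sketch mirroring the actor 1 example; the dictionary keys and the helper name are assumptions:

```python
# Hypothetical hierarchy: object -> main track (basic event) + sub-tracks (extended events).
animation = {
    "actor 1": {
        "main_track": {                      # basic animation event
            "sub_events": [
                {"type": "bind_object", "target": "A", "start": 0.0, "end": 2.0},
            ],
        },
        "sub_tracks": {                      # extended animation events, by track type
            "action": {
                "sub_events": [
                    {"type": "leg_action", "name": "kick", "start": 1.0, "end": 1.5},
                ],
            },
        },
    },
}

def extended_events(doc, obj_name):
    """Collect all extended animation sub-events across an object's sub-tracks."""
    tracks = doc[obj_name]["sub_tracks"].values()
    return [e for t in tracks for e in t["sub_events"]]
```

Reading the extended animation events of a target animation object then amounts to walking its associated sub-tracks, as `extended_events` does.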
In practical applications, it may be necessary to view a created animation sub-event (a basic or extended animation sub-event). When editing animation sub-events in the animation editor, linearly traversing the event operation sub-region of every animation sub-event to find the one matching the user's click takes a long time. The event operation sub-regions can instead be organized and traversed as a quadtree, and the embodiment of the application further includes:
receiving an event query instruction submitted by a user in an event operation area;
traversing an event quadtree corresponding to the event operation area, and comparing the position information in the event query instruction with the event operation sub-regions corresponding to nodes in the event quadtree,

where an event operation sub-region is the position of the operation sub-region corresponding to one animation sub-event of the animation event;

and determining and displaying the target animation sub-event among the animation sub-events according to the comparison result.
The event operation area is the interface area of the animation editor in which at least one animation sub-event can be operated on. An event query instruction is an instruction for querying any animation sub-event (basic or extended); an event operation sub-region is the operation sub-region within the event operation area that corresponds to one created animation sub-event, and can also be understood as an event bounding box.
In practice, to make created animation sub-events queryable, each one is usually mapped into the editing interface of the animation editor as an event operation sub-region, so that a user submits an event query instruction for a sub-event by operating within its event operation sub-region. Because the animation sub-event the user wants to view is determined from the user's operation position, the position information carried in the event query instruction must be compared against the event operation sub-regions within the event operation area to find the matching sub-region; the animation sub-event corresponding to the matched sub-region is then taken as the animation sub-event targeted by the query instruction.
In a specific implementation, the event quadtree, which is built from the event operation sub-regions of the created animation sub-events, is traversed: the position information carried in the event query instruction (for example, the click position carried in an instruction submitted by clicking) is compared with the event operation sub-regions corresponding to the leaf nodes of the event quadtree to determine the sub-region containing that position. The animation sub-event corresponding to that sub-region is taken as the target animation sub-event (a target basic or target extended animation sub-event) and displayed, specifically through an event panel showing the target animation sub-event.
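The quadtree point query can be sketched as follows. This is a minimal, illustrative Python implementation; the node layout and names are assumptions, and items are added directly for brevity:

```python
class QuadNode:
    """A quadtree node over a rectangular region of the event operation area."""

    def __init__(self, x, y, w, h):
        self.bounds = (x, y, w, h)
        self.items = []          # (rect, sub_event) pairs held at this node
        self.children = None     # four QuadNode quadrants once split

    @staticmethod
    def _contains(rect, px, py):
        rx, ry, rw, rh = rect
        return rx <= px <= rx + rw and ry <= py <= ry + rh

    def query(self, px, py):
        """Return the sub-event whose event operation sub-region contains (px, py)."""
        for rect, event in self.items:
            if self._contains(rect, px, py):
                return event
        if self.children:
            for child in self.children:
                x, y, w, h = child.bounds
                if x <= px <= x + w and y <= py <= y + h:
                    return child.query(px, py)   # descend only into the matching quadrant
        return None

root = QuadNode(0, 0, 100, 100)
root.items.append(((10, 10, 20, 10), "nod sub-event"))
root.items.append(((40, 10, 20, 10), "kick sub-event"))
```

Instead of scanning every sub-region linearly, the traversal descends only into the quadrant containing the click position, which is what makes the query faster for many sub-events.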
Further, the user can edit the queried animation sub-event in the displayed event panel; after editing is completed, the animation editor updates the queried sub-event based on the event information in the panel.
Following the above example, assume that two basic animation sub-events and three extended animation sub-events have been created for actor 1. The two basic animation sub-events are binding object A and following object B; the three extended animation sub-events are a kick sub-event, a nod sub-event, and a hand-swing sub-event. Specifically, as shown in fig. 4, in the event operation area of the animation editor, binding object A corresponds to event operation sub-region cr1 and following object B to cr2; the kick sub-event corresponds to cr3, the nod sub-event to cr4, and the hand-swing sub-event to cr5. The animation sub-events are displayed in the event operation area in the form of these event operation sub-regions. An event query instruction submitted by user U in the event operation area is received; the event quadtree created in advance for the event operation area and the event operation sub-regions of the 5 animation sub-events is traversed; the position information d1 carried in the instruction is compared with the 5 event operation sub-regions corresponding to nodes of the quadtree; position d1 is determined to match event operation sub-region cr4; and the event panel corresponding to the nod sub-event of cr4 is displayed.
In conclusion, the animation sub-event to be viewed is located by traversing the quadtree, which improves the efficiency of querying the animation sub-event corresponding to an event query instruction.
In a specific implementation, every created animation sub-event needs to be included in the event quadtree. To avoid rebuilding the event quadtree from scratch each time a new animation sub-event is created, the existing quadtree can instead be updated with the event operation sub-region of each newly created sub-event. In the embodiment of the present application, the event quadtree is updated as follows:
after any animation sub-event of the animation event is created, determining the target node corresponding to the event operation sub-region of that animation sub-event in the event quadtree;

and updating the target node with the event operation sub-region of that animation sub-event to obtain an updated event quadtree.
In practice, each leaf node of the event quadtree corresponds to a node region within the event operation area and to the event operation sub-regions contained in that node region. It is therefore necessary to determine the node region in which the event operation sub-region of the newly created animation sub-event (basic or extended) falls, and then the target node corresponding to that node region in the event quadtree. The quadtree is updated by adding the event operation sub-region of the newly created sub-event to that target node. For example, if the target node originally corresponds to 1 event operation sub-region, then after it is updated with the sub-region of the newly created animation sub-event it corresponds to 2 event operation sub-regions.
Following the above example, assume the event quadtree has 4 leaf nodes: leaf node A corresponds to event operation sub-region cr1, leaf node B to cr3, leaf node C to cr2, and leaf node D to cr4. After the extended animation sub-event, the hand-swing sub-event, is created, its event operation sub-region cr5 is determined to lie in the node region corresponding to leaf node C, so leaf node C is taken as the target node. Leaf node C originally corresponds to event operation sub-region cr2 of following object B; it is updated by adding cr5, after which it corresponds to both cr2 (following object B) and cr5 (the hand-swing sub-event). The updated event quadtree is obtained by updating leaf node C in this way.
In summary, after the event quadtree is first created, it only needs to be updated with the event operation sub-region of each newly created animation sub-event, which indirectly improves the efficiency of event queries.
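The incremental update can be sketched as follows. This is an illustrative Python sketch; the splitting threshold, the leaf-splitting strategy, and the names are assumptions:

```python
class EventQuadTree:
    """Quadtree of event operation sub-regions; leaves split when they overflow."""
    MAX_ITEMS = 2   # assumed split threshold

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.items = []          # (rect, event_name) pairs held at this node
        self.children = None

    def insert(self, rect, event):
        """Add a newly created sub-event's operation sub-region to the tree."""
        if self.children:
            child = self._child_for(rect)
            if child:                       # rect fits entirely inside one quadrant
                child.insert(rect, event)
                return
        self.items.append((rect, event))
        if self.children is None and len(self.items) > self.MAX_ITEMS:
            self._split()

    def _split(self):
        """Split this leaf into four quadrants and redistribute its items."""
        hw, hh = self.w / 2, self.h / 2
        self.children = [EventQuadTree(self.x, self.y, hw, hh),
                         EventQuadTree(self.x + hw, self.y, hw, hh),
                         EventQuadTree(self.x, self.y + hh, hw, hh),
                         EventQuadTree(self.x + hw, self.y + hh, hw, hh)]
        items, self.items = self.items, []
        for rect, event in items:
            self.insert(rect, event)        # re-insert, pushing items into quadrants

    def _child_for(self, rect):
        rx, ry, rw, rh = rect
        for c in self.children:
            if c.x <= rx and c.y <= ry and rx + rw <= c.x + c.w and ry + rh <= c.y + c.h:
                return c
        return None
```

Only the node region containing the new sub-region is touched, so the tree built earlier never has to be reconstructed after each sub-event is created.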
Step 206: and creating a target animation file based on the animation object, the animation track and the animation event.
Specifically, in order for the created target animation file to include all animation elements, a playable animation (i.e., the target animation file) is created based on animation elements such as the animation objects, animation tracks, and animation events.
In a specific implementation, a blank initial animation file can be created, the animation data corresponding to the animation objects, animation tracks, and animation events is read from memory, and that data is added to the initial animation file in a preset format to generate the target animation file. The target animation file can then be played by calling a playback interface.
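This assembly step can be sketched as follows. The sketch is illustrative Python; the JSON layout and the `version` header are assumptions, since the patent only specifies "a preset format":

```python
import json

def create_target_animation_file(objects, tracks, events, path):
    """Serialize the animation elements into a playable target animation file."""
    doc = {"version": 1,                 # assumed format header
           "objects": objects,
           "tracks": tracks,
           "events": events}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(doc, f, ensure_ascii=False, indent=2)
    return doc

doc = create_target_animation_file(
    objects=[{"name": "actor 1", "kind": "actor"}],
    tracks=[{"owner": "actor 1", "type": "main"},
            {"owner": "actor 1", "type": "action"}],
    events=[{"track": "main", "type": "bind_object", "target": "A"}],
    path="target_animation.json")
```

A playback interface would then load this file and interpret each track's events over the timeline.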
On the basis that the animation tracks are divided into main animation tracks and sub-animation tracks, and the animation events into basic animation events and extended animation events, the target animation file is created from the animation objects, main animation tracks, basic animation events, sub-animation tracks, and extended animation events, so that it includes every animation element created during editing. The resulting target animation file has a clear hierarchy and is easier to interpret. In addition, the process of creating an animation in the animation editor may reference animation resources of preset types, for example material resources, map resources, and animation clip resources imported or configured in advance. If only the reference relations to these resources are saved when creating the animation file, the file is hard to migrate; the resource information of the referenced original resources can therefore be saved directly into the animation file. This is specifically implemented by the following steps:
taking the animation objects, animation tracks, and animation events as animation elements;

determining the preset-type target animation resources associated with at least one target animation element among the animation elements, and reading the resource information of those target animation resources;

updating the at least one target animation element with the resource information according to the association relation between the target animation resources and the at least one target animation element;

updating the animation objects, animation tracks, and animation events based on the updated at least one target animation element;

and taking the updated animation objects, animation tracks, and animation events as the animation objects, animation tracks, and animation events.
The preset type is a preconfigured resource type for animation resources. In practice, resource types that are relatively independent of the animation editor can be set as preset types; reading the resource information of such resources is relatively simple, so it can be read directly.
In a specific implementation, the created animation objects, animation tracks, and animation events are taken as animation elements, any of which may reference animation resources. The animation elements associated with preset-type animation resources are therefore selected from among the animation objects, animation tracks, and animation events as the at least one target animation element, so that the preset-type animation resources they are associated with (reference or depend on), i.e., the target animation resources, can be identified and their resource information read.
The target animation elements are then updated with the read resource information according to the association relation (reference or dependency) between the target animation resources and the at least one target animation element. The corresponding animation elements among the animation objects, animation tracks, and animation events are updated based on the updated target animation elements, and the target animation file is created from the resulting updated animation elements.
Following the above example, an animation creation instruction submitted by user U is received, and in response all animation objects created in the animation editor are determined: actor 1, director 1, and camera 1. The basic animation events created in advance on the main animation tracks of these three objects are read, including binding object A (a basic animation sub-event). Among the three objects, the one associated with a sub-animation track, actor 1, is selected as the target animation object, and the extended animation event created in advance on the action track (sub-animation track) associated with actor 1 is read, including the kick sub-event (an extended animation sub-event).
The three animation objects, their main animation tracks, the action track associated with actor 1, binding object A, and the kick sub-event are taken as animation elements. With the map type and the material type preset as animation resource types, the target animation element associated with a preset-type animation resource is determined to be binding object A, whose configured (associated) preset-type target animation resource is the file A. The file information contained in file A is read, and binding object A is updated with it to obtain the updated binding object A.
In summary, by reading the resource information of the target animation resources associated with the target animation elements and updating those elements with it, the created target animation file directly contains the resource information of the animation resources, which makes the target animation file easy to migrate.
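Embedding the resource information into the referencing element can be sketched like this. Illustrative Python; the field names, the preset-type set, and the example data are assumptions:

```python
PRESET_TYPES = {"map", "material"}   # assumed preset resource types

def embed_resources(elements, resources):
    """Replace preset-type resource references with the resources' own info,
    so the animation file no longer depends on the editor's resource store."""
    for element in elements:
        ref = element.get("resource_ref")
        if ref is None:
            continue
        resource = resources[ref]
        if resource["type"] in PRESET_TYPES:
            # copy the resource info directly into the referencing element
            element["resource_info"] = dict(resource["info"])
            del element["resource_ref"]
    return elements

elements = [{"name": "binding object A", "resource_ref": "file A"}]
resources = {"file A": {"type": "map", "info": {"bytes": 1024, "format": "png"}}}
embed_resources(elements, resources)
```

After this pass, serializing `elements` into the target animation file carries the resource information along, which is what makes the file portable.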
In addition, some animation elements reference prefab resources. Because prefab resources are created in advance by an animation editor and the information stored in them must be parsed by that editor, parsing them would take too long and consume too many resources; for such prefab resources, the resource path can be saved instead. This is specifically implemented in the embodiment of the present application as follows:
taking the animation objects, animation tracks, and animation events as animation elements;

determining the target prefab resources associated with at least one prefab animation element among the animation elements, and determining the resource paths of those target prefab resources;

updating the at least one prefab animation element with the resource path according to the association relation between the target prefab resources and the at least one prefab animation element;

updating the animation objects, animation tracks, and animation events based on the updated at least one prefab animation element;

and taking the updated animation objects, animation tracks, and animation events as the animation objects, animation tracks, and animation events.
Because prefab resources depend heavily on the animation editor, reading the resource information inside them is difficult, especially when the animation editor is a plug-in editor and the prefab resources were pre-created by the original editor to which the plug-in corresponds; in that case the resource information is hard to parse, so the resource paths of the prefab resources are determined directly instead.
In a specific implementation, the created animation objects, main animation tracks, basic animation events, sub-animation tracks, and extended animation events are taken as animation elements, any of which may also reference a prefab resource. The animation elements associated with prefab resources are therefore selected from among them as the at least one prefab animation element, so that the resource paths of the target prefab resources they are associated with (reference or depend on) can be determined.
The prefab animation elements are then updated with the determined resource paths according to the association relation (reference or dependency) between the target prefab resources and the at least one prefab animation element. The corresponding animation elements among the animation objects, animation tracks, and animation events are updated based on the updated prefab animation elements, and the target animation file is created from the resulting updated animation elements.
Following the above example, the three animation objects, their main animation tracks, the action track associated with actor 1, binding object A, and the kick sub-event are taken as animation elements, and the prefab animation element associated with a prefab resource is determined to be actor 1. The target prefab resource configured in (associated with) actor 1, file B, is determined, together with its resource path p1, and actor 1 is updated based on resource path p1 to obtain the updated actor 1.
In conclusion, by determining the resource path of the target prefab resource associated with a prefab animation element and updating the element with that path, the created target animation file directly contains the resource paths of the prefab resources, which preserves the generation efficiency of the target animation file.
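The path-only strategy for editor-bound prefabs can be sketched as below, as the counterpart to embedding readable resource info. Illustrative Python; the field names and example data are assumptions:

```python
def attach_prefab_paths(elements, prefabs):
    """For elements that reference prefab resources, store only the resource path,
    since the prefab's contents can only be parsed by the (original) editor."""
    for element in elements:
        ref = element.get("prefab_ref")
        if ref is not None:
            # save the path instead of parsing the prefab's internals
            element["prefab_path"] = prefabs[ref]["path"]
            del element["prefab_ref"]
    return elements

elements = [{"name": "actor 1", "prefab_ref": "file B"}]
prefabs = {"file B": {"path": "p1"}}
attach_prefab_paths(elements, prefabs)
```

Saving just the path avoids the expensive parse at file-generation time; the editor resolves the prefab from the path when the animation is played or re-edited.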
To sum up, in the animation generation method provided by the embodiment of the present application, the created animation objects are determined based on the received animation creation instruction, and the animation events created in advance on the animation tracks corresponding to those objects are read; the target animation file is then created from the animation objects, animation tracks, and animation events according to their hierarchical structure. Because the target animation file follows the hierarchy of tracks and events under each animation object, the animation layers are more intuitive and easier to understand, which improves the user's animation creation efficiency and creation experience.
Fig. 5 shows a processing flow chart of an animation generation method applied to an animation editor. Taking the animation editor as an example, the method specifically includes the following steps:
step 502: and receiving an object creating instruction submitted by a user.
Specifically, the object creation instruction refers to an object creation instruction submitted by a user through operating an object creation control in the animation editor.
Step 504: and responding to the object creating instruction, and displaying an animation object type list, wherein the animation object type list comprises a director type, a camera type, an actor type, a special effect type and a light type.
Specifically, the director type, the camera type, the actor type, the special effect type, and the light type are animation types in the animation object type list.
Step 506: and creating a camera 01 corresponding to the camera type based on the camera type selected by the user in the animation object type list.
Specifically, the camera 01 is the animation object created for the camera type. In practical applications, the created camera 01 directly corresponds to a main animation track.
Step 508: a sub-track creation instruction of the user for the camera 01 is received.
Specifically, the sub-track creation instruction refers to a sub-track creation instruction submitted by a user through an operation performed on a sub-track creation control for the camera 01 in the animation editor.
Step 510: in response to the sub-track creation instruction, a list of sub-track types associated with the functions of the camera 01 is presented.
Specifically, the sub-track type list may include a moving track type, a display track type, and other sub-track types.
Step 512: and creating a moving track 1 corresponding to the moving track type based on the moving track type selected by the user in the sub-track type list.
Step 514: and receiving a basic sub-event creation instruction of the user for the main animation track corresponding to the camera 01.
In practical applications, when the camera 01 is created, its corresponding main animation track can be created automatically.
Step 516: and responding to the basic sub-event creation instruction, and showing a basic sub-event type list corresponding to the main animation track corresponding to the camera 01.
Specifically, the basic sub-event type list includes basic sub-event types such as a lens parameter type and a depth of field type.
Step 518: and creating a lens parameter sub-event corresponding to the lens parameter type based on the lens parameter type selected by the user in the basic sub-event type list.
Step 520: an extended sub-event creation instruction of the user for the movement track 1 associated with the camera 01 is received.
Step 522: and responding to the extended sub-event creating instruction, and showing an extended sub-event type list corresponding to the moving track 1 associated with the camera 01.
Specifically, the extended sub-event type list may include: a displacement type, a straight-line type, a turning type, a shift-displacement type, and other extended sub-event types.
Step 524: and creating a turning sub-event corresponding to the turning type based on the turning type selected by the user in the extended sub-event type list.
Specifically, the turning sub-event is an extended animation sub-event.
Step 526: based on the received animation creation instruction, three animation objects of the camera 01, the director 01, and the actor 01 are determined.
In particular, in addition to the above-described camera 01 created in the animation editor, two animation objects of a director 01 and an actor 01 are created in the animation editor before receiving an animation creation instruction.
Step 528: and reading the lens parameter sub-event pre-created for the main animation track corresponding to the animation object, and reading the turning sub-event pre-created for the moving track 1 associated with the camera 01 among the three animation objects.
Step 530: and determining, among the three animation objects (camera 01, director 01 and actor 01) and their corresponding main animation tracks, the moving track 1, the lens parameter sub-event and the turning sub-event, that the turning sub-event is associated with an animation resource of a preset type, and reading resource information of the target animation resource associated with the turning sub-event.
Specifically, the preset types include: the resource types, such as the material type, the map type, and/or the animation clip type, are not limited herein. The target animation resource is an animation resource of a preset type associated with the turning sub-event.
Step 532: and updating the turning sub-event with the resource information according to the association relationship between the target animation resource and the turning sub-event.
Step 534: and creating a target animation file based on the three animation objects (camera 01, director 01 and actor 01), their corresponding main animation tracks, the moving track 1, the lens parameter sub-event, and the updated turning sub-event.
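The object-track-event hierarchy assembled in steps 526 to 534 can be sketched as follows. The class and function names are illustrative assumptions; the patent does not prescribe any particular data layout for the target animation file.

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    name: str
    events: list = field(default_factory=list)  # sub-events attached to this track

@dataclass
class AnimationObject:
    name: str
    tracks: list = field(default_factory=list)  # main track plus any sub-tracks

def build_target_animation_file(objects):
    """Serialize the object -> track -> event hierarchy into a nested dict,
    one entry per animation object."""
    return {
        obj.name: {track.name: list(track.events) for track in obj.tracks}
        for obj in objects
    }

camera = AnimationObject("camera 01", tracks=[
    Track("main track", events=["lens parameter sub-event"]),
    Track("moving track 1", events=["turning sub-event (updated)"]),
])
director = AnimationObject("director 01", tracks=[Track("main track")])
actor = AnimationObject("actor 01", tracks=[Track("main track")])

target_file = build_target_animation_file([camera, director, actor])
```

Keeping the file keyed by object, then by track, then by event mirrors the hierarchy the user sees in the editor, which is what makes the resulting animation layers easy to inspect.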
To sum up, the animation generation method provided in the embodiment of the present application creates, on the basis of the created animation object, a basic animation sub-event for the main animation track corresponding to the animation object, and an extended animation sub-event for the sub-animation track corresponding to the animation track. Based on the received animation creation instruction, the created animation object, main animation track, basic animation event, sub-animation track, and extended animation event are read according to the hierarchical structure of the animation object, the main animation track, and the basic animation event, and the hierarchical structure of the animation object, the sub-animation track, and the extended animation event, so as to create the target animation file. Dividing the animation into hierarchies in this way makes the animation creation process more intuitive and interpretable, and improves animation creation efficiency.
Corresponding to the above method embodiment, the present application further provides an animation generation apparatus embodiment, and fig. 6 shows a schematic structural diagram of the animation generation apparatus provided in an embodiment of the present application. As shown in fig. 6, the apparatus 600 includes:
a determination module 602 configured to determine an animation object based on the received animation creation instruction;
a reading module 604 configured to read an animation event created in advance for an animation track corresponding to the animation object;
a creation module 606 configured to create a target animation file based on the animation object, the animation track, and the animation event.
Optionally, the reading module 604 is further configured to:
reading a basic animation event created in advance for the main animation track corresponding to the animation object, and reading an extended animation event created in advance for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with a sub-animation track;
accordingly, the creating module 606 is further configured to:
creating a target animation file based on the animation object, the main animation track, the base animation event, the sub animation track, and the extended animation event.
Optionally, in a case where a preset type of animation resource is configured in advance, the animation generation apparatus includes:
a read information module configured to take the animation object, the animation track, and the animation event as animation elements; determining the preset type of target animation resources associated with at least one target animation element in the animation elements, and reading resource information of the target animation resources;
a first updating module configured to update the at least one target animation element with the resource information according to an association relationship between the target animation resource and the at least one target animation element;
a second update module configured to update the animation object, the animation track, and the animation event based on the updated at least one target animation element; and taking the updated animation object, animation track and animation event as the animation object, the animation track and the animation event.
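As a minimal sketch of the read-information and updating modules above, the elements associated with a preset-type resource can be updated with that resource's information as below. The in-memory `RESOURCES` table, the preset-type names, and all field names are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical resource registry: resource id -> {type, info}. The preset
# types follow the examples given in the text (material, map, animation clip).
RESOURCES = {"mat_01": {"type": "material", "info": {"shader": "toon"}}}
PRESET_TYPES = {"material", "map", "animation_clip"}

def update_target_elements(elements):
    """Write the associated resource's information into each animation element
    that references a preset-type resource; leave other elements untouched."""
    for element in elements:
        resource = RESOURCES.get(element.get("resource_id"))
        if resource and resource["type"] in PRESET_TYPES:
            element["resource_info"] = resource["info"]
    return elements

animation_elements = [
    {"name": "turning sub-event", "resource_id": "mat_01"},  # target animation element
    {"name": "camera 01"},                                   # no associated resource
]
update_target_elements(animation_elements)
```

After this pass, the updated elements can replace the original animation object, track, and event before the target animation file is created, matching the second update module's role.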
Optionally, the at least one target animation element is determined by running the following modules:
a selection module configured to select an animation element associated with the preset type of animation resource as at least one target animation element among the animation object, the animation track, and the animation event.
Optionally, in a case where a prefab resource is configured in advance, the animation generation apparatus includes:
a determine path module configured to take the animation object, the animation track, and the animation event as animation elements; determine a target prefab resource associated with at least one prefab animation element among the animation elements, and determine a resource path of the target prefab resource;
a third updating module configured to update the at least one prefab animation element with the resource path according to the association relationship between the target prefab resource and the at least one prefab animation element;
a fourth update module configured to update the animation object, the animation track, and the animation event based on the updated at least one prefab animation element; and take the updated animation object, animation track, and animation event as the animation object, the animation track, and the animation event.
Optionally, any one of the animation sub-events is queried by running the following modules:
the first receiving module is configured to receive an event query instruction submitted by a user in an event operation area;
the comparison module is configured to compare the position information in the event query instruction with the event operation sub-region corresponding to the node in the event quadtree by traversing the event quadtree corresponding to the event operation region; the event operation sub-region is the position of an operation sub-region corresponding to an animation sub-event in the animation event;
and the determining sub-event module is configured to determine and display a target animation sub-event in the animation sub-events according to the comparison result.
Optionally, the event quadtree is updated by running the following modules:
the node determining module is configured to determine a target node corresponding to an event operation sub-region of any animation sub-event in the event quadtree after the animation sub-event is created;
and the fifth updating module is configured to update the target node according to the event operation sub-region of any one animation sub-event to obtain an updated event quadtree.
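Since the text describes the event quadtree only abstractly, the following is a hedged sketch of one plausible realization: creating a sub-event inserts its operation sub-region into the tree, and a query compares the click position against the regions stored along the path of quadrants containing that position. The class name, the fixed maximum depth, and the splitting strategy are all assumptions.

```python
class QuadTree:
    MAX_DEPTH = 4  # assumed fixed subdivision depth

    def __init__(self, x, y, w, h, depth=0):
        self.bounds = (x, y, w, h)
        self.depth = depth
        self.items = []     # (rect, sub_event_name) pairs stored at this node
        self.children = []  # four child quadrants, created lazily on insert

    def _contains(self, rect):
        x, y, w, h = self.bounds
        rx, ry, rw, rh = rect
        return x <= rx and y <= ry and rx + rw <= x + w and ry + rh <= y + h

    def insert(self, rect, name):
        # Descend into the single child quadrant that fully contains the
        # sub-event's operation rectangle; if none does, keep it at this node.
        if self.depth < self.MAX_DEPTH:
            if not self.children:
                x, y, w, h = self.bounds
                hw, hh = w / 2, h / 2
                self.children = [QuadTree(x + dx, y + dy, hw, hh, self.depth + 1)
                                 for dx in (0, hw) for dy in (0, hh)]
            for child in self.children:
                if child._contains(rect):
                    child.insert(rect, name)
                    return
        self.items.append((rect, name))

    def query(self, px, py):
        # Compare the query position with the regions stored on the path of
        # quadrants containing the point; other quadrants are never visited.
        hits = [name for (rx, ry, rw, rh), name in self.items
                if rx <= px <= rx + rw and ry <= py <= ry + rh]
        for child in self.children:
            x, y, w, h = child.bounds
            if x <= px <= x + w and y <= py <= y + h:
                hits.extend(child.query(px, py))
        return hits

tree = QuadTree(0, 0, 1024, 512)                          # event operation area
tree.insert((100, 40, 80, 20), "turning sub-event")       # event operation sub-region
tree.insert((600, 300, 50, 20), "lens parameter sub-event")
```

The point of the structure is that a click only needs to be compared against the sub-regions in the quadrants it falls into, rather than against every animation sub-event in the operation area.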
Optionally, any one of the animation objects is created by running the following modules:
a first presentation module configured to present an animation object type list in response to a received object creation instruction, wherein the animation object type list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type, and a light type;
and the object creating module is configured to create an animation object corresponding to the target animation object type based on the target animation object type selected from the animation object type list.
Optionally, any one of the sub animation tracks is created by:
a second receiving module configured to receive a sub-track creation instruction for any one of the animation objects;
a second presentation module configured to present a sub-track type list associated with an object property of the arbitrary one of the animation objects in response to the sub-track creation instruction;
and the track creating module is configured to create a sub animation track corresponding to the target sub track type based on the target sub track type selected from the sub track type list.
Optionally, the target animation object is determined by running the following modules:
and the selection object module is configured to select the animation object of the associated sub-animation track from the animation objects and determine the animation object as the target animation object.
Optionally, any one of the basic animation sub-events is created by running the following modules:
a third receiving module, configured to receive, for any one of the animation objects, a basic sub-event creation instruction for a main animation track corresponding to the animation object;
a third presentation module, configured to present, in response to the basic sub-event creation instruction, a basic sub-event type list corresponding to the main animation track corresponding to the animation object;
and the first creating sub-event module is configured to create a basic animation sub-event corresponding to the target basic sub-event type based on the target basic sub-event type selected from the basic sub-event type list.
Optionally, any one of the extended animation sub-events is created by running the following modules:
a fourth receiving module, configured to receive, for any one of the target animation objects, an extended sub-event creation instruction for a sub-animation track associated with the any one target animation object;
a fourth presentation module, configured to present, in response to the extended sub-event creating instruction, a corresponding extended sub-event type list of a sub-animation track associated with the arbitrary one target animation object;
and the second creating sub-event module is configured to create an extended animation sub-event corresponding to the target extended sub-event type based on the target extended sub-event type selected from the extended sub-event type list.
To sum up, the animation generation apparatus provided in the embodiment of the present application determines, based on the received animation creation instruction, an animation object that has been created, and reads an animation event that is created in advance for an animation track corresponding to the animation object; and then according to the hierarchical structure of the animation object, the animation track and the animation event, the target animation file is created based on the animation object, the animation track and the animation event, so that the animation creation process is more visual by dividing the animation hierarchy, the interpretability is stronger, and the animation creation efficiency is improved.
The above is a schematic configuration of an animation generation apparatus of the present embodiment. It should be noted that the technical solution of the animation generation apparatus is the same concept as the technical solution of the animation generation method described above, and for details not described in detail in the technical solution of the animation generation apparatus, reference may be made to the description of the technical solution of the animation generation method described above.
There is also provided in an embodiment of the present application a computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the animation generation method when executing the computer instructions.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the animation generation method belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the animation generation method.
An embodiment of the present application further provides a computer readable storage medium, which stores computer instructions, and the computer instructions, when executed by a processor, implement the steps of the animation generation method as described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the above animation generation method, and for details that are not described in detail in the technical solution of the storage medium, reference may be made to the description of the technical solution of the above animation generation method.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.

Claims (15)

1. An animation generation method, comprising:
determining an animation object based on a received animation creation instruction;
reading an animation event created in advance for an animation track corresponding to the animation object; and
creating a target animation file based on the animation object, the animation track, and the animation event.
2. The animation generation method according to claim 1, wherein the reading an animation event created in advance for an animation track corresponding to the animation object comprises:
reading a basic animation event created in advance for a main animation track corresponding to the animation object, and reading an extended animation event created in advance for a sub-animation track associated with a target animation object, wherein the target animation object is an animation object, among the animation objects, that is associated with a sub-animation track; and
correspondingly, the creating a target animation file based on the animation object, the animation track, and the animation event comprises:
creating the target animation file based on the animation object, the main animation track, the basic animation event, the sub-animation track, and the extended animation event.
3. The animation generation method according to claim 1, wherein, in a case where an animation resource of a preset type is configured in advance, before the creating a target animation file based on the animation object, the animation track, and the animation event, the method further comprises:
taking the animation object, the animation track, and the animation event as animation elements;
determining a target animation resource of the preset type associated with at least one target animation element among the animation elements, and reading resource information of the target animation resource;
updating the at least one target animation element with the resource information according to an association relationship between the target animation resource and the at least one target animation element;
updating the animation object, the animation track, and the animation event based on the updated at least one target animation element; and
taking the updated animation object, animation track, and animation event as the animation object, the animation track, and the animation event.
4. The animation generation method according to claim 3, wherein the at least one target animation element is determined by:
selecting, from among the animation object, the animation track, and the animation event, an animation element associated with the animation resource of the preset type as the at least one target animation element.
5. The animation generation method according to claim 1, wherein, in a case where a prefab resource is configured in advance, before the creating a target animation file based on the animation object, the animation track, and the animation event, the method further comprises:
taking the animation object, the animation track, and the animation event as animation elements;
determining a target prefab resource associated with at least one prefab animation element among the animation elements, and determining a resource path of the target prefab resource;
updating the at least one prefab animation element with the resource path according to an association relationship between the target prefab resource and the at least one prefab animation element;
updating the animation object, the animation track, and the animation event based on the updated at least one prefab animation element; and
taking the updated animation object, animation track, and animation event as the animation object, the animation track, and the animation event.
6. The animation generation method according to claim 1, wherein any animation sub-event in the animation event is queried by:
receiving an event query instruction submitted by a user in an event operation area;
comparing position information in the event query instruction with event operation sub-areas corresponding to nodes in an event quadtree by traversing the event quadtree corresponding to the event operation area, wherein an event operation sub-area is the position of an operation sub-area corresponding to an animation sub-event in the animation event; and
determining, according to a comparison result, a target animation sub-event among the animation sub-events and displaying the target animation sub-event.
7. The animation generation method according to claim 6, wherein the event quadtree is updated by:
after any animation sub-event in the animation event is created, determining a target node in the event quadtree corresponding to an event operation sub-area of the animation sub-event; and
updating the target node according to the event operation sub-area of the animation sub-event to obtain an updated event quadtree.
8. The animation generation method according to claim 1, wherein any one of the animation objects is created by:
in response to a received object creation instruction, displaying an animation object type list, wherein the animation object type list includes, but is not limited to, a director type, a camera type, an actor type, a special effect type, and a light type; and
creating, based on a target animation object type selected in the animation object type list, an animation object corresponding to the target animation object type.
9. The animation generation method according to claim 2, wherein any one of the sub-animation tracks is created by:
receiving a sub-track creation instruction for any one of the animation objects;
in response to the sub-track creation instruction, displaying a sub-track type list associated with an object property of the animation object; and
creating, based on a target sub-track type selected in the sub-track type list, a sub-animation track corresponding to the target sub-track type.
10. The animation generation method according to claim 2, wherein the target animation object is determined by:
selecting, from among the animation objects, an animation object associated with a sub-animation track as the target animation object.
11. The animation generation method according to claim 2, wherein any basic animation sub-event in the basic animation event is created by:
receiving, for any one of the animation objects, a basic sub-event creation instruction for a main animation track corresponding to the animation object;
in response to the basic sub-event creation instruction, displaying a basic sub-event type list corresponding to the main animation track corresponding to the animation object; and
creating, based on a target basic sub-event type selected in the basic sub-event type list, a basic animation sub-event corresponding to the target basic sub-event type.
12. The animation generation method according to claim 2, wherein any extended animation sub-event in the extended animation event is created by:
receiving, for any one of the target animation objects, an extended sub-event creation instruction for a sub-animation track associated with the target animation object;
in response to the extended sub-event creation instruction, displaying an extended sub-event type list corresponding to the sub-animation track associated with the target animation object; and
creating, based on a target extended sub-event type selected in the extended sub-event type list, an extended animation sub-event corresponding to the target extended sub-event type.
13. An animation generation apparatus, comprising:
a determination module configured to determine an animation object based on a received animation creation instruction;
a reading module configured to read an animation event created in advance for an animation track corresponding to the animation object; and
a creation module configured to create a target animation file based on the animation object, the animation track, and the animation event.
14. A computing device, comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of claims 1 to 12 when executing the computer instructions.
15. A computer-readable storage medium storing computer instructions, wherein the computer instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 12.
CN202111113011.7A 2021-09-18 2021-09-18 Animation generation method and device Active CN113808237B (en)

Publications (2)

Publication Number Publication Date
CN113808237A true CN113808237A (en) 2021-12-17
CN113808237B CN113808237B (en) 2024-12-27


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040222992A1 (en) * 2003-05-09 2004-11-11 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
CN102197414A * 2008-12-03 2011-09-21 Nokia Corporation Stroke-based animation creation
CN105469438A * 2015-11-11 2016-04-06 Guangzhou University Animation button device and method for controlling image conversion into animation
CN106815882A * 2015-12-01 2017-06-09 Beijing Chukong Technology Co., Ltd. Method for infinitely extensible animation attributes
CN106887029A * 2016-06-14 2017-06-23 Alibaba Group Holding Ltd. Animation control method, device and terminal
CN109242935A * 2018-08-21 2019-01-18 Beijing Benliu Network Information Technology Co., Ltd. Masking animation implementation method and device based on Android system
CN112667942A * 2019-10-16 2021-04-16 Tencent Technology (Shenzhen) Co., Ltd. Animation generation method, device and medium
CN112927331A * 2021-03-31 2021-06-08 Tencent Technology (Shenzhen) Co., Ltd. Animation generation method and device for character model, storage medium and electronic equipment

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040222992A1 (en) * 2003-05-09 2004-11-11 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
CN1689046A * 2003-05-09 2005-10-26 Microsoft Corporation System supporting animation of graphical display elements through animation object instances
CN102197414A * 2008-12-03 2011-09-21 Nokia Corporation Stroke-based animation creation
CN105469438A * 2015-11-11 2016-04-06 Guangzhou University Animation button device and method for controlling image conversion into animation
CN106815882A * 2015-12-01 2017-06-09 Beijing Chukong Technology Co., Ltd. Method for infinitely extensible animation attributes
CN106887029A * 2016-06-14 2017-06-23 Alibaba Group Holding Ltd. Animation control method, device and terminal
CN109242935A * 2018-08-21 2019-01-18 Beijing Benliu Network Information Technology Co., Ltd. Masking animation implementation method and device based on Android system
CN112667942A * 2019-10-16 2021-04-16 Tencent Technology (Shenzhen) Co., Ltd. Animation generation method, device and medium
CN112927331A * 2021-03-31 2021-06-08 Tencent Technology (Shenzhen) Co., Ltd. Animation generation method and device for character model, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN113808237B (en) 2024-12-27

Similar Documents

Publication Publication Date Title
TWI808393B (en) Page processing method, device, apparatus and storage medium
JP6928644B2 (en) Creating a project in a content management system
US12373172B2 (en) Interactive graphic design system to enable creation and use of variant component sets for interactive objects
US20150052144A1 (en) Computer system storing content into application independent objects
CN107122175B (en) Interface creating method and device
CN112654995B (en) Tracking content attribution in online collaborative electronic documents
CN109471580B (en) A visual 3D courseware editor and courseware editing method
US20200342029A1 (en) Systems and methods for querying databases using interactive search paths
WO2025092766A1 (en) Method and apparatus for displaying work, and device and storage medium
CN115080016A Method, apparatus, device and medium for implementing extended functions based on a UE editor
CN117745885A (en) Expression generating and work publishing method, device, equipment and storage medium
CN111736929B (en) Method, apparatus, device and readable storage medium for creating task instance
CN120418768A (en) System and method for generating simulations using segment groupings
WO2025195387A1 (en) Work publishing method and apparatus, work viewing method and apparatus, device, and storage medium
CN113808237B (en) Animation generation method and device
WO2025157193A1 (en) Method and apparatus for creating media content, device, and storage medium
CN117611711A (en) Methods, apparatus, equipment and media for generating comics
CN116185197A Method, apparatus, device and storage medium for editing virtual objects
KR102385381B1 Method and system for generating script for camera effect
US20130156399A1 (en) Embedding content in rich media
CN118642809B (en) Method, apparatus, storage medium, and program product for dynamically rendering pages
US20240411525A1 (en) Tracking and comparing changes in a design interface
CN116206016A (en) Method and device for processing special effect event in animation
CN120540565A (en) Media resource input method, device, equipment and storage medium
KR101916802B1 (en) Apparatus and method for creating document form using block user interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329

Applicant after: Zhuhai Jinshan Digital Network Technology Co.,Ltd.

Address before: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329

Applicant before: ZHUHAI KINGSOFT ONLINE GAME TECHNOLOGY Co.,Ltd.

GR01 Patent grant