CN110992448A - Animation processing method and device, electronic equipment and storage medium

Info

Publication number: CN110992448A (application CN201911191752.XA; granted as CN110992448B)
Authority: CN (China)
Prior art keywords: node, animation, action, action sequence, file
Legal status: Granted; Active
Application number: CN201911191752.XA
Other languages: Chinese (zh)
Other versions: CN110992448B (en)
Inventor: 徐兵
Current and original assignee: Netease Hangzhou Network Co Ltd

Events:
• Application filed by Netease Hangzhou Network Co Ltd
• Priority to CN201911191752.XA
• Publication of CN110992448A
• Application granted; publication of CN110992448B
• Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation


Abstract

The application provides an animation processing method and apparatus, an electronic device, and a storage medium. The method comprises: obtaining, for each of at least one action sequence in an animation file, the node identifiers of the operation nodes corresponding to that action sequence, together with the node default state information of each operation node, where each action sequence comprises at least one execution action; for each action sequence, determining at least one difference node of the action sequence relative to the other action sequences based on the node identifiers of its operation nodes and those of the other action sequences, a difference node being an operation node corresponding to an execution action not included in the action sequence; and performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference nodes. The method improves animation processing efficiency.

Description

Animation processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of animation processing technologies, and in particular, to an animation processing method and apparatus, an electronic device, and a storage medium.
Background
When an animation is previewed in a skeletal-animation editing tool (such as Spine), it is previewed from the default state of the skeletal model. When the animation runs in a game engine, however, only the first frame of animation is played from the default state; each subsequent frame is played from where the previous frame left off, not from the default state.
During an animation transition, that is, when switching from one animation to another, a problem arises if some node has a key frame in the previous animation but no key frame in the next one. After the switch, the state of the node in the next animation is inconsistent with the default state, and because the next animation lacks a key frame that would forcibly update the node, the node's state from the previous animation remains on the next animation. The animation then renders abnormally, which seriously affects game quality.
To fix this residue problem at animation transitions, the abnormal node only needs to be forcibly updated in the subsequent animation, and the current approach is generally manual: find the abnormal nodes by hand and add key frames to the animation using the default states of the abnormal nodes. When the number of animations is large, however, handling abnormal nodes manually is inefficient.
Disclosure of Invention
In view of the above, an object of the present application is to provide an animation processing method and apparatus, an electronic device, and a storage medium that improve animation processing efficiency.
In a first aspect, an embodiment of the present application provides an animation processing method, where the method includes:
acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
for each action sequence, determining at least one difference node of the action sequence compared with other action sequences based on the node identification of the operation node corresponding to the action sequence and the node identifications of the operation nodes corresponding to other action sequences respectively, wherein the difference node is the operation node corresponding to the execution action not included in the action sequence;
and performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
In one embodiment, the performing, according to the node default state information corresponding to the difference node, frame supplementing processing on the action sequence includes:
and according to the node default state information corresponding to the difference node, performing frame supplementing processing before the first animation frame corresponding to the action sequence.
In an embodiment, the performing, according to the node default state information corresponding to the difference node, frame supplementing processing before the first animation frame corresponding to the action sequence includes:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
In one embodiment, the determining, based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences, at least one difference node of the action sequence compared with the other action sequences includes:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
In one embodiment, determining at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences respectively comprises:
if the action sequence corresponds to a preset execution sequence, determining that the operation node corresponding to the action sequence does not include an operation node corresponding to a second execution action in a previous action sequence adjacent to the execution sequence of the action sequence based on the node identifier, and taking the operation node corresponding to the second execution action as the at least one difference node.
In one embodiment, the method further comprises:
determining a root operation node from the plurality of operation nodes based on node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
In one embodiment, before obtaining node identifiers of operation nodes respectively corresponding to at least one action sequence included in the animation file, the method further includes:
and acquiring the updated animation file.
In an embodiment, the obtaining node identifiers of operation nodes respectively corresponding to at least one action sequence included in the animation file, and node default state information corresponding to each operation node includes:
exporting an animation intermediate file in a text format from the animation file after the animation production data is removed;
and analyzing the animation intermediate file to obtain node identifications of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node.
In one embodiment, after performing the frame supplementing processing on the action sequence, the method further includes:
exporting a texture file corresponding to the animation file, and exporting an animation file in a binary format based on the animation file in the text format after frame supplement;
obtaining an animation running file based on the exported texture file and the animation file in the binary format; the animation running file is used for controlling the playing of the animation in the game.
In a second aspect, an embodiment of the present application provides an animation processing apparatus, including:
the acquisition module is used for acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
a determining module, configured to determine, for each action sequence, at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences, where the difference node is an operation node corresponding to an execution action not included in the action sequence;
and the processing module is used for performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
In one embodiment, the processing module is configured to perform frame supplementing processing on the action sequence according to the following steps:
and according to the node default state information corresponding to the difference node, performing frame supplementing processing before the first animation frame corresponding to the action sequence.
In one embodiment, the processing module is configured to perform frame supplementing processing before the first animation frame corresponding to the action sequence according to the following steps:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
In one embodiment, the determining module is configured to determine at least one difference node of the action sequence compared to other action sequences according to the following steps:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
In one embodiment, the determining module is configured to determine at least one difference node of the action sequence compared to other action sequences according to the following steps:
if the action sequence corresponds to a preset execution sequence, determining that the operation node corresponding to the action sequence does not include an operation node corresponding to a second execution action in a previous action sequence adjacent to the execution sequence of the action sequence based on the node identifier, and taking the operation node corresponding to the second execution action as the at least one difference node.
In one embodiment, the apparatus further comprises a generation module configured to:
determining a root operation node from the plurality of operation nodes based on node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
In one embodiment, the obtaining module is further configured to:
and acquiring the updated animation file.
In an embodiment, the obtaining module is configured to obtain, according to the following steps, node identifiers of operation nodes respectively corresponding to at least one action sequence included in the animation file, and node default state information corresponding to each operation node:
exporting an animation intermediate file in a text format from the animation file after the animation production data is removed;
and analyzing the animation intermediate file to obtain node identifications of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node.
In one embodiment, the processing module is further configured to:
exporting a texture file corresponding to the animation file, and exporting an animation file in a binary format based on the animation file in the text format after frame supplement;
obtaining an animation running file based on the exported texture file and the animation file in the binary format; the animation running file is used for controlling the playing of the animation in the game.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a storage medium, and a bus. The storage medium stores machine-readable instructions executable by the processor. When the electronic device runs, the processor communicates with the storage medium through the bus, and the processor executes the machine-readable instructions to perform the steps of the animation processing method.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the animation processing method.
The animation processing method provided in the embodiments of the application obtains the node identifiers of the operation nodes corresponding to each of the at least one action sequence in an animation file, together with the node default state information of each operation node, where each action sequence includes at least one execution action. For each action sequence, at least one difference node of the action sequence relative to the other action sequences is determined based on the node identifiers of its operation nodes and those of the other action sequences; a difference node is an operation node corresponding to an execution action not included in the action sequence. Frames are then supplemented in the animation file based on the node default state information of the determined difference nodes, yielding a frame-supplemented animation file. Because the difference nodes are determined automatically for every action sequence in the animation file, by comparing the operation nodes of the current action sequence with those of the other action sequences, and the determined difference nodes are used to supplement frames in the animation file, animation processing efficiency is improved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a flow chart illustrating an animation processing method according to an embodiment of the present disclosure;
FIG. 2 is a first schematic structural diagram of an animation processing device according to an embodiment of the present application;
FIG. 3 is a second schematic structural diagram of an animation processing apparatus according to an embodiment of the present application;
fig. 4 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flowcharts may be performed out of order, and steps without a logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowcharts.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
The animation processing method of the embodiment of the application can be applied to terminal equipment and can also be applied to any other computing equipment with a processing function. In some embodiments, the terminal device or computing device may include a processor. The processor may process the information and/or data related to the request to perform one or more of the functions described herein.
In the related art, the Cocos game engine is widely used in game production as a cross-platform game engine. Although the Cocos engine supports two-dimensional animation production, the animations it produces are limited in functionality and cannot meet players' complex requirements, so when a game is produced with the Cocos engine, an animation with stronger expressive power often requires a professional animation production tool. It is therefore contemplated that an animation tool, such as the Spine tool, may be used together with the Cocos game engine to produce animations.
The Spine tool is an animation tool that supports most game engines and provides advanced animation functions such as animation transition, animation blending, skin replacement, flexible animation, and draw-order adjustment. When the Spine tool is used with the Cocos game engine, the animation production process is generally as follows. First, the skeleton and the textures are assembled in the Spine tool to obtain the default pose of the animation, that is, the initial state of the animation (also called the SetupPose state). Key frames are then added to the animation based on the SetupPose state. A key frame is generally a frame containing a key action in the movement or change of a character or object; its purpose is to make animation switching smoother, and it generally contains the SetupPose state information of the nodes that need to be operated during animation switching (motion transition). A node is generally a part of a model (such as an arm, a leg, an eye, or a foot), and the model can be the skeleton model of any object (such as a human body or an animal). Once the key frames are made, the animation can be previewed directly in the Spine tool, exported manually through the Spine tool, and the exported animation file added to the game directory. When the animation is played in the game, it is played according to the key frames.
Before an animation is exported by the Spine tool, it is previewed there, and the preview uses the SetupPose state of the nodes as the initial state. When the Cocos game engine actually runs, however, only the first action in the animation is played from the SetupPose state; the animation corresponding to each subsequent action is played from the previous action, not from the initial state. Consequently, when the animation corresponding to an action transitions, if some node has a key frame in the previous animation but none in the next one, then after the transition the current state of the node is inconsistent with its initial state, and no key frame exists to forcibly update it. The running state of the node from the previous animation therefore remains on the next animation, the animation renders abnormally, and game quality is seriously affected.
To address the residue problem at animation transitions, the residual node only needs to be forcibly updated in the next animation. The current general approach is to modify the animation frames manually: find the abnormal node in the animation, add a key frame to the subsequent animation using the SetupPose state of the abnormal node, and export the animation again. In this way, when an animation transition occurs, the subsequent animation forcibly updates the state of the animation node to the default state through the key frame, and the residue problem is corrected.
However, correcting residues manually is error-prone: when the number of animation frames is large, some residue problems may even go undiscovered, which increases processing difficulty and reduces processing efficiency. In addition, when a new animation containing new nodes is created at the development end, almost all previous animations need frame supplementing, which further increases the workload.
In addition, animations are generally exported from the Spine tool manually, with export parameters estimated by hand and not standardized, so the exported files are often poor. For example, if the animation's texture is too large, disk space is wasted, the game package grows, precision is wasted, and the game runs more slowly; if the texture is too small, the picture performs badly and game quality suffers. Moreover, when the Spine tool exports an animation, the exported effect cannot be previewed directly: the animation must be previewed inside the game and the export parameters corrected multiple times before suitable export textures are obtained. Making the animation finally meet the game's requirements thus takes a series of feedback rounds and repeated modification of the export parameters, and development efficiency is low.
Based on this, an embodiment of the present application provides an animation processing method to improve the processing efficiency of animations and the quality of animations played in a game. Specifically, the present application obtains the node identifiers of the operation nodes corresponding to each of the at least one action sequence in an animation file, together with the node default state information of each operation node, where each action sequence includes at least one execution action. For each action sequence, at least one difference node of the action sequence relative to the other action sequences is determined based on the node identifiers of its operation nodes and those of the other action sequences; a difference node is an operation node corresponding to an execution action not included in the action sequence. Frames are then supplemented in the animation file based on the node default state information of the determined difference nodes, yielding a frame-supplemented animation file and improving processing efficiency. When this animation file is used in a game, the animation of each action sequence is presented from the default state of its nodes, so action sequences transition smoothly when switched and the picture quality of the game is improved.
The idea of an embodiment of the present application is described in detail below.
The embodiment of the application provides an animation processing method, as shown in fig. 1, specifically including the following steps:
s101, acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in an animation file, and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
s102, aiming at each action sequence, determining at least one difference node of the action sequence compared with other action sequences based on the node identification of the operation node corresponding to the action sequence and the node identifications of the operation nodes corresponding to other action sequences respectively, wherein the difference node is the operation node corresponding to the execution action not included in the action sequence;
and S103, performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
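Before each step is detailed below, the following minimal sketch shows how S101-S103 compose. It is written in Python purely for illustration; the patent prescribes no implementation language, and all names and data shapes here are assumptions. Sketches of the individual steps follow in the sections below.

```python
def process_animation_file(defaults, sequence_nodes, frame_lists):
    """S101 inputs (assumed shapes): defaults maps node id -> SetupPose
    state info; sequence_nodes maps action sequence name -> set of operated
    node ids; frame_lists maps action sequence name -> its animation frame
    list. Mutates frame_lists in place."""
    for name, nodes in sequence_nodes.items():
        # S102: difference nodes are nodes operated by some other sequence
        # but absent from this one.
        others = set()
        for other_name, other_nodes in sequence_nodes.items():
            if other_name != name:
                others |= other_nodes
        difference_nodes = others - nodes
        # S103: prepend one frame holding each difference node's default
        # state, so a transition into this sequence resets those nodes.
        if difference_nodes:
            frame_lists[name].insert(
                0, {n: defaults[n] for n in difference_nodes})
```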
In S101, the animation file is typically an animation project exported using the Spine tool. The project is generally an animation for a game character, which may be a person, an animal, a hero, and so on. The animation file generally includes at least one action sequence, which may be any action sequence the game character can execute while the game runs, and each action sequence includes at least one execution action. Executing one execution action may require the cooperation of multiple nodes of the game character; a node may be a skeleton element (e.g., a bone), an Inverse Kinematics (IK) node (e.g., a special part of the model), a slot (e.g., a container displaying a picture in the skeletal animation), and so on. The nodes used while an execution action is executed are taken as its operation nodes; that is, the operation nodes corresponding to an action sequence are the nodes used by the actions executed while the animation plays. An execution action can be running, jumping, dancing, shooting, blinking, and other actions, and a node identifier can consist of numbers, letters, characters, and the like.
The node default state information is the SetupPose state information of the node and includes the node's position coefficient, rotation coefficient, scaling coefficient, shearing coefficient, and so on, which can be determined according to the actual situation.
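For concreteness, here is a minimal sketch of how the SetupPose information of one node might be held in memory. The field list follows the coefficients named above; the field names and default values are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class NodeDefaultState:
    """SetupPose state of one operation node (bone, IK node, or slot)."""
    node_id: str
    x: float = 0.0          # position coefficients
    y: float = 0.0
    rotation: float = 0.0   # rotation coefficient (degrees)
    scale_x: float = 1.0    # scaling coefficients
    scale_y: float = 1.0
    shear_x: float = 0.0    # shearing coefficients
    shear_y: float = 0.0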
The node identifier of the operation node and the node default state information are obtained by analyzing an animation intermediate file, wherein the animation intermediate file can be obtained by exporting an animation source project, and the animation source project comprises an animation file and a texture file.
The animation source project is generally exported twice. The first export produces the animation intermediate file, which is mainly used to correct the defects in the animation and to obtain the optimal texture file adjustment coefficient (described in detail below). The second export takes the corrected animation intermediate file as input and uses the obtained texture file adjustment coefficient to export the texture file; the resulting animation running file preserves animation quality while effectively reducing the size of the exported animation resources.
Correcting the animation's defects through the first-export intermediate file performs the frame supplementing automatically: the animation's behavior no longer needs to be checked by running the game, the animation source project needs no second round of manual modification, many problems that arise at animation transitions in the game are resolved, and animation production time is shortened.
In a specific implementation, the animation source project to be exported need not be a single project; multiple animation projects can be distinguished simply by folder during export, enabling batch export and greatly reducing export time. To ensure that the exported animation source project is the latest one, the updated animation file is obtained before the export; it can be obtained by updating the animation project with an update tool such as a version control system (SVN), a distributed version control system (Git), or a mirror backup tool (rsync).
The development end may modify an animation source project, for example by adding an action, either before or after the project has been exported, and whenever the project is modified, its modification time becomes the latest time. To improve export speed, before the animation source project of an animation file is exported with the animation converter, it is determined whether the project's latest modification time is later than the export time recorded by the system. If it is, the project was modified after the last export and must be frame-supplemented again, so the exported animation intermediate file needs to be parsed for the subsequent frame supplementing. Otherwise, the animation source project has already been frame-supplemented and the next animation is processed instead, which avoids reprocessing already-exported projects and improves processing efficiency.
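A minimal sketch of this skip test, assuming the tool keeps a per-project record of last export times; the record format and paths are hypothetical.

```python
import os

def needs_reexport(project_dir: str, last_export_times: dict) -> bool:
    """True if any file in the source project changed after the recorded
    export time, i.e. the project must be exported and frame-supplemented
    again; False means the project can be skipped."""
    latest_modification = max(
        (os.path.getmtime(os.path.join(root, name))
         for root, _dirs, files in os.walk(project_dir) for name in files),
        default=0.0,
    )
    return latest_modification > last_export_times.get(project_dir, 0.0)
```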
When the animation source project of the animation file is exported (the first export), uniform export parameters are generally used (they can be determined from historical export parameters). The purpose of this export is the frame supplementing of the action sequences in the animation file, and for frame supplementing the texture file and other unnecessary operational data (animation production data) are not needed. The texture file can therefore be skipped in the first export, yielding an animation intermediate file with the unnecessary data removed; and to make it convenient for the originating end to work with, the intermediate file is exported in a text format. That is, a text-format animation intermediate file with the animation production data removed is exported from the animation file, giving a human-readable intermediate file while reducing the exported data and speeding up the export. The animation production data is data generated while the development end produces the animation file with the Spine tool, such as skeleton color data in the texture file, the source texture file path, dope-sheet frame rate data used for previewing the animation in Spine, and the audio path.
After the animation intermediate file is obtained, it is parsed twice to obtain the node identifiers and node default state information of the operation nodes. The first parse obtains the node default state information of the operation nodes corresponding to each action sequence in the animation file, which can be stored in list form. The second parse obtains the node identifier of each operation node corresponding to each action sequence together with its node state information, i.e., the state of the node in the individual frames, generally including the node's position in the current frame, its offset in the current frame, and so on. The node identifiers and node state information of the operation nodes of each action sequence are stored in the form of an animation frame list: one action sequence corresponds to one animation frame list, and each entry in the list is one animation frame. The node default state information and the node state information of each action sequence can be obtained in a single parse of the intermediate file or in separate passes.
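As an illustration, a sketch of the two parses over a text-format intermediate file, assuming it is JSON shaped like Spine's text export ("bones" carrying SetupPose data and "animations" keyed by action sequence name); the exact key names are assumptions to be checked against the real export.

```python
import json

def parse_intermediate_file(path: str):
    """Pass 1: node default (SetupPose) state per operation node.
    Pass 2: the set of node ids each action sequence operates on; the
    per-frame state lists would be collected the same way."""
    with open(path, encoding="utf-8") as f:
        doc = json.load(f)

    # Pass 1: default state information, keyed by node identifier.
    node_defaults = {bone["name"]: bone for bone in doc.get("bones", [])}

    # Pass 2: node identifiers operated on by each action sequence.
    sequence_nodes = {
        name: set(anim.get("bones", {})) | set(anim.get("slots", {}))
        for name, anim in doc.get("animations", {}).items()
    }
    return node_defaults, sequence_nodes
```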
When the animation file is used in a game, the player's touch operations trigger the corresponding action sequences, and the game presents the matching animation effects. Since the player's touch operations depend on the game scene, the action sequences triggered by different touch operations may be unrelated, i.e., there may be no fixed execution order among them, while some action sequences may have an execution order. For example, suppose the animation file contains action sequences A, B, and C. The player may trigger action sequence B after triggering A and may retrigger A after B, in which case there is no execution order among A, B, and C. Alternatively, triggering action sequence A may necessarily be followed by action sequence B, in which case A and B have a sequential execution order.
The cases in which the action sequences do and do not have an execution order are introduced below:
the first condition is as follows: and in the running process of the game, the action sequence corresponding to the presented animation does not have a sequential execution sequence.
Determining at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to other action sequences respectively, specifically comprising:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
Here, the first execution action may comprise at least one execution action, as described above.
In a specific implementation, for each action sequence in the animation file, each of the other action sequences is taken in turn as the current comparison sequence, and the node identifiers of the operation nodes corresponding to the action sequence are compared with those of the operation nodes corresponding to the current comparison sequence.
If the operation nodes corresponding to the action sequence do not include an operation node corresponding to a first execution action in the current comparison sequence, i.e., the operation nodes of the action sequence differ from those of the execution actions in the current comparison sequence, and the differing operation nodes are those of the first execution action in the current comparison sequence, then the operation nodes of the first execution action are recorded as operation nodes that the action sequence, compared with the current comparison sequence, does not include. The comparison then proceeds with the other action sequences taken in turn as the current comparison sequence, yielding all operation nodes corresponding to execution actions the action sequence does not include relative to the other action sequences.
If the operation nodes corresponding to the action sequence include the operation node corresponding to an execution action in the current comparison sequence, i.e., the operation nodes of the action sequence are the same as those of the current comparison sequence, another action sequence is obtained from the remaining action sequences and taken as the current comparison sequence, and the step of comparing the node identifiers of the action sequence's operation nodes with those of the current comparison sequence continues until all other action sequences have been compared, finally yielding at least one difference node of the action sequence relative to the other action sequences.
For example, suppose the animation file contains action sequences A, B, and C. Action sequence A includes three operation nodes A1, A2, and A3; action sequence B includes three operation nodes B1, A2, and B3; and action sequence C includes three operation nodes A1, C2, and C3. Taking action sequence A as the sequence whose non-included operation nodes are to be determined, B and C are used in turn as the current comparison sequence. Comparing A's operation nodes with B's shows that operation nodes B1 and B3 of action sequence B are not included in action sequence A, so B1 and B3 are determined to be operation nodes corresponding to execution actions that A does not include. Similarly, C2 and C3 are determined from comparing A with C. Compared with action sequences B and C, the operation nodes corresponding to execution actions not included in action sequence A are finally determined to be B1, B3, C2, and C3.
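A sketch of case one under the same illustrative assumptions as before (operation nodes held as sets of identifiers per action sequence), reproducing the example just given:

```python
def difference_nodes_unordered(sequence_nodes: dict, target: str) -> set:
    """Nodes operated by some other sequence but absent from `target`."""
    result = set()
    for name, nodes in sequence_nodes.items():
        if name != target:
            result |= nodes - sequence_nodes[target]
    return result

# The example above:
sequence_nodes = {
    "A": {"A1", "A2", "A3"},
    "B": {"B1", "A2", "B3"},
    "C": {"A1", "C2", "C3"},
}
assert difference_nodes_unordered(sequence_nodes, "A") == {"B1", "B3", "C2", "C3"}
```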
Case two: during game running, the action sequences corresponding to the presented animations have an execution order.
When determining at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to other action sequences, the method specifically includes the following steps:
if the action sequence corresponds to a preset execution sequence, determining, based on the node identifiers, that the operation nodes corresponding to the action sequence do not include an operation node corresponding to a second execution action in the previous action sequence adjacent to the action sequence in the execution order, and taking the operation node corresponding to the second execution action as the at least one difference node.
Here, the execution order is generally a sequential order between two or more action sequences: when the animations of the action sequences are presented, the animation of the action sequence earlier in the execution order is displayed first, and the animation of the action sequence later in the order is displayed afterwards. The second execution action includes at least one execution action.
In a specific implementation, for each action sequence in the animation file, the node identifiers of its operation nodes are compared with those of the operation nodes of the action sequence adjacent to it in the execution order.
If the operation nodes corresponding to the action sequence do not include an operation node corresponding to a second execution action in the adjacent preceding action sequence in the execution order, i.e., the operation nodes of the action sequence differ from those of the execution actions in that preceding sequence, and the differing operation nodes are those of the second execution action in the previous action sequence, then the operation nodes of the second execution action are taken as the operation nodes corresponding to execution actions the action sequence does not include relative to the other action sequences.
If the operation nodes corresponding to the action sequence include the operation node corresponding to the second execution action in the adjacent preceding action sequence, i.e., the operation nodes of the action sequence are the same as those of the sequence adjacent to it in the execution order, another action sequence is selected from the animation file's remaining action sequences, and it is checked whether that sequence corresponds to a preset execution order. If the selected action sequence has a preset execution order, the step of comparing the node identifiers of its operation nodes with those of the action sequence adjacent to it in the execution order is performed; if it has no preset execution order, the node identifiers of its operation nodes are compared with those of the current comparison sequence as in case one.
Continuing the example above, suppose there is an execution order between action sequences A and B, with A preceding B. To determine the operation nodes corresponding to execution actions not included in action sequence B, the operation nodes of A are compared with those of B. Operation nodes A1 and A3 of action sequence A are not included in action sequence B, so A1 and A3 are determined to be the operation nodes corresponding to execution actions that B, compared with A, does not include.
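The ordered case under the same assumptions, comparing only against the adjacent preceding sequence in the preset execution order:

```python
def difference_nodes_ordered(sequence_nodes: dict, order: list,
                             target: str) -> set:
    """Nodes operated by the preceding sequence in the preset execution
    order but absent from `target`; the first sequence has no predecessor."""
    position = order.index(target)
    if position == 0:
        return set()
    predecessor = order[position - 1]
    return sequence_nodes[predecessor] - sequence_nodes[target]

# Continuing the example, with A preceding B in the execution order:
sequence_nodes = {"A": {"A1", "A2", "A3"}, "B": {"B1", "A2", "B3"}}
assert difference_nodes_ordered(sequence_nodes, ["A", "B"], "B") == {"A1", "A3"}
```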
After the operation nodes corresponding to non-included execution actions are determined for each action sequence, frame supplementing processing is performed on each action sequence. Since the animation frames of an action sequence are stored as an animation frame list in which each entry corresponds to one animation frame, the frame supplementing according to the node default state information of the difference nodes can be performed before the first animation frame of the action sequence, specifically by the following steps:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
Here, the animation frame includes node default state information of each operation node determined for the action sequence.
In a specific implementation, for each action sequence, an animation frame is generated from the node default state information of the operation nodes determined for that sequence; the frame contains that default state information. An entry is then added above the first entry of the animation frame list corresponding to the action sequence, and the generated animation frame is written into the added entry, i.e., the node default state information of the determined operation nodes is written into the added entry, completing the frame supplementing for the action sequence.
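A sketch of these supplementing steps, assuming each animation frame list is an ordinary Python list whose entries map node identifiers to state information:

```python
def supplement_frames(frame_lists: dict, target: str,
                      difference_nodes: set, node_defaults: dict) -> None:
    """Generate a frame from the difference nodes' default state and write
    it into a new entry added above the first entry of the frame list."""
    generated_frame = {node: node_defaults[node] for node in difference_nodes}
    frame_lists[target].insert(0, generated_frame)
```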
After the frame supplementing of every action sequence of the current animation is completed, the animation's defects have been corrected, and the parsed node default state information of each operation node together with the frame-supplemented animation frame list of each action sequence can be written back into the animation intermediate file.
When the frame-supplemented animation file is exported, the Spine command line can take the frame-supplemented animation file as the source and use the corrected export parameters, which include the texture file adjustment coefficient (detailed below) and the file format export settings. The binary-format animation file and the adjusted texture file can then be exported; that is, the texture file corresponding to the animation file is exported, and the binary-format animation file is exported from the frame-supplemented text-format animation file. An animation running file is obtained from the exported texture file and the binary-format animation file; the animation running file controls the playing of the animation in the game. The exported animation file thus contains not only the original animation frames of the animation file but also the added ones, so the animation's defects are corrected, and a suitably sized texture file is obtained using a suitable texture file adjustment coefficient.
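A sketch of driving this second export from Python. Spine does ship a command-line interface, but the flag names below are assumptions to be checked against the installed Spine version, and all paths are hypothetical.

```python
import subprocess

def export_runtime_files(spine_exe: str, project: str,
                         output_dir: str, export_settings: str) -> None:
    """Second export: binary skeleton data plus the adjusted texture file,
    using corrected export parameters stored in an export-settings file."""
    subprocess.run(
        [spine_exe,
         "--input", project,            # frame-supplemented animation file
         "--output", output_dir,        # receives the animation running files
         "--export", export_settings],  # texture adjustment + binary format
        check=True,
    )
```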
Since the export result (the animation running file) obtained by re-exporting the modified animation intermediate file is applied directly to the game, the texture scaling coefficient of the texture file is adjusted after the frame supplementing so that a suitable texture file is obtained at the second export, which improves the running quality of the game. The frame-supplemented animation file is exported in binary format: the binary file is smaller and needs no compiling when the game runs, which improves the game's running speed.
The following describes a process for determining an adjustment coefficient of a texture file, specifically:
determining a root operation node from the plurality of operation nodes based on the node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from the node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
Here, the operation nodes covered by the node default state information include a root operation node and child operation nodes. The root operation node can be identified by a node identifier, such as root, and the node default state information includes the texture scaling coefficient corresponding to the root operation node.
In a specific implementation, the root operation node is determined from the node identifiers of the operation nodes contained in the node default state information, and its texture scaling coefficient is extracted from the node default state information corresponding to the root node. If the texture scaling coefficient is greater than or equal to a first preset value, the texture file adjustment coefficient of the animation file is set to a second preset value; if it is less than the first preset value, the sum of the texture scaling coefficient and a third preset value is used as the texture file adjustment coefficient of the animation file. The first preset value is generally 0.9, the second preset value is generally 1, and the third preset value is generally 0.1.
The texture file adjustment coefficient may be determined according to the following formula:

s = γ,        if rs ≥ β
s = rs + α,   if rs < β

where s is the texture file adjustment coefficient, rs is the texture scaling coefficient of the root node, α is the third preset value (e.g., 0.1), β is the first preset value (e.g., 0.9), and γ is the second preset value (e.g., 1).
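The piecewise rule as a small helper, with the preset values above as illustrative defaults:

```python
def texture_adjust_coefficient(rs: float, alpha: float = 0.1,
                               beta: float = 0.9, gamma: float = 1.0) -> float:
    """Texture file adjustment coefficient s from the root operation
    node's texture scaling coefficient rs."""
    return gamma if rs >= beta else rs + alpha
```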
Based on the same inventive concept, an animation processing apparatus corresponding to the animation processing method is also provided in the embodiments of the present application. Since the principle by which the apparatus solves the problem is similar to that of the animation processing method, its implementation may refer to the implementation of the method, and repeated details are not repeated.
An embodiment of the present application provides an animation processing apparatus. As shown in fig. 2, the apparatus includes:
an obtaining module 21, configured to obtain node identifiers of operation nodes corresponding to at least one action sequence included in the animation file, and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
a determining module 22, configured to determine, for each action sequence, at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to other action sequences, where the difference node is an operation node corresponding to an execution action not included in the action sequence;
and the processing module 23 is configured to perform frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
In one embodiment, the processing module 23 is configured to perform frame supplementing processing on the action sequence according to the following steps:
and according to the node default state information corresponding to the difference node, performing frame supplementing processing before the first animation frame corresponding to the action sequence.
In one embodiment, the processing module 23 is configured to perform frame supplementing processing before the first animation frame corresponding to the action sequence according to the following steps:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
In one embodiment, the determining module 22 is configured to determine at least one difference node of the action sequence compared to other action sequences according to the following steps:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
In one embodiment, the determining module 22 is configured to determine at least one difference node of the action sequence compared to other action sequences according to the following steps:
if the action sequences correspond to a preset execution order, and it is determined based on the node identifiers that the operation nodes corresponding to the action sequence do not include an operation node corresponding to a second execution action in the previous action sequence adjacent to the action sequence in the execution order, taking the operation node corresponding to the second execution action as the at least one difference node.
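Both determination strategies reduce to set operations on node identifiers. The hedged Python sketch below shows the two variants side by side; the mapping and function names are illustrative assumptions rather than terms from this application.

```python
# Hedged sketch: node_ids_by_sequence maps each action sequence to the set
# of operation-node identifiers it uses. All names are assumptions.

def difference_nodes_vs_all(sequence, node_ids_by_sequence):
    """First variant: compare the sequence against every other sequence."""
    own = node_ids_by_sequence[sequence]
    diff = set()
    for other, ids in node_ids_by_sequence.items():
        if other != sequence:
            # Collect operation nodes used by the comparison sequence
            # but absent from this action sequence.
            diff |= ids - own
    return diff

def difference_nodes_vs_previous(sequence_order, index, node_ids_by_sequence):
    """Second variant: under a preset execution order, compare only
    against the adjacent previous action sequence."""
    if index == 0:
        return set()  # the first sequence has no predecessor
    own = node_ids_by_sequence[sequence_order[index]]
    prev = node_ids_by_sequence[sequence_order[index - 1]]
    return prev - own
```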
In one embodiment, the obtaining module 21 is further configured to:
and acquiring the updated animation file.
In an embodiment, the obtaining module 21 is configured to obtain node identifiers of operation nodes respectively corresponding to at least one action sequence included in the animation file, and node default state information corresponding to each operation node, according to the following steps:
exporting an animation intermediate file in a text format from the animation file after the animation production data is removed;
and analyzing the animation intermediate file to obtain node identifications of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node.
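The layout of the text-format intermediate file is not specified in this application, so the parsing sketch below assumes, purely for illustration, a JSON file with "sequences" and "node_defaults" keys.

```python
# Parsing sketch under an assumed JSON layout for the intermediate file;
# the actual exporter format is not specified in this application.
import json

def parse_intermediate_file(path):
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    # Node identifiers of the operation nodes per action sequence.
    sequence_nodes = {s["name"]: set(s["node_ids"]) for s in data["sequences"]}
    # Node default state information per operation node.
    node_defaults = data["node_defaults"]
    return sequence_nodes, node_defaults
```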
In one embodiment, the processing module 23 is further configured to:
exporting a texture file corresponding to the animation file, and exporting an animation file in a binary format based on the text-format animation file after frame supplementing;
obtaining an animation running file based on the exported texture file and the animation file in the binary format; the animation running file is used for controlling the playing of the animation in the game.
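As one possible packaging of this step, the sketch below bundles the exported texture file and the binary-format animation file into a single archive that a game runtime could load. The archive layout and file names are assumptions, not part of this application.

```python
# Illustrative packaging sketch; the archive layout and the file names
# inside it are assumptions, not defined by this application.
import zipfile

def build_animation_running_file(texture_path, binary_anim_path, out_path):
    with zipfile.ZipFile(out_path, "w") as archive:
        archive.write(texture_path, arcname="atlas.png")           # exported texture
        archive.write(binary_anim_path, arcname="animation.bin")   # binary animation
    return out_path
```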
An embodiment of the present application further provides an animation processing apparatus. As shown in fig. 3, the apparatus further includes a generating module 24, and the generating module 24 is configured to:
determining a root operation node from the plurality of operation nodes based on node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
An embodiment of the present application further provides an electronic device. Fig. 4 is a schematic structural diagram of the electronic device 40 provided in the embodiment of the present application. The electronic device includes:
a processor 41, a memory 42, and a bus 43. The memory 42 is used for storing execution instructions and includes an internal memory 421 and an external memory 422. The internal memory 421 temporarily stores operation data in the processor 41 and data exchanged with the external memory 422, such as a hard disk; the processor 41 exchanges data with the external memory 422 through the internal memory 421. When the electronic device 40 operates, the processor 41 communicates with the memory 42 through the bus 43, so that the processor 41 executes the following instructions in a user mode:
acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
for each action sequence, determining at least one difference node of the action sequence compared with other action sequences based on the node identification of the operation node corresponding to the action sequence and the node identifications of the operation nodes corresponding to other action sequences respectively, wherein the difference node is the operation node corresponding to the execution action not included in the action sequence;
and performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
In one possible embodiment, in the instructions executed by the processor 41, the performing, according to the node default state information corresponding to the difference node, frame supplementing processing on the action sequence includes:
and according to the node default state information corresponding to the difference node, performing frame supplementing processing before the first animation frame corresponding to the action sequence.
In one possible embodiment, in the instructions executed by the processor 41, the performing, according to the node default state information corresponding to the difference node, frame supplementing processing before the first animation frame corresponding to the action sequence includes:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
In a possible implementation manner, in the instructions executed by the processor 41, the determining, based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences, at least one difference node of the action sequence compared to the other action sequences includes:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
In one possible embodiment, the determining, by the processor 41, at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences respectively includes:
if the action sequences correspond to a preset execution order, and it is determined based on the node identifiers that the operation nodes corresponding to the action sequence do not include an operation node corresponding to a second execution action in the previous action sequence adjacent to the action sequence in the execution order, taking the operation node corresponding to the second execution action as the at least one difference node.
In a possible implementation, the instructions executed by the processor 41 further include:
determining a root operation node from the plurality of operation nodes based on node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
In a possible embodiment, the instructions executed by the processor 41 further include, before the obtaining of the node identifiers of the operation nodes respectively corresponding to the at least one action sequence included in the animation file:
and acquiring the updated animation file.
In a possible implementation manner, in the instructions executed by the processor 41, the obtaining node identifiers of operation nodes respectively corresponding to at least one action sequence included in the animation file, and node default state information corresponding to each operation node includes:
exporting an animation intermediate file in a text format from the animation file after the animation production data is removed;
and analyzing the animation intermediate file to obtain node identifications of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node.
In one possible embodiment, the instructions executed by the processor 41 further include, after the frame supplementing processing on the action sequence:
exporting a texture file corresponding to the animation file, and exporting an animation file in a binary format based on the text-format animation file after frame supplementing;
obtaining an animation running file based on the exported texture file and the animation file in the binary format; the animation running file is used for controlling the playing of the animation in the game.
The present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the animation processing method in the foregoing method embodiments.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the method embodiments, and are not described in detail again in this application.

In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation; for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection of devices or modules through some communication interfaces, and may be in an electrical, mechanical, or other form.
The modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method for processing an animation, the method comprising:
acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
for each action sequence, determining at least one difference node of the action sequence compared with other action sequences based on the node identification of the operation node corresponding to the action sequence and the node identifications of the operation nodes corresponding to other action sequences respectively, wherein the difference node is the operation node corresponding to the execution action not included in the action sequence;
and performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
2. The animation processing method as claimed in claim 1, wherein the performing of the frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node comprises:
and according to the node default state information corresponding to the difference node, performing frame supplementing processing before the first animation frame corresponding to the action sequence.
3. The animation processing method as claimed in claim 2, wherein the performing of frame supplementing processing before the first animation frame corresponding to the action sequence according to the node default state information corresponding to the difference node comprises:
generating an animation frame for the action sequence based on the node default state information of the difference node;
adding a new table entry above the first entry of the animation frame list corresponding to the action sequence;
and writing the generated animation frame into the newly added table entry of the corresponding animation frame list.
4. The animation processing method as claimed in claim 1, wherein the determining at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to other action sequences respectively comprises:
and respectively taking each action sequence except the action sequence as a current comparison sequence, and if the operation node corresponding to the action sequence is determined not to contain the operation node corresponding to the first execution action in the current comparison sequence based on the node identification, taking the operation node corresponding to the first execution action as the at least one difference node.
5. The animation processing method as claimed in claim 1, wherein determining at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to other action sequences respectively comprises:
if the action sequences correspond to a preset execution order, and it is determined based on the node identifiers that the operation nodes corresponding to the action sequence do not include an operation node corresponding to a second execution action in the previous action sequence adjacent to the action sequence in the execution order, taking the operation node corresponding to the second execution action as the at least one difference node.
6. The animation processing method as claimed in claim 1, further comprising:
determining a root operation node from the plurality of operation nodes based on node default state information corresponding to each operation node;
extracting a texture scaling coefficient corresponding to the root operation node from node default state information corresponding to the root operation node;
generating a texture file adjustment coefficient for the animation file based on the texture scaling coefficient of the root operation node; and the texture file adjusting coefficient is used for adjusting the texture file corresponding to the animation file.
7. The animation processing method as claimed in claim 1, wherein before obtaining the node identifiers of the operation nodes respectively corresponding to the at least one action sequence included in the animation file, the method further comprises:
and acquiring the updated animation file.
8. The animation processing method according to claim 1, wherein the obtaining of the node identifiers of the operation nodes respectively corresponding to the at least one action sequence included in the animation file, and the node default state information corresponding to each operation node comprises:
exporting an animation intermediate file in a text format from the animation file after the animation production data is removed;
and analyzing the animation intermediate file to obtain node identifications of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node.
9. The animation processing method as claimed in claim 8, further comprising, after the frame supplementing processing on the action sequence:
exporting a texture file corresponding to the animation file, and exporting an animation file in a binary format based on the text-format animation file after frame supplementing;
obtaining an animation running file based on the exported texture file and the animation file in the binary format; the animation running file is used for controlling the playing of the animation in the game.
10. An animation processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring node identifiers of operation nodes respectively corresponding to at least one action sequence in the animation file and node default state information corresponding to each operation node; each action sequence comprises at least one execution action;
a determining module, configured to determine, for each action sequence, at least one difference node of the action sequence compared with other action sequences based on the node identifier of the operation node corresponding to the action sequence and the node identifiers of the operation nodes corresponding to the other action sequences, where the difference node is an operation node corresponding to an execution action not included in the action sequence;
and the processing module is used for performing frame supplementing processing on the action sequence according to the node default state information corresponding to the difference node.
11. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the animation processing method according to any one of claims 1 to 9.
12. A computer-readable storage medium, having stored thereon a computer program for performing, when executed by a processor, the steps of the animation processing method as claimed in any one of claims 1 to 9.
CN201911191752.XA 2019-11-28 2019-11-28 Animation processing method, device, electronic equipment and storage medium Active CN110992448B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911191752.XA CN110992448B (en) 2019-11-28 2019-11-28 Animation processing method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110992448A true CN110992448A (en) 2020-04-10
CN110992448B CN110992448B (en) 2023-06-30

Family

ID=70088106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911191752.XA Active CN110992448B (en) 2019-11-28 2019-11-28 Animation processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110992448B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113209628A (en) * 2021-05-12 2021-08-06 郑州大学 AI-based image processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108010112A (en) * 2017-11-28 2018-05-08 腾讯数码(天津)有限公司 Animation processing method, device and storage medium
CN108446373A (en) * 2018-03-16 2018-08-24 五八有限公司 Animation playing method, device, equipment and storage medium
CN109544664A (en) * 2018-11-21 2019-03-29 北京像素软件科技股份有限公司 Animation data processing method, device, electronic equipment and readable storage medium storing program for executing
CN110428485A (en) * 2019-07-31 2019-11-08 网易(杭州)网络有限公司 2 D animation edit methods and device, electronic equipment, storage medium

Also Published As

Publication number Publication date
CN110992448B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
CN110766776B (en) Method and device for generating expression animation
CN109727302B (en) Skeleton creation method, device, electronic equipment and storage medium
CN107657650B (en) Animation model role binding method and system based on Maya software
CN109621419B (en) Game character expression generation device and method, and storage medium
CN110163939A (en) Three-dimensional animation role's expression generation method, apparatus, equipment and storage medium
CN111489423B (en) Animation processing method and device, electronic equipment and storage medium
Cooper et al. Active learning for real-time motion controllers
JP2012221318A (en) Scenario generation device and scenario generation program
CN111798550A (en) Method and device for processing model expressions
CN109656595A (en) The method, apparatus and system that client-side program updates
CN110992448A (en) Animation processing method and device, electronic equipment and storage medium
CN115564642A (en) Image conversion method, image conversion device, electronic apparatus, storage medium, and program product
CN112799656A (en) Script file configuration method, device, equipment and storage medium for automation operation
CN116843809A (en) Virtual character processing method and device
US20100013838A1 (en) Computer system and motion control method
CN115984433A (en) Skeleton animation generation method and device, storage medium and electronic equipment
CN114241099A (en) Method and device for batch zeroing of animation data and computer equipment
CN115599793A (en) Method, device and storage medium for updating data
CN112800736B (en) Method, device, medium and computer equipment for generating cell editing assembly
CN110877332B (en) Robot dance file generation method and device, terminal device and storage medium
CN112819931A (en) Animation generation method, animation generation device, terminal and storage medium
US8773441B2 (en) System and method for conforming an animated camera to an editorial cut
CN109947344B (en) Training method and device for application strategy model
CN116071473B (en) Method and system for acquiring animation motion key frame
WO2022269708A1 (en) Information processing device and information processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant