CN112156461A - Animation processing method and device, computer storage medium and electronic equipment - Google Patents

Animation processing method and device, computer storage medium and electronic equipment

Info

Publication number
CN112156461A
CN112156461A (application CN202011091236.2A)
Authority
CN
China
Prior art keywords
animation
fusion
game
configuration information
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011091236.2A
Other languages
Chinese (zh)
Inventor
李文松
周靖人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202011091236.2A
Publication of CN112156461A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface

Abstract

The present disclosure relates to the field of game technology, and provides an animation processing method, an animation processing apparatus, a computer storage medium, and an electronic device. The animation processing method includes: establishing a communication connection with an editing end; receiving fusion configuration information sent by the editing end, where the fusion configuration information includes at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments; and calling the at least two animation segments according to the at least two animation names and fusing them according to the fusion parameters. The animation processing method in the present disclosure decouples the game end from the editing end, is applicable to any game that needs animation fusion debugging, can be combined with a specific animation engine interface to truly reflect the actual in-game effect of animation fusion, and improves both the convenience and the quality of animation fusion.

Description

Animation processing method and device, computer storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to an animation processing method, an animation processing apparatus, a computer storage medium, and an electronic device.
Background
Animation is widely used in many fields as computer, network, and electronic technologies continue to develop. The skeleton of a game character is composed of a plurality of bones organized in a tree structure; each bone records translation, rotation, and scaling relative to its parent node, so the overall pose of the character depends on the pose of every bone. In a game, an animation comprises keyframe data of a character skeleton over a period of time and can be used at runtime to drive changes in the character's motion pose. Animation fusion in a game generally refers to the transition process when a character switches between two actions.
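To make that structure concrete, the following is a minimal sketch of such a skeleton in Python; the field names and the Euler-angle rotation are illustrative assumptions, since the disclosure does not prescribe a data layout:

```python
# A minimal, assumed representation of the tree-organized skeleton described
# above: each bone stores translation, rotation, and scaling relative to its
# parent node, and the character's overall pose follows from every bone.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Bone:
    name: str
    translation: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # relative to parent
    rotation: Tuple[float, float, float] = (0.0, 0.0, 0.0)     # relative to parent
    scale: Tuple[float, float, float] = (1.0, 1.0, 1.0)        # relative to parent
    children: List["Bone"] = field(default_factory=list)

# A tiny two-bone chain: local poses compose down the tree from the root.
pelvis = Bone("pelvis", children=[Bone("spine", translation=(0.0, 0.1, 0.0))])
```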
In the related art, animations are fused with the animation fusion technology built into a game engine. However, on one hand, the animation editing and debugging functions provided by an engine are usually only suitable for that engine and cannot be used across engines, so universality is poor; on the other hand, an engine's animation debugging function is usually fixed and exposes few expansion interfaces, so flexibility is weak and it is difficult for users to add or adjust animation-fusion-related functions in a customized way.
In view of the above, there is a need in the art to develop a new animation processing method and apparatus.
It is to be noted that the information disclosed in the background section above is only used to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure is directed to providing an animation processing method, an animation processing apparatus, a computer storage medium, and an electronic device, so as to overcome, at least to a certain extent, the poor universality and flexibility of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an animation processing method, including: establishing communication connection with an editing end; receiving fusion configuration information sent by the editing end, wherein the fusion configuration information comprises at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments; calling at least two animation segments according to the at least two animation names, and fusing the at least two animation segments according to the fusion parameters; the fusion configuration information is information obtained by performing deserialization processing on a pre-created configuration file, or information received by the editing end.
In an exemplary embodiment of the present disclosure, the establishing a communication connection with an editing end includes: receiving port information sent by the editing end, and sending a communication connection request to the editing end according to the port information; and when a response message returned by the editing end is received, establishing communication connection with the editing end.
In an exemplary embodiment of the present disclosure, the fusion parameters include a fusion mode, and the fusion mode includes a first fusion mode; the fusing of the at least two animation segments according to the fusion parameters includes: determining a target fusion mode according to the fusion parameters; if the target fusion mode is the first fusion mode, determining any one of the at least two animation segments as a target animation; and determining the pose variation of the game character in two adjacent frames of the target animation as the pose variation of the game character in two adjacent frames of the fusion animation.

In an exemplary embodiment of the present disclosure, the fusion mode further includes a second fusion mode, and the method further includes: if the target fusion mode is the second fusion mode, respectively acquiring the pose variation of the game character in two adjacent frames of each animation segment; and performing linear interpolation on the at least two pose variations to obtain a target pose variation, and determining the target pose variation as the pose variation of the game character in two adjacent frames of the fusion animation.
In an exemplary embodiment of the present disclosure, the linearly interpolating at least two pose change amounts to obtain a target pose change amount includes: obtaining the product of each pose variation and the corresponding fusion weight; a preset mapping relation is formed between the pose variation and the fusion weight; the fusion weight varies linearly with time; determining a sum of at least two of the products as the target pose change amount.
According to a second aspect of the present disclosure, there is provided an animation processing method, including: determining whether to load a pre-established configuration file or not according to the received interactive operation instruction; if the configuration file is loaded, sending the fusion configuration information obtained after the deserialization processing is carried out on the configuration file to a game terminal; and if the configuration file is not loaded, sending the received fusion configuration information to the game terminal.
In an exemplary embodiment of the present disclosure, before determining whether to load the pre-created configuration file according to the received interoperation instruction, the method further includes: sending the port information to the game terminal; and receiving a communication connection request sent by the game terminal, and sending a response message to the game terminal.
In an exemplary embodiment of the present disclosure, after establishing a communication connection with the game terminal, the method further includes: acquiring an input keyword; matching the animation name corresponding to the input keyword based on a regular matching method; and sending fusion configuration information containing at least two animation names to the game terminal.
In an exemplary embodiment of the present disclosure, after sending the received converged configuration information to the game end, the method further includes: obtaining optimized fusion parameters; and sending the optimized fusion parameters to the game end so that the game end performs fusion processing on at least two animation segments according to the optimized fusion parameters.
According to a third aspect of the present disclosure, there is provided an animation processing apparatus comprising: the connection module is used for establishing communication connection with the editing end; the receiving module is used for receiving fusion configuration information sent by the editing end, wherein the fusion configuration information comprises at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments; the fusion module is used for calling at least two animation segments according to the at least two animation names and fusing the at least two animation segments according to the fusion parameters; the fusion configuration information is information obtained by performing deserialization processing on a pre-created configuration file, or information received by the editing end.
According to a fourth aspect of the present disclosure, there is provided an animation processing apparatus comprising: the determining module is used for determining whether to load a pre-established configuration file according to the received interactive operation instruction; the deserializing module is used for sending the fusion configuration information obtained after deserializing the configuration file to the game terminal if the configuration file is loaded; and the information input module is used for sending the received fusion configuration information to the game terminal if the configuration file is not loaded.
According to a fifth aspect of the present disclosure, there is provided a computer storage medium having a computer program stored thereon, the computer program, when executed by a processor, implementing the animation processing method according to the first or second aspect.
According to a sixth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the animation processing method according to the first or second aspect via executing the executable instructions.
As can be seen from the foregoing technical solutions, the animation processing method, the animation processing apparatus, the computer storage medium, and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the technical solutions provided in some embodiments of the present disclosure, on one hand, the game end and the editing end are connected by a communication link, so that the two can communicate conveniently while remaining decoupled, and the game end can perform animation processing according to instructions sent by the editing end. In addition, the game end can be replaced by many different target games with only a small amount of code adjustment, so the tool can cover many different game engines, realizing cross-engine use and improving the tool's universality. Furthermore, the fusion configuration information received from the editing end can be information obtained by deserializing a pre-created configuration file at the editing end, or information input by the user as desired; this solves the technical problems in the related art that an engine's animation debugging function is fixed and the fusion effect cannot be customized according to the user's own ideas, making the fusion animation flexible and controllable. On the other hand, the at least two animation segments are called according to the at least two animation names and fused according to the fusion parameters, so the fusion process runs inside the engine of the game end and truly reflects the actual in-game effect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 illustrates a flow diagram of an animation processing method in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a sub-flow diagram of an animation processing method in an exemplary embodiment of the present disclosure;
FIG. 3 illustrates an interface diagram of an animation processing method in an exemplary embodiment of the present disclosure;
FIG. 4 is an interface diagram illustrating an animation processing method according to an exemplary embodiment of the present disclosure;
FIG. 5 is a diagram illustrating a method of animation processing in an exemplary embodiment of the present disclosure;
FIG. 6 is a block diagram of an animation processing system according to an exemplary embodiment of the present disclosure;
FIG. 7 is a general flow diagram illustrating a method of animation processing according to an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic structural diagram of an animation processing apparatus according to an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of an animation processing apparatus according to another exemplary embodiment of the present disclosure;
FIG. 10 shows a schematic diagram of a computer storage medium in an exemplary embodiment of the disclosure;
FIG. 11 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
In the related art, animation fusion is generally performed by the following two schemes:
the method is characterized in that fusion effects are checked by means of animation editing software, such as: autodesk Maya/Autodesk3ds Max/Autodesk MotionBuilder, etc. However, the method by means of animation editing software (Autodesk Maya, etc.) is not actually combined with an animation player of a game engine, and thus a fusion effect in animation editing software cannot reflect an actual effect in a game 100%. Moreover, the animation players of different game engines have certain differences, and the animation playing mechanisms and the fusion effects of the game engines are different, which cannot be well reflected in animation editing software. Therefore, the method has large limitation and cannot solve the debugging problem of animation fusion according to local conditions.
In the second scheme, the animation fusion technology is tied to a specific game engine, and the engine editor supports debugging of animation fusion by providing an editing window. An animation fusion technique built into a game engine can fully reflect the actual in-game effect of animation fusion, but it lacks generality, flexibility, and extensibility. On one hand, the animation editing and debugging functions provided by the engine are usually only suitable for that engine and cannot be used across engines, so universality is poor; on the other hand, the engine's animation debugging function is usually fixed and exposes few expansion interfaces, so flexibility is weak and it is difficult for users to add or adjust animation-fusion-related functions in a customized way.
In the embodiment of the disclosure, firstly, an animation processing method is provided, which overcomes the defects of poor universality and flexibility of the animation processing method provided in the related art at least to some extent.
Fig. 1 is a flowchart illustrating an animation processing method according to an exemplary embodiment of the present disclosure, where an execution subject of the animation processing method may be an animation fusion tool that processes an animation.
Referring to fig. 1, an animation processing method according to one embodiment of the present disclosure includes the steps of:
step S110, establishing communication connection with an editing end;
step S120, receiving fusion configuration information sent by an editing end, wherein the fusion configuration information comprises at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments;
and step S130, calling at least two animation segments according to the at least two animation names, and fusing the at least two animation segments according to the fusion parameters.
In the technical solution provided in the embodiment shown in fig. 1, on one hand, the game end and the editing end are connected by a communication link, so that the two can communicate conveniently while remaining decoupled, and the game end can perform animation processing according to instructions sent by the editing end. In addition, the game end can be replaced by many different target games with only a small amount of code adjustment, so the tool can cover many different game engines, realizing cross-engine use and improving the tool's universality. Furthermore, the fusion configuration information received from the editing end can be information obtained by deserializing a pre-created configuration file at the editing end, or information input by the user as desired; this solves the technical problems in the related art that the engine's own debugging function is fixed and the fusion effect cannot be customized according to the user's own ideas, making the fusion animation flexible and controllable. On the other hand, the at least two animation segments are called according to the at least two animation names and fused according to the fusion parameters, so the fusion process runs inside the engine of the game end and truly reflects the actual in-game effect.
The following describes the specific implementation of each step in fig. 1 in detail:
it should be noted that an animation fusion tool may be created in advance, and the animation fusion tool includes a game side and an editing side. The game end is used for receiving the related instruction sent by the editing end and executing the animation fusion process; the editing end is used for sending an instruction to the game end according to the received information so that the game end can execute the animation fusion process according to the instruction of the user. The game terminal and the editing terminal are combined by using a communication interface as a link, and can communicate through an RPC protocol (Remote Procedure Call, which may also be called as Remote Procedure Call, for short: RPC). The underlying technology is based on socket communications, and the communication protocol is based on a stateless, lightweight remote procedure call transfer protocol (json-RPC). For example, referring to fig. 2, fig. 2 shows a structural diagram of an animation processing method in an exemplary embodiment of the disclosure, and specifically shows a structural diagram of a game end and an editing end in an animation fusion tool, and referring to fig. 2, a messaging structure of the game end may communicate with a messaging interface of the editing end through RPC communication.
Through this RPC communication mode, the editing end and the game end can be separated, so that, on one hand, the tool is independent of the game engine and can be applied across engines. On the other hand, the technical framework of the present disclosure suits almost all games that need animation fusion debugging, and can be combined with a specific animation engine interface to truly reflect the actual in-game effect of animation fusion.
With continued reference to fig. 1, in step S110, a communication connection is established with the editing end.
Specifically, after the animation fusion tool is started, the editing end may send its own port information (for example, an IP (Internet Protocol) address and a port number) to the game end, and the game end may then send a communication connection request to the editing end according to the port information, so that when a response message sent by the editing end is received, the game end establishes a communication connection with the editing end.
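A minimal sketch of this handshake follows, assuming a plain socket transport carrying newline-delimited json-RPC messages; the method name and field names are illustrative assumptions, not the patent's actual wire format:

```python
# Game end as client: connect to the editing end using the IP address and
# port number it sent, then issue a json-RPC connection request and wait
# for the editing end's response message.
import json
import socket

def connect_to_editor(editor_ip: str, editor_port: int) -> socket.socket:
    sock = socket.create_connection((editor_ip, editor_port))
    request = {"jsonrpc": "2.0", "id": 1, "method": "connect",
               "params": {"client": "game"}}
    sock.sendall((json.dumps(request) + "\n").encode("utf-8"))
    reply = json.loads(sock.makefile().readline())  # response message
    if "result" not in reply:  # no positive response: connection fails
        raise ConnectionError(reply.get("error"))
    return sock
```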
After establishing the communication connection with the editing end, the game end can initialize the game interface, the game character, and the animation segments in the animation database, and send an initialization completion message to the editing end. Initialization sets variables to default values and controls to default states, getting everything that is not yet ready into a ready state. Every piece of software, tool, or system has some form of initialization; for example, initializing a system restores it to the state of a backup made at the beginning.
In step S120, fusion configuration information sent by the editing end is received, where the fusion configuration information includes at least two animation names corresponding to at least two animation segments, and a fusion parameter for performing fusion processing on the at least two animation segments.
Exemplarily, a user interface (UI) can be built at the editing end based on PyQt5, and the animation names input by the user are determined through the editing window and the search window of the UI. On one hand, based on the editing window of the UI, the key parameters of animation fusion can be configured conveniently and quickly, so that the fusion effect under different conditions can be observed dynamically; on the other hand, a series of auxiliary functions, such as animation searching, animation information viewing, and quick saving and loading of configuration files, facilitate the debugging of animation fusion. Moreover, the animation fusion tool is simple to use, clear in function, and highly practical.
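As an illustration only, a stripped-down PyQt5 editing window along these lines might look as follows; the real tool's panel is far richer, and all widget and field names here are assumptions:

```python
# A minimal sketch of an editing window built with PyQt5, as the disclosure
# describes; requires PyQt5 to be installed.
import sys
from PyQt5.QtWidgets import (QApplication, QWidget, QLineEdit, QPushButton,
                             QFormLayout)

class EditWindow(QWidget):
    def __init__(self):
        super().__init__()
        self.setWindowTitle("Animation Fusion Tool")
        form = QFormLayout(self)
        self.anim_a = QLineEdit()           # animation name A
        self.anim_b = QLineEdit()           # animation name B
        self.blend_time = QLineEdit("0.3")  # fusion time t, in seconds
        execute = QPushButton("Execute")
        execute.clicked.connect(self.on_execute)
        form.addRow("Animation A", self.anim_a)
        form.addRow("Animation B", self.anim_b)
        form.addRow("Fusion time (s)", self.blend_time)
        form.addRow(execute)

    def on_execute(self):
        # In the real tool this packs the panel parameters into fusion
        # configuration information and sends it to the game end over RPC.
        print(self.anim_a.text(), self.anim_b.text(), self.blend_time.text())

if __name__ == "__main__":
    app = QApplication(sys.argv)
    window = EditWindow()
    window.show()
    sys.exit(app.exec_())
```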
After receiving the initialization completion message sent by the game terminal, the user may input an interactive operation instruction in the edit window of the UI interface to determine whether to load a pre-created configuration file. For example, referring to fig. 3, fig. 3 shows an interface schematic diagram of an animation processing method in an exemplary embodiment of the present disclosure, and specifically shows an interface schematic diagram of the above-mentioned editing window, and the following explains a specific implementation manner in conjunction with fig. 3:
when the user clicks the "load configuration" button in fig. 3, the editing end may load a configuration file created in advance, perform deserialization processing on the configuration file to obtain fusion configuration information, and then automatically fill the fusion configuration information into the editing window of the UI interface, and further, when the user clicks the "execute button", the editing end may send the fusion configuration information to the game end. Alternatively, the user may perform an interactive operation (input or select related information) in the editing window to generate the fusion configuration information, and when the user clicks an "execute button", the editing end may send the fusion configuration information to the game end.
The fusion configuration information may include the following contents:
first, at least two animation names (i.e., animation name a and animation name B in fig. 3) corresponding to at least two animation segments can be both two-dimensional animations or both three-dimensional animations. It should be noted that clicking the "search" button will pop up the animated search interface. In the search box, an animation name or an animation keyword may be input to quickly search for an animation. Because the number of animations involved in some games is huge, the animation segments participating in fusion can be quickly found and designated from the huge number of animations through the function. For example, referring to fig. 4, fig. 4 shows an interface schematic diagram of an animation processing method in an exemplary embodiment of the present disclosure, and specifically shows an interface schematic diagram of the search window, so that a user may input an animation keyword in the search window, and further, the editing end may match an animation name corresponding to the animation keyword based on a regular matching method, and select a target animation name by double-clicking a left mouse button. The user may also enter an animation name in the search window, and may then select the animation name.
And secondly, fusion parameters for performing fusion processing on the at least two animation segments (a configuration sketch is given after this list). The fusion parameters may include:

① Cut-out point: the cut-out point of animation A. For more intuitive setting, the cut-out point is configured as a percentage of the animation time, so its value interval is [0, 1], where 0 represents the start time of the animation and 1 represents the end time;

② Fusion time (denoted t, unit: seconds): the time during which animation A and animation B coexist. During the fusion time, animations A and B jointly influence the action and posture of the game character, and how much each influences the character's posture depends on the fusion mode. When the fusion time ends, the weight of animation A is 0% and the weight of animation B is 100%;

③ Cut-in point: the cut-in point of animation B, likewise set as a percentage of the animation time, with value interval [0, 1].

For example, referring to fig. 5, fig. 5 shows a schematic diagram of an animation processing method in an exemplary embodiment of the present disclosure, specifically the relationship among the cut-out point, the fusion time, and the cut-in point. Referring to fig. 5, the fusion time is the time during which the previous animation and the next animation coexist. In fig. 5, the game character first moves driven by animation A and shows A's action posture; when animation A plays to its cut-out point (by default, the end time of animation A), animation B begins to blend in, and during the fusion time A and B coexist on the character and jointly influence its action and posture; when the fusion time ends, animation A has fully blended out, animation B has fully blended in, and the character continues playing animation B. In general, the cut-in point minus the fusion time should not be earlier than the start time of the animation, and the cut-out point should not be later than the end time of the animation.
④ Fusion mode: the way the animations are fused. The fusion modes provided in the present disclosure may include a first fusion mode (using the displacement and orientation of animation A, or the displacement and orientation of animation B) and a second fusion mode (linear interpolation). The present disclosure supports selecting any one of these three fusion options and allows more fusion modes to be added as required.

⑤ Loop playback: whether to play in a loop. If yes, the fusion effect of animations A and B is played repeatedly during execution; otherwise it is played only once.

⑥ Animation information: clicking the query button pops up an animation information display window, which shows the default cut-in point, cut-out point, animation duration, and so on of animations A and B during fusion, making it convenient to check the default parameters of the animation data.

⑦ Use default values: clicking the "set" button sets the cut-out point of animation A and the cut-in point of animation B according to the animation's original default data and updates the parameters in the UI.

⑧ Reset on execution: sets whether the game environment needs to be reset during execution. If yes, the game environment and the position of the game character are reset first, and the animation fusion operation is executed afterwards; otherwise the animation fusion operation is executed directly.

⑨ Other settings: other non-critical configurations can be set here, including UI styles, personalization settings of the tool, and the like.

⑩ Default save path: sets the default save path of the configuration file.
⑪ Save scheme to file: clicking the "save scheme to file" button calls the configuration file saving module to save the parameters in the UI to a configuration file as a json file; the save path is the default save path set above.

⑫ Load configuration: clicking the "load" button allows a saved configuration file to be selected and loaded. The underlying layer reads and parses the configuration file and updates the configuration parameters to the UI.

⑬ Execute button: clicking the "execute" button sends the execution instruction and the panel parameters to the game end. The game end receives the instruction and extracts the parameters, and finally the fusion effect on the character is displayed in the game interface.
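As referenced in the list above, the following sketch shows what the fusion configuration information might look like as a json-style structure, together with the value-range checks implied by ① to ③. All key names and sample values are illustrative assumptions; the patent fixes only the json file format, not the keys:

```python
# A sketch of the fusion configuration information, following the panel
# fields above (only a subset of the fields is shown).
fusion_config = {
    "animation_a": "hero_run_loop",  # animation name A (made-up name)
    "animation_b": "hero_idle",      # animation name B (made-up name)
    "cut_out": 1.0,      # ① cut-out point of A, fraction of A's duration in [0, 1]
    "blend_time": 0.3,   # ② fusion time t, in seconds
    "cut_in": 0.0,       # ③ cut-in point of B, fraction of B's duration in [0, 1]
    "mode": "lerp",      # ④ follow A, follow B, or linear interpolation
    "loop": True,        # ⑤ whether to play the fused result in a loop
    "reset_on_execute": True,  # ⑧ reset the game environment before executing
}

def validate(cfg: dict) -> None:
    assert 0.0 <= cfg["cut_out"] <= 1.0, "cut-out point must lie in [0, 1]"
    assert 0.0 <= cfg["cut_in"] <= 1.0, "cut-in point must lie in [0, 1]"
    assert cfg["blend_time"] > 0.0, "fusion time must be positive"

validate(fusion_config)
```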
With continued reference to fig. 1, in step S130, at least two animation segments are called according to at least two animation names, and are subjected to a fusion process according to a fusion parameter.
After receiving the fusion configuration information, the game terminal may call at least two animation segments according to the at least two animation names, and perform fusion processing on the at least two animation segments according to the fusion parameters.
Specifically, referring to the relevant explanation of step S120, when the target fusion mode determined according to the fusion parameters (i.e. the target fusion mode selected by the user or preset in the configuration file) is the first fusion mode using the displacement and orientation of animation A, animation A can be determined as the target animation, the pose variation of the game character in two adjacent frames of the target animation can be acquired, and this pose variation can be determined as the pose variation of the game character in two adjacent frames of the fusion animation. For example, taking the pose variation as including a displacement variation and an orientation variation: when the time interval between two adjacent frames in the game is dt (for example, in a 30-fps game the frame interval is 1/30 second, approximately 0.033 seconds), and the displacement variation and orientation variation of animation A over dt are s1 and y1 respectively, then in the generated fusion animation the displacement variation of the game character in two adjacent frames is s = s1 and the orientation variation is y = y1.

When the target fusion mode determined according to the fusion parameters is the first fusion mode using the displacement and orientation of animation B, animation B can be determined as the target animation, and the pose variation of the game character in two adjacent frames of the target animation can be determined as that of the fusion animation. Referring to the explanation above, when the displacement variation and orientation variation of animation B over dt are s2 and y2 respectively, the displacement variation of the game character in two adjacent frames of the generated fusion animation is s = s2 and the orientation variation is y = y2.

When the target fusion mode determined according to the fusion parameters is the second fusion mode (linear interpolation), the pose variation of the game character in two adjacent frames of each animation segment can be acquired respectively; linear interpolation is performed on the at least two pose variations to obtain a target pose variation, and the target pose variation is determined as the pose variation of the game character in two adjacent frames of the fusion animation.
For example, with continued reference to the related explanation of the above steps, when the pose change amount includes a displacement change amount and an orientation change amount, the target displacement change amount may be obtained by linearly interpolating at least two displacement change amounts, the target orientation change amount may be obtained by linearly interpolating at least two orientation change amounts, the target displacement change amount may be determined as the displacement change amount of the game character in two adjacent frames of the fusion animation, and the target orientation change amount may be determined as the orientation change amount of the game character in two adjacent frames of the fusion animation.
Specifically, the process of performing linear interpolation on at least two displacement variations to obtain the target displacement variation may be: acquire the displacement variation s1 of the game character in two adjacent frames of animation A and the displacement variation s2 of the game character in two adjacent frames of animation B; then obtain a first product of s1 and the fusion weight w1 and a second product of s2 and the fusion weight w2, and determine the sum of the first and second products as the target displacement variation s, i.e. s = s1 × w1 + s2 × w2. The target displacement variation can thus be determined as the displacement variation of the game character in two adjacent frames of the fusion animation.

The process of performing linear interpolation on the at least two orientation variations to obtain the target orientation variation may be: acquire the orientation variation y1 of the game character in two adjacent frames of animation segment A and the orientation variation y2 of the game character in two adjacent frames of animation segment B; then obtain a third product of y1 and the fusion weight w1 and a fourth product of y2 and the fusion weight w2, and determine the sum of the third and fourth products as the target orientation variation y, i.e. y = y1 × w1 + y2 × w2. The target orientation variation can thus be determined as the orientation variation of the game character in two adjacent frames of the fusion animation.

The fusion weights w1 and w2 may vary linearly with time. For example, initially w1 = 0 and w2 = 1 - w1; as the fusion progresses, w1 increases linearly and w2 decreases linearly, specifically w1(next frame) = w1(previous frame) + (1/t) × dt and w2 = 1 - w1; at the end of the fusion, w1 = 1 and w2 = 0.
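The following sketch puts the three fusion options and the per-frame weight update together. As an assumption consistent with the fusion-time description above (animation A's weight ends at 0% and B's at 100%), the ramping weight is attached to the incoming animation B and called w_b here, with A's weight its complement; pose deltas are reduced to scalars for brevity, where a real engine would blend full bone transforms:

```python
# One frame of fusion. delta_a/delta_b are (displacement, orientation)
# changes of animations A and B over dt; w_b is B's current fusion weight.
def blend_step(delta_a, delta_b, w_b, t, dt, mode):
    s1, y1 = delta_a
    s2, y2 = delta_b
    if mode == "follow_a":       # first fusion mode: A's displacement/orientation
        s, y = s1, y1
    elif mode == "follow_b":     # first fusion mode: B's displacement/orientation
        s, y = s2, y2
    elif mode == "lerp":         # second fusion mode: linear interpolation
        w_a = 1.0 - w_b          # weights vary linearly and sum to 1
        s = s1 * w_a + s2 * w_b  # s = s1 * w1 + s2 * w2
        y = y1 * w_a + y2 * w_b  # y = y1 * w1 + y2 * w2
    else:
        raise ValueError(mode)
    w_b_next = min(1.0, w_b + dt / t)  # linear ramp: +(1/t) * dt per frame
    return s, y, w_b_next

# Example: halfway through a 0.3-second fusion in a 30-fps game.
print(blend_step((0.1, 2.0), (0.0, 0.0), 0.5, 0.3, 1 / 30, "lerp"))
```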
After the at least two animation segments are fused according to the fusion parameters, the game end can play and display the fusion animation. Thus, the user can visually see the fusion effect.
If the fusion effect does not meet expectations, the user can directly modify the fusion configuration information in the editing window of the UI, replacing the previous fusion configuration information with optimized fusion parameters, and can then click the "save scheme to file" button of the UI to save the newly input information as a json file. After the execute button is clicked, the editing end sends the optimized fusion parameters to the game end, so that the game end re-fuses the at least two animation segments according to the optimized fusion parameters. Similarly, if the fusion effect is still not as expected, the user may change the parameters again until the effect is satisfactory.
It should be noted that, based on the animation fusion tool in the present disclosure, a user only needs to adjust the UI of the editing end and the instruction execution logic of the game end to complete customized or personalized function requirements.
The present disclosure also provides an animation processing system, which may refer to fig. 6 by way of example, fig. 6 shows a schematic structural diagram of an animation processing system in an exemplary embodiment of the present disclosure, and a specific implementation is explained below with reference to fig. 6.
The game side 601 and the editing side 602 each comprise a messaging interface. The message receiving and sending interface is used as an intermediate layer of the game end and the editing end, plays a role of a bridge and carries out two-way communication in an RPC mode. The editing end can not only conveniently modify and set animation fusion related parameters, but also send instructions and parameters to the game end. And after receiving the message, the game end invokes an internal processing unit to process and display the effect of animation fusion. In addition, the game terminal can also transmit the message to the editing terminal, thereby realizing the synchronization of the information.
Specifically, the game side may include a messaging unit 6011, an instruction processing unit 6012, an initialization component 6013, an animation playback component 6014, an animation information management unit 6015, and other components 6016.
Messaging unit 6011 of the game end: used for receiving the instructions and parameters transmitted from the editing end and passing them on to the instruction processing unit 6012 for processing, completing functions such as information transfer and result feedback. Specifically, the game end acts as the client: when started, it actively initiates a communication connection request to the editing end according to the IP address and port number sent by the editing end, and the two ends then establish a communication connection.
Instruction processing unit 6012: used for parsing and processing the messages and instructions sent by the editing end. Specifically, when the game end starts, the initialization component 6013 is automatically called to create a basic game character and game scene, and the required animation data is loaded through the animation information management unit 6015. Then, by parsing and extracting the specific instructions and parameters transmitted from the editing end, the mapping between instructions and logic is completed automatically, so that the corresponding logic processing is performed. When the fusion processing is completed, the animation playback component 6014 is called to present the fusion effect.
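A sketch of that instruction-to-logic mapping follows; the instruction names are made up, and print statements stand in for the calls into the initialization and playback components:

```python
# An assumed dispatch table mapping instruction names from the editing end
# to game-end handlers, as the instruction processing unit performs it.
class InstructionProcessor:
    def __init__(self):
        self.handlers = {"init": self.on_init, "blend": self.on_blend}

    def dispatch(self, message: dict):
        """Parse the instruction and parameters and run the mapped logic."""
        handler = self.handlers[message["method"]]
        return handler(message.get("params", {}))

    def on_init(self, params: dict):
        print("initialize character, scene, and animation data:", params)

    def on_blend(self, params: dict):
        print("call the animation playback component with:", params)

proc = InstructionProcessor()
proc.dispatch({"method": "blend", "params": {"animation_a": "hero_run_loop"}})
```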
Initialization component 6013: provides a series of initialization interfaces to the instruction processing unit 6012, including initialization logic for game characters, game scenes, and animation data. The component wraps the game's initialization interfaces in one internal layer, so the initialization logic can be completed conveniently at the game end when in use.
Animation playback component 6014: the interface is used for providing the interface relevant to animation playing and animation fusion for the instruction processing unit. Although the animation playing interfaces of different game engines are slightly different, the animation playing component completes interface packaging and provides a simple calling interface for an external module. When executing the animation fusion instruction, the instruction processing unit 6012 may show the actual effect of fusing the animation by transferring the key parameters of the animation fusion and calling the relevant interface of this module.
Animation information management unit 6015: used for saving and managing all animation data in the animation database. All animation data in the present disclosure is stored in files in XML (Extensible Markup Language) format. For animation fusion, the basic information of each animation includes: the total animation duration, the default cut-in point during fusion, the default cut-out point during fusion, and the like. For unified management, the module internally contains an XML file parsing component, so that the basic information of each animation can be read and a data structure can be constructed for convenient use by the program.
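A sketch of how such a parsing component might read the basic information of each animation follows; the element and attribute names are assumptions, since the patent specifies only the XML format and the fields listed above:

```python
# Read per-animation basic information (duration and default cut points)
# from an XML animation database into a name -> info mapping.
import xml.etree.ElementTree as ET

SAMPLE = """<animations>
  <animation name="hero_run_loop" duration="1.2"
             default_cut_in="0.0" default_cut_out="1.0"/>
</animations>"""

def load_animation_info(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {
        node.get("name"): {
            "duration": float(node.get("duration")),
            "default_cut_in": float(node.get("default_cut_in")),
            "default_cut_out": float(node.get("default_cut_out")),
        }
        for node in root.iter("animation")
    }

print(load_animation_info(SAMPLE)["hero_run_loop"]["duration"])  # 1.2
```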
Other components 6016: other components of the game end are included that are not relevant to the core content of the present disclosure.
The editing end 602 may include a messaging unit 6021, a UI interface 6022, an animation search unit 6023, a configuration file saving and loading unit 6024, and a configuration file 6025.
The messaging unit of editing end 6021: and the animation fusion logic is used for sending instructions and parameters to the game end to drive the execution of the animation fusion logic at the game end. Specifically, when a game is started, the ip address and the port number are transmitted to the game terminal in the form of starting parameters, and a communication connection request sent by the game terminal is received, so that communication connection is established with the game terminal. The bottom layer is based on socket communication, and the protocol is based on json-RPC. In order to facilitate calling and program realization, the upper layer is packaged into an RPC calling interface, so that the editing end and the game end can conveniently carry out two-way communication.
UI interface 6022: comprising an edit window and a search window and a series of functionality controls. In the editing window, the names of the animations participating in the fusion and the fusion parameters thereof can be dynamically set.
Animation search unit 6023: provides the underlying processing mechanism for animation selection and searching. The animation search unit maintains a list of animation names covering all animation names in the game. The search function is implemented with regular-expression matching of character strings: the animation keyword in the search box is the input, and the matched animation names are displayed in the search window.
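A minimal sketch of this mechanism, with a made-up name list:

```python
# Case-insensitive regular-expression match of a search-box keyword
# against the maintained list of animation names.
import re

ANIMATION_NAMES = ["hero_run_loop", "hero_walk_loop", "hero_attack_01",
                   "hero_idle", "boss_attack_01"]

def search_animations(keyword: str) -> list:
    pattern = re.compile(keyword, re.IGNORECASE)
    return [name for name in ANIMATION_NAMES if pattern.search(name)]

print(search_animations("attack"))  # ['hero_attack_01', 'boss_attack_01']
```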
Configuration file saving and loading unit 6024: provides the function of saving the fusion parameters in the UI as a configuration file and loading them back. The configuration file is in json format; the saving and loading module imports the json module of Python 3, saves the fusion parameters to the configuration file with json.dump, and restores them with json.load. Through this unit, the tedious process of manually re-entering the configuration over and over is avoided, the efficiency of the animation fusion tool is greatly improved, and a quick way of recording and analyzing problems is provided.
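A sketch of the unit under those statements, using json.dump and json.load as described; the path and dict contents are illustrative:

```python
# Save the panel's fusion parameters to a json configuration file and
# load them back (deserialization into fusion configuration information).
import json

def save_config(cfg: dict, path: str) -> None:
    with open(path, "w", encoding="utf-8") as f:
        json.dump(cfg, f, ensure_ascii=False, indent=2)

def load_config(path: str) -> dict:
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

save_config({"animation_a": "hero_run_loop", "blend_time": 0.3},
            "blend_scheme.json")
print(load_config("blend_scheme.json"))
```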
Configuration file 6025: for saving a collection of all configuration files.
The technical solution of the present disclosure makes extensive use of modular, unit-based design, which helps improve code debugging efficiency and makes later code maintenance convenient. In addition, based on the animation fusion tool in the present disclosure, a user (a related developer) can conveniently extend, port, and modify the tool. First, function extension is convenient; second, the porting cost is low: each unit of the editing end can be reused almost intact during porting, and only the module interfaces of the game end need to be adapted to different games or engines, so the animation fusion tool can be connected to different games as appropriate, realizing cross-engine use of the tool.
Exemplarily, referring to fig. 7, fig. 7 shows an overall flowchart of an animation processing method in an exemplary embodiment of the disclosure, which includes steps S701 to S712; a specific implementation is explained below with reference to fig. 7.
In step S701, start;
in step S702, an animation fusion tool is started;
in step S703, the game terminal establishes a communication connection with the editing terminal;
in step S704, the game end initializes the game interface, the game character and the animation database;
in step S705, determining whether to load a configuration file according to an interactive operation instruction of a user;
in step S706, if yes, the configuration file is loaded, and step S707 is executed to automatically write the fusion configuration information in the configuration file into the UI interface;

if not, step S707 is executed to obtain the fusion configuration information input by the user on the UI interface of the editing end;
in step S708, when the user clicks the execute button, the editing end sends the fusion configuration information to the game end;
in step S709, the game end receives the fusion configuration information;
in step S710, the game side performs animation fusion processing according to the fusion configuration information;
in step S711, the user checks the fusion effect at the game end;
in step S712, the process ends.
Based on the above technical solution, on one hand, the game end can be replaced by many different target games with only a small amount of code adjustment, so the tool can cover many different game engines, realizing cross-engine use and improving the tool's universality. Furthermore, the technical problems in the related art that the engine's built-in animation debugging function is relatively fixed and the fusion effect cannot be customized according to the user's own ideas are solved, making the fusion animation flexible and controllable. On the other hand, the game fusion process can be carried out in the engine of the game end, truly reflecting the actual in-game effect.
The present disclosure also provides an animation processing apparatus, and fig. 8 shows a schematic structural diagram of the animation processing apparatus in an exemplary embodiment of the present disclosure; as shown in fig. 8, the animation processing apparatus 800 may include a connection module 801, a reception module 802, and a fusion module 803. Wherein:
and a connection module 801, configured to establish a communication connection with the editing end.
In an exemplary embodiment of the present disclosure, the connection module is further configured to receive port information sent by the editing end, and send a communication connection request to the editing end according to the port information; and when a response message returned by the editing end is received, establishing communication connection with the editing end.
The receiving module 802 is configured to receive fusion configuration information sent by an editing end, where the fusion configuration information includes at least two animation names corresponding to at least two animation segments, and a fusion parameter for performing fusion processing on the at least two animation segments.
And the fusion module 803 is configured to call at least two animation segments according to the at least two animation names, and perform fusion processing on the at least two animation segments according to the fusion parameters.
In an exemplary embodiment of the present disclosure, the fusion parameters include a fusion mode, and the fusion mode includes a first fusion mode; the fusing of the at least two animation segments according to the fusion parameters includes: determining a target fusion mode according to the fusion parameters; if the target fusion mode is the first fusion mode, determining any one of the at least two animation segments as a target animation; and determining the pose variation of the game character in two adjacent frames of the target animation as the pose variation of the game character in two adjacent frames of the fusion animation.

In an exemplary embodiment of the present disclosure, the fusion mode further includes a second fusion mode, and the method further includes: if the target fusion mode is the second fusion mode, respectively acquiring the pose variation of the game character in two adjacent frames of each animation segment; and performing linear interpolation on the at least two pose variations to obtain a target pose variation, and determining the target pose variation as the pose variation of the game character in two adjacent frames of the fusion animation.
In an exemplary embodiment of the disclosure, the fusion module is further configured to obtain the product of each pose variation and its corresponding fusion weight, where a preset mapping relation exists between the pose variation and the fusion weight and the fusion weight varies linearly with time; and determine the sum of the at least two products as the target pose variation.
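A minimal sketch of this second fusion mode for two clips, again reducing pose variations to translation deltas and assuming complementary weights (1-w and w, summing to 1) that ramp linearly over a blend window, consistent with the linearly time-varying fusion weight described above; all names are illustrative:

```python
from typing import List, Tuple

PoseDelta = Tuple[float, float, float]

def fuse_interpolate(a: List[PoseDelta], b: List[PoseDelta],
                     blend_frames: int) -> List[PoseDelta]:
    """Second fusion mode: per-frame weighted sum of the two clips' pose variations."""
    fused = []
    for i in range(min(len(a), len(b))):
        w = min(i / max(blend_frames - 1, 1), 1.0)   # weight of clip b ramps linearly from 0 to 1
        fused.append(tuple((1.0 - w) * x + w * y     # product of each variation and its weight, then summed
                           for x, y in zip(a[i], b[i])))
    return fused
```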
The specific details of each module in the animation processing apparatus have been described in detail in the corresponding animation processing method, and therefore are not described herein again.
The present disclosure also provides an animation processing apparatus. Fig. 9 shows a schematic structural diagram of the animation processing apparatus in an exemplary embodiment of the present disclosure; as shown in fig. 9, the animation processing apparatus 900 may include a determining module 901, a deserialization module 902, and an information input module 903. Wherein:
A determining module 901, configured to determine whether to load a pre-created configuration file according to a received interactive operation instruction.
In an exemplary embodiment of the disclosure, the determining module is further configured to send port information to the game end, receive a communication connection request sent by the game end, and send a response message to the game end.
In an exemplary embodiment of the disclosure, the determining module is further configured to obtain an input keyword, match the animation names corresponding to the input keyword based on regular-expression matching, and send fusion configuration information containing at least two animation names to the game end.
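A minimal sketch of this keyword matching on the editing end, assuming Python's re module and a simple in-memory name list standing in for the tool's actual asset index:

```python
import re
from typing import List

def match_animation_names(keyword: str, catalogue: List[str]) -> List[str]:
    """Return every catalogued animation name matched by the keyword pattern."""
    pattern = re.compile(keyword, re.IGNORECASE)
    return [name for name in catalogue if pattern.search(name)]

# e.g. match_animation_names("run", ["run_fwd", "walk_loop", "run_stop"])
# -> ["run_fwd", "run_stop"]
```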
A deserialization module 902, configured to, if the configuration file is loaded, send the fusion configuration information obtained by deserializing the configuration file to the game end.
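A minimal sketch of this deserialization step; the disclosure does not fix a serialization format, so JSON is an illustrative assumption here (pickle, YAML, or a binary format would serve the same role):

```python
import json

def load_fusion_config(path: str) -> dict:
    """Deserialize a pre-created configuration file into fusion configuration information."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)  # e.g. {"animation_names": [...], "fusion_mode": ..., "weights": [...]}
```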
An information input module 903, configured to, if the configuration file is not loaded, send the received fusion configuration information to the game end.
In an exemplary embodiment of the present disclosure, the information input module is further configured to obtain optimized fusion parameters and send the optimized fusion parameters to the game end, so that the game end performs fusion processing on the at least two animation segments according to the optimized fusion parameters.
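A minimal sketch of re-sending adjusted parameters over the already established connection, completing the iterative tuning loop; the length-prefixed JSON framing is an assumption of the sketch, not part of this disclosure:

```python
import json
import socket

def send_optimized_params(sock: socket.socket, params: dict) -> None:
    """Push optimized fusion parameters to the game end for another fusion pass."""
    payload = json.dumps(params).encode("utf-8")
    sock.sendall(len(payload).to_bytes(4, "big") + payload)  # 4-byte length prefix, then the body
```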
The specific details of each module in the animation processing apparatus have been described in detail in the corresponding animation processing method, and therefore are not described herein again.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be broken down into multiple steps for execution.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer storage medium capable of implementing the above method, on which a program product capable of implementing the above-described method of the present specification is stored. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to the various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 10, a program product 1000 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, various aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1100 according to this embodiment of the disclosure is described below with reference to fig. 11. The electronic device 1100 shown in fig. 11 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present disclosure.
As shown in fig. 11, the electronic device 1100 is embodied in the form of a general purpose computing device. The components of the electronic device 1100 may include, but are not limited to: at least one processing unit 1110, at least one storage unit 1120, a bus 1130 connecting different system components (including the storage unit 1120 and the processing unit 1110), and a display unit 1140.
The storage unit stores program code that can be executed by the processing unit 1110, so that the processing unit 1110 performs the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above of this specification. For example, the processing unit 1110 may perform the following steps as shown in fig. 1: step S110, establishing a communication connection with an editing end; step S120, receiving fusion configuration information sent by the editing end, wherein the fusion configuration information includes at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments; and step S130, calling the at least two animation segments according to the at least two animation names, and performing fusion processing on the at least two animation segments according to the fusion parameters; the fusion configuration information is information obtained by deserializing a pre-created configuration file, or information received by the editing end.
The storage unit 1120 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 11201 and/or a cache memory unit 11202, and may further include a read-only memory unit (ROM) 11203.
Storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1130 may be representative of one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1200 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1100, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1100 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 1150. Also, the electronic device 1100 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1160. As shown, the network adapter 1160 communicates with the other modules of the electronic device 1100 over the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (13)

1. An animation processing method, comprising:
establishing communication connection with an editing end;
receiving fusion configuration information sent by the editing end, wherein the fusion configuration information comprises at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments;
calling at least two animation segments according to the at least two animation names, and fusing the at least two animation segments according to the fusion parameters;
the fusion configuration information is information obtained by performing deserialization processing on a pre-created configuration file, or information received by the editing end.
2. The method according to claim 1, wherein the establishing a communication connection with the editing end comprises:
receiving port information sent by the editing end, and sending a communication connection request to the editing end according to the port information;
and when a response message returned by the editing end is received, establishing the communication connection with the editing end.
3. The method of claim 1 or 2, wherein the fusion parameters include a fusion mode; the fusion mode comprises a first fusion mode;
the fusion processing of the at least two animation segments according to the fusion parameters comprises:
determining a target fusion mode according to the fusion parameters;
if the target fusion mode is the first fusion mode, determining any animation segment of the at least two animation segments as a target animation;
and determining the pose variation of the game character in the two adjacent frames of the target animation as the pose variation of the game character in the two adjacent frames of the fusion animation.
4. The method of claim 3, wherein the fusion mode further comprises a second fusion mode, the method further comprising:
if the target fusion mode is the second fusion mode, respectively acquiring the pose variation of the game character in two adjacent frames of each animation segment;
and performing linear interpolation on the at least two pose variations to obtain a target pose variation, and determining the target pose variation as the pose variation of the game character in two adjacent frames of the fusion animation.
5. The method of claim 4, wherein performing linear interpolation on the at least two pose variations to obtain the target pose variation comprises:
obtaining the product of each pose variation and its corresponding fusion weight, wherein a preset mapping relation exists between the pose variation and the fusion weight, and the fusion weight varies linearly with time;
and determining the sum of the at least two products as the target pose variation.
6. An animation processing method, comprising:
determining whether to load a pre-established configuration file or not according to the received interactive operation instruction;
if the configuration file is loaded, sending the fusion configuration information obtained by deserializing the configuration file to a game end;
and if the configuration file is not loaded, sending the received fusion configuration information to the game end.
7. The method of claim 6, wherein before determining whether to load the pre-created configuration file according to the received interactive operation instruction, the method further comprises:
sending port information to the game end;
and receiving a communication connection request sent by the game end, and sending a response message to the game end.
8. The method of claim 7, wherein after a communication connection is established with the game end, the method further comprises:
acquiring an input keyword;
matching the animation names corresponding to the input keyword based on regular-expression matching;
and sending fusion configuration information containing at least two animation names to the game end.
9. The method of claim 6, wherein after sending the received fusion configuration information to the game end, the method further comprises:
obtaining optimized fusion parameters;
and sending the optimized fusion parameters to the game end, so that the game end performs fusion processing on the at least two animation segments according to the optimized fusion parameters.
10. An animation processing apparatus, comprising:
a connection module, configured to establish a communication connection with an editing end;
a receiving module, configured to receive fusion configuration information sent by the editing end, wherein the fusion configuration information comprises at least two animation names corresponding to at least two animation segments and fusion parameters for performing fusion processing on the at least two animation segments;
a fusion module, configured to call the at least two animation segments according to the at least two animation names, and perform fusion processing on the at least two animation segments according to the fusion parameters;
the fusion configuration information is information obtained by performing deserialization processing on a pre-created configuration file, or information received by the editing end.
11. An animation processing apparatus, comprising:
a determining module, configured to determine whether to load a pre-created configuration file according to a received interactive operation instruction;
a deserialization module, configured to, if the configuration file is loaded, send the fusion configuration information obtained by deserializing the configuration file to a game end;
and an information input module, configured to, if the configuration file is not loaded, send the received fusion configuration information to the game end.
12. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the animation processing method as claimed in any one of claims 1 to 9.
13. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the animation processing method of any one of claims 1-9 via execution of the executable instructions.
CN202011091236.2A 2020-10-13 2020-10-13 Animation processing method and device, computer storage medium and electronic equipment Pending CN112156461A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011091236.2A CN112156461A (en) 2020-10-13 2020-10-13 Animation processing method and device, computer storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN112156461A true CN112156461A (en) 2021-01-01

Family

ID=73866694

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011091236.2A Pending CN112156461A (en) 2020-10-13 2020-10-13 Animation processing method and device, computer storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112156461A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090009520A1 (en) * 2005-04-11 2009-01-08 France Telecom Animation Method Using an Animation Graph
US20100118034A1 (en) * 2008-11-13 2010-05-13 Jin-Young Kim Apparatus and method of authoring animation through storyboard
CN102254335A (en) * 2011-07-01 2011-11-23 厦门吉比特网络技术股份有限公司 System and method for editing game characters
US20150178988A1 (en) * 2012-05-22 2015-06-25 Telefonica, S.A. Method and a system for generating a realistic 3d reconstruction model for an object or being
CN105242931A (en) * 2015-10-15 2016-01-13 福建天晴数码有限公司 Method and system for editing and generating codes for game interface
JP2020096660A (en) * 2018-12-17 2020-06-25 株式会社カプコン Game animation editing program and game animation editing system
CN109710244A (en) * 2018-12-29 2019-05-03 苏州玩友时代科技股份有限公司 Customized animation configuration method and device, equipment and storage medium
CN111553967A (en) * 2020-04-26 2020-08-18 苏州沁游网络科技有限公司 Unity-based animation resource file production method, module and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991248A (en) * 2021-03-10 2021-06-18 维沃移动通信有限公司 Image processing method and device
CN113658300A (en) * 2021-08-18 2021-11-16 北京百度网讯科技有限公司 Animation playing method and device, electronic equipment and storage medium
CN113791821A (en) * 2021-09-18 2021-12-14 广州博冠信息科技有限公司 Animation processing method, device, medium and electronic equipment based on illusion engine
CN113791821B (en) * 2021-09-18 2023-11-17 广州博冠信息科技有限公司 Animation processing method and device based on illusion engine, medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination