Disclosure of Invention
Embodiments of the present disclosure provide at least an animation processing method, an animation processing apparatus, a computer device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an animation processing method, including:
acquiring a target animation object that is being executed, and determining an animation type corresponding to the target animation object;
extracting, by using a visualization tool, animation parameter information corresponding to a parameter type to be analyzed from the target animation object, based on the parameter type to be analyzed corresponding to the animation type; and
displaying the extracted animation parameter information through the visualization tool.
In an alternative embodiment, acquiring a target animation object that is being executed includes:
acquiring the target animation object being executed by monitoring an animation-addition event on a view layer of a view.
In an optional implementation, extracting, by using a visualization tool, the animation parameter information corresponding to the parameter type to be analyzed from the target animation object based on the parameter type to be analyzed corresponding to the animation type includes:
if the animation type is a basic animation or a keyframe animation, extracting curve parameter information of the animation from the target animation object by using the visualization tool; and
if the animation type is a physical animation, extracting physical parameter information of the animation from the target animation object by using the visualization tool.
In an alternative embodiment, acquiring a target animation object that is being executed includes:
acquiring at least one target animation object corresponding respectively to at least one view;
and displaying the analyzed animation parameter information through the visualization tool includes:
clustering the animation parameter information of the target animation objects obtained by analysis according to the view information corresponding to each target animation object to generate a clustering result, wherein the clustering result includes a view list, an animation list as the level below the view list, and animation details as the level below the animation list, the animation details including the animation parameter information; and
displaying the animation parameter information according to the clustering result.
In an optional embodiment, displaying the animation parameter information according to the clustering result includes:
displaying the view list;
in response to a selection operation on target view information in the view list, displaying an animation list associated with the target view information; and
in response to a selection operation on a target animation in the animation list, displaying the animation parameter information of the target animation.
In an optional implementation, after displaying the animation list associated with the target view information, the method further includes:
in response to a deletion operation on the target view information, deleting the target view information together with the animation list and the animation details associated with it.
In an optional embodiment, after displaying the extracted animation parameter information through the visualization tool, the method further includes:
in response to an animation parameter information saving instruction, saving the animation parameter information in a curve preset library; or
in response to an animation parameter information adjusting instruction, adjusting the animation parameter information and saving the adjusted animation parameter information in the curve preset library.
In a second aspect, an embodiment of the present disclosure further provides an animation processing apparatus, including:
an animation type identification module, configured to acquire a target animation object that is being executed, and determine an animation type corresponding to the target animation object;
an analysis module, configured to extract, by using a visualization tool, animation parameter information corresponding to a parameter type to be analyzed from the target animation object, based on the parameter type to be analyzed corresponding to the animation type; and
a display module, configured to display the analyzed animation parameter information through the visualization tool.
In a possible embodiment, when acquiring the target animation object that is being executed, the animation type identification module is configured to:
acquire the target animation object being executed by monitoring an animation-addition event on a view layer of a view.
In a possible embodiment, when extracting, by using the visualization tool, the animation parameter information corresponding to the parameter type to be analyzed from the target animation object based on the parameter type to be analyzed corresponding to the animation type, the analysis module is configured to:
if the animation type is a basic animation or a keyframe animation, extract curve parameter information of the animation from the target animation object by using the visualization tool; and
if the animation type is a physical animation, extract physical parameter information of the animation from the target animation object by using the visualization tool.
In a possible embodiment, when acquiring the target animation object that is being executed, the animation type identification module is configured to:
acquire at least one target animation object corresponding respectively to at least one view;
and the display module, when displaying the analyzed animation parameter information through the visualization tool, is configured to:
cluster the animation parameter information of the target animation objects obtained by analysis according to the view information corresponding to each target animation object to generate a clustering result, wherein the clustering result includes a view list, an animation list as the level below the view list, and animation details as the level below the animation list, the animation details including the animation parameter information; and
display the animation parameter information according to the clustering result.
In a possible implementation, when displaying the animation parameter information according to the clustering result, the display module is configured to:
display the view list;
in response to a selection operation on target view information in the view list, display an animation list associated with the target view information; and
in response to a selection operation on a target animation in the animation list, display the animation parameter information of the target animation.
In a possible implementation, the animation processing apparatus further includes a deletion module, configured to:
in response to a deletion operation on the target view information, delete the target view information together with the animation list and the animation details associated with it.
In a possible implementation, the animation processing apparatus further includes a saving module, configured to:
in response to an animation parameter information saving instruction, save the animation parameter information in a curve preset library; or
in response to an animation parameter information adjusting instruction, adjust the animation parameter information and save the adjusted animation parameter information in the curve preset library.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including a processor, a memory, and a bus, where the memory stores machine-readable instructions executable by the processor, the processor communicates with the memory via the bus when the computer device runs, and the machine-readable instructions, when executed by the processor, perform the steps of the first aspect or of any one of its possible implementations.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program that, when executed by a processor, performs the steps of the animation processing method.
According to the animation processing method and apparatus, the computer device, and the storage medium described above, a target animation object that is being executed is first acquired, and the animation type corresponding to it is determined; then, based on the parameter type to be analyzed corresponding to that animation type, the animation parameter information corresponding to the parameter type to be analyzed is extracted from the target animation object by using a visualization tool; finally, the analyzed animation parameter information is displayed through the visualization tool. In this way, the relevant animation parameters of an animation object that is being executed can be analyzed automatically, and the analyzed parameters can be presented directly to the motion designer and used by an engineer when writing the animation code, which improves coding accuracy, reduces the communication cost between the motion designer and the engineer, and improves communication efficiency.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Embodiment One
Referring to fig. 1, which is a flowchart of an animation processing method provided in an embodiment of the present disclosure, the method includes steps S101 to S103, where:
S101: acquiring a target animation object that is being executed, and determining an animation type corresponding to the target animation object.
In a specific implementation, a monitoring node may be added to the view layer (layer) of a view to monitor animation-addition events, so as to acquire the target animation object that is being executed.
Here, in some operating systems an animation, when executed, is added to the layer of its view, so if the animation-addition event can be monitored on the layer, the target animation object being executed can be acquired. Specifically, a preset category (for example, an Objective-C category) may be added to the view layer. The category includes the method that adds an animation to the layer, and this method may be swapped for a monitoring method by method replacement (swizzling); the monitoring method is one that monitors the target animation object and triggers the animation parameter analysis process. In this way, through the monitoring method in the added category, the target animation object can be monitored and the animation parameter analysis process executed.
In a specific implementation, since there are many animation types and the animation parameter information to be parsed differs between them, the target animation object must first be identified to determine the animation type to which it belongs.
To further understand the types of animations, several possible types of animations are described below.
FIG. 2 is a schematic diagram of inheritance relationships among several types of animations, in which:
as shown in the left branch of fig. 2, a physical animation (CASpringAnimation) has no corresponding animation curve, and generally requires analyzing physical parameters such as mass, stiffness, damping, and initial velocity.
A physical animation inherits from the basic animation (CABasicAnimation). The basic animation and the keyframe animation (CAKeyframeAnimation) are animations with corresponding animation curves, and the animation curve usually needs to be represented by a cubic Bézier curve. Such a timing curve has fixed start and end points, leaving two free control points; each control point has x and y coordinate parameters, so four key parameters are needed to represent the animation curve. In addition, the basic animation and the keyframe animation carry animation parameter information such as the start-point coordinates, the end-point coordinates, the animation duration, and the keyframes in the animation.
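The four key curve parameters can be made concrete with a small sketch. The function below evaluates a cubic Bézier timing curve whose endpoints are fixed at (0, 0) and (1, 1), as in CAMediaTimingFunction, so only the two control points vary; the specific control-point values used are illustrative.

```python
# Evaluate a timing curve defined by the four key parameters: the x and y
# coordinates of the two free control points of a cubic Bezier whose
# endpoints are pinned at (0, 0) and (1, 1).

def cubic_bezier(t, c1x, c1y, c2x, c2y):
    """Evaluate the curve at parameter t in [0, 1]; returns (x, y)."""
    mt = 1.0 - t
    # B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3,
    # with P0 = (0, 0) and P3 = (1, 1), so the P0 term vanishes.
    x = 3 * mt * mt * t * c1x + 3 * mt * t * t * c2x + t ** 3
    y = 3 * mt * mt * t * c1y + 3 * mt * t * t * c2y + t ** 3
    return x, y

# An ease-in-out-style curve: slow at both ends, fast in the middle.
c1x, c1y, c2x, c2y = 0.42, 0.0, 0.58, 1.0
print(cubic_bezier(0.0, c1x, c1y, c2x, c2y))  # (0.0, 0.0)
print(cubic_bezier(1.0, c1x, c1y, c2x, c2y))  # (1.0, 1.0)
```

Here x plays the role of elapsed time fraction and y the animation progress, which is why four numbers suffice to reproduce the whole curve.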
Here, a keyframe animation differs from a basic animation in that a basic animation has only an initial state and an end state, whereas a keyframe animation may have multiple control states in between. In terms of display effect, a keyframe animation can therefore present back-and-forth effects such as vibrating, swinging, or zooming in and out, while a basic animation cannot, for lack of control states between the initial and end states.
The basic animation and the keyframe animation both inherit from the property animation (CAPropertyAnimation); that is, CAPropertyAnimation is the parent class of both. It is abstract: it has no corresponding instance and cannot be used directly on its own in practical applications, so no animation parameter analysis is required for it.
In addition to the above animation types there are other types of animations, such as the transition animation (CATransition) used to transition between different pages; its processing is not involved in the embodiments of the present disclosure.
An animation group (CAAnimationGroup) is a combination of multiple animations; that is, multiple animations executed simultaneously are assembled into an animation group, and the animation parameters of each animation in the group can be analyzed separately.
Here, the animation group follows a composite design pattern and can exert unified control over the animation behaviors within it; that is, the animation effects in a group can be executed concurrently. When an animation group is analyzed, each animation object in the group is identified separately and the animation type to which it belongs is confirmed, so that the animation parameter information can be extracted according to the parameter type to be analyzed corresponding to the determined animation type.
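The per-member analysis of an animation group can be sketched as follows; the dictionary layout and type names stand in for the real Core Animation objects and are purely illustrative.

```python
# Sketch of group handling: each member of an animation group is identified
# on its own, so the parser simply walks the members and records, for each,
# the type that determines which parameters will later be extracted.

def parse_group(group):
    """group: {'type': 'group', 'animations': [member, ...]}."""
    results = []
    for member in group["animations"]:
        # Each member is identified individually by its own type.
        results.append((member["type"], member.get("duration")))
    return results

group = {"type": "group", "animations": [
    {"type": "basic", "duration": 0.2},
    {"type": "physical", "duration": 0.5},
]}
print(parse_group(group))  # one entry per member, identified separately
```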
The common base class of the aforementioned CAPropertyAnimation, CATransition, and CAAnimationGroup is the abstract base class CAAnimation, which likewise has no corresponding instance, so no analysis of it is necessary.
S102: and extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type.
The target animation object contains the various pieces of animation parameter information used to render the corresponding animation effect. Once the animation type is determined, the animation parameter information under the parameter type to be analyzed can be extracted from the target animation object based on the parameter type to be analyzed corresponding to that animation type.
As described above, if the animation type is a basic animation or a keyframe animation, the parameter type to be analyzed is the curve parameter type, and the visualization tool is used to extract the curve parameter information of the animation, such as the information of the two control points, from the target animation object;
if the animation type is a physical animation, the parameter type to be analyzed is the physical parameter type, and the visualization tool is used to extract the physical parameter information of the animation, such as the mass, stiffness, damping, and initial velocity, from the target animation object.
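Step S102 thus amounts to a dispatch on the animation type. The sketch below shows that dispatch with illustrative key names; the actual parameter names on the Core Animation classes differ.

```python
# Minimal sketch of S102: the identified animation type selects which
# parameter types to extract from the target animation object. All key
# names are illustrative stand-ins for the real properties.

CURVE_KEYS = ("control_point_1", "control_point_2", "start", "end", "duration")
PHYSICAL_KEYS = ("mass", "stiffness", "damping", "initial_velocity")

def extract_parameters(animation_type, target):
    """Return the parameter info matching the parameter type to be analyzed."""
    if animation_type in ("basic", "keyframe"):
        keys = CURVE_KEYS       # curve parameter type
    elif animation_type == "physical":
        keys = PHYSICAL_KEYS    # physical parameter type
    else:
        return {}               # e.g. transition animations are not parsed
    return {k: target[k] for k in keys if k in target}

spring = {"mass": 1.0, "stiffness": 100.0, "damping": 10.0,
          "initial_velocity": 0.0, "duration": 0.5}
print(extract_parameters("physical", spring))  # only the physical parameters
```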
S103: and displaying the extracted animation parameter information through the visualization tool.
In a specific implementation, the analyzed animation parameter information can be displayed to the motion designer through the visualization tool; based on the displayed information, the motion designer can judge whether each piece of animation parameter information matches expectations and can adjust it.
In addition, in the embodiments of the present disclosure the analyzed animation parameter information can be encapsulated on the engineer's side to form a curve preset library. The engineer extracts the animation parameter information of the relevant animations from the library and writes the animation code based on it; the engineer can also interact with the motion designer around the parameter information in the library and modify the animation code according to the designer's feedback.
Fig. 3 is a schematic diagram of an animation parsing result according to an embodiment of the present disclosure, in which the type field distinguishes interpolation animation from physical animation, and the attribute field indicates the property being animated, such as transparency, position, or height; for example, if an animation is added on transparency, the corresponding graphic gradually becomes lighter or darker. The start and end values are the initial and final values of the attribute, and the duration is the length of the animation. The curve name, here easeInOut, may be defined by the motion designer.
The animation parsing results are sent to the engineer's side and packaged there into a curve preset library; the engineer can then find the corresponding animation parameter information directly from the curve name provided by the motion designer and write the corresponding animation code.
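A minimal sketch of such a curve preset library, keyed by the designer-defined curve name, might look as follows; the record fields mirror the parsing-result fields of fig. 3, and all values are illustrative.

```python
# Sketch of the curve preset library: parsed results are stored under the
# curve name the motion designer assigns, so an engineer can look the
# parameters up directly by that name. Field layout follows fig. 3.

preset_library = {}

def save_preset(result):
    """Store one parsing result under its curve name."""
    preset_library[result["curve"]] = result

save_preset({"type": "animation", "attribute": "opacity",
             "start": 0.0, "end": 1.0, "duration": 0.25, "curve": "easeInOut"})

# The engineer retrieves parameters by the curve name from the designer.
params = preset_library["easeInOut"]
print(params["attribute"], params["duration"])
```

Keying on the curve name is what removes the back-and-forth: designer and engineer only need to exchange the name, not the raw numbers.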
In addition, in a specific implementation, since an App's business scenarios are complex and involve many animations, directly listing all the parameter information of every animation produces a very large amount of data; on the one hand the display page has limited space, and on the other hand the user cannot browse the information conveniently.
For this reason, the embodiments of the present disclosure cluster the App's parsing results at multiple levels and display the animation parsing results hierarchically. The specific process is as follows:
after at least one target animation object corresponding to at least one view is acquired, the animation type of each target animation object is determined and its animation parameter information is analyzed based on that type. The analyzed animation parameter information is then clustered according to the view information of each target animation object to generate a clustering result comprising a view list, animation lists, and animation details in a hierarchical relationship: each piece of view information in the view list is associated with an animation list, each animation list is associated with the animation details of at least one animation, and the animation details include the animation parameter information. The animation parameter information is then displayed according to the clustering result.
Here, a view is, for example, a function button in an App: the contact-list icon at the bottom of a social application is a view, and when the icon is tapped it turns green, which is an animation executed on that view. An App contains many views, and each view may carry many animations. In the embodiments of the present disclosure, all animation parsing results are clustered by view: each view corresponds to one animation list, and an animation list holds one or more animations. During display, the view list, containing one or more views, is shown first; after the user selects target view information from it, the animation list associated with that view is shown, and so on level by level. This reduces the amount of information the user browses at one time, makes searching easier, and alleviates the shortage of display space on the page.
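The view-level clustering described above can be sketched as a mapping from views to animations to details; all names and parameter values below are illustrative.

```python
# Sketch of the three-level clustering result: a view list, an animation
# list under each view, and animation details under each animation.

from collections import defaultdict

def cluster_by_view(parsed):
    """parsed: iterable of (view_info, animation_name, detail) triples."""
    result = defaultdict(dict)
    for view, name, detail in parsed:
        result[view][name] = detail  # detail holds the parameter info
    return result

parsed = [
    ("contact_button", "turn_green", {"attribute": "color", "duration": 0.2}),
    ("contact_button", "scale_up", {"attribute": "scale", "duration": 0.1}),
    ("search_bar", "fade_in", {"attribute": "opacity", "duration": 0.3}),
]
clusters = cluster_by_view(parsed)

print(sorted(clusters))                    # the view list
print(sorted(clusters["contact_button"]))  # animation list for one view
```

Deleting one view key from such a structure removes the view together with its animation list and details, which matches the flush behavior described for the deletion operation.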
Specifically, according to the clustering result, the process of displaying the animation parameter information may be:
first, the view list is presented, as shown in fig. 4 a;
here, to avoid the accumulation of a large amount of view-list data during long monitoring sessions, the embodiment of the present disclosure adds a flush function. As shown in fig. 4a, a deletion symbol "x" is added in the upper right corner; specifically, in response to a deletion operation on target view information, the target view information and the animation list and animation details associated with it may be deleted.
Then, a user (generally, an animation designer) can select one piece of target view information to view and process related animation parameters aiming at the view list;
at this time, in response to the operation of selecting the target view information in the view list, displaying an animation list associated with the target view information, as shown in fig. 4 b;
then, responding to the selection operation of the target animation in the animation list, displaying animation parameter information of the target animation, such as selecting the target animation 2 by a user, and displaying an animation analysis result page shown in fig. 3; in addition, if the target animation is an animation group, the interface shown in fig. 4c is displayed after the animation group is selected, wherein the interface includes information such as each animation in the animation group and the execution sequence (for example, simultaneous execution) between the animations, and any animation in the animation group is further selected, and an animation analysis result page corresponding to the any animation can be displayed.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict execution order or impose any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
The animation processing method provided by the embodiments of the present disclosure can automatically analyze the relevant animation parameters of an animation object that is being executed; the analyzed parameters can be presented directly to the motion designer and used by an engineer when writing the animation code, which improves coding accuracy, reduces the communication cost between the motion designer and the engineer, and improves communication efficiency.
Based on the same inventive concept, an embodiment of the present disclosure further provides an animation processing apparatus corresponding to the animation processing method. Since the principle by which the apparatus solves the problem is similar to that of the animation processing method of the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 5, a schematic diagram of an animation processing apparatus 500 according to an embodiment of the disclosure is shown, where:
an animation type identification module 501, configured to acquire a target animation object that is being executed, and determine an animation type corresponding to the target animation object;
an analysis module 502, configured to extract, by using a visualization tool, animation parameter information corresponding to a parameter type to be analyzed from the target animation object, based on the parameter type to be analyzed corresponding to the animation type; and
a display module 503, configured to display the analyzed animation parameter information through the visualization tool.
In a possible implementation, the animation type identification module 501 is specifically configured to:
acquire the target animation object being executed by monitoring an animation-addition event on a view layer of a view.
In a possible implementation, the parsing module 502 is specifically configured to:
if the animation type is a basic animation or a keyframe animation, extract curve parameter information of the animation from the target animation object by using the visualization tool; and
if the animation type is a physical animation, extract physical parameter information of the animation from the target animation object by using the visualization tool.
In a possible implementation, the animation type identification module 501 is specifically configured to:
acquire at least one target animation object corresponding respectively to at least one view;
the display module 503 is specifically configured to:
cluster the animation parameter information of the target animation objects obtained by analysis according to the view information corresponding to each target animation object to generate a clustering result, wherein the clustering result includes a view list, an animation list as the level below the view list, and animation details as the level below the animation list, the animation details including the animation parameter information; and
display the animation parameter information according to the clustering result.
In a possible implementation, the display module 503 is specifically configured to:
display the view list;
in response to a selection operation on target view information in the view list, display an animation list associated with the target view information; and
in response to a selection operation on a target animation in the animation list, display the animation parameter information of the target animation.
In a possible implementation, the animation processing apparatus 500 further includes:
a deletion module 504, configured to delete, in response to a deletion operation on the target view information after the display module 503 has displayed the animation list associated with that view information, the target view information together with the animation list and the animation details associated with it.
In a possible implementation, the animation processing apparatus 500 further includes:
a saving module 505, configured to: save the animation parameter information in a curve preset library in response to an animation parameter information saving instruction; or
adjust the animation parameter information in response to an animation parameter information adjusting instruction, and save the adjusted animation parameter information in the curve preset library.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, an embodiment of the present disclosure further provides a computer device. Referring to fig. 6, a computer device 600 according to an embodiment of the present disclosure includes a processor 601, a memory 602, and a bus 603. The memory 602 stores execution instructions and includes an internal memory 6021 and an external memory 6022; the internal memory 6021 temporarily stores operation data of the processor 601 as well as data exchanged with the external memory 6022, such as a hard disk, and the processor 601 exchanges data with the external memory 6022 through the internal memory 6021. When the computer device 600 runs, the processor 601 communicates with the memory 602 through the bus 603, so that the processor 601 executes the following instructions:
acquiring a target animation object that is being executed, and determining an animation type corresponding to the target animation object;
extracting, by using a visualization tool, animation parameter information corresponding to a parameter type to be analyzed from the target animation object, based on the parameter type to be analyzed corresponding to the animation type; and
displaying the extracted animation parameter information through the visualization tool.
In a possible embodiment, the instructions executed by the processor 601 for acquiring a target animation object that is being executed include:
acquiring the target animation object being executed by monitoring an animation-addition event on a view layer of a view.
In a possible implementation, the instructions executed by the processor 601 for extracting, by using a visualization tool, the animation parameter information corresponding to the parameter type to be analyzed from the target animation object based on the parameter type to be analyzed corresponding to the animation type include:
if the animation type is a basic animation or a keyframe animation, extracting curve parameter information of the animation from the target animation object by using the visualization tool; and
if the animation type is a physical animation, extracting physical parameter information of the animation from the target animation object by using the visualization tool.
In a possible embodiment, the instructions executed by the processor 601 for acquiring a target animation object that is being executed include:
acquiring at least one target animation object corresponding respectively to at least one view;
and displaying the analyzed animation parameter information through the visualization tool includes:
clustering the animation parameter information of the target animation objects obtained by analysis according to the view information corresponding to each target animation object to generate a clustering result, wherein the clustering result includes a view list, an animation list as the level below the view list, and animation details as the level below the animation list, the animation details including the animation parameter information; and
displaying the animation parameter information according to the clustering result.
In a possible implementation, the instructions executed by the processor 601 for displaying the animation parameter information according to the clustering result include:
displaying the view list;
in response to a selection operation on target view information in the view list, displaying an animation list associated with the target view information; and
in response to a selection operation on a target animation in the animation list, displaying the animation parameter information of the target animation.
In a possible implementation, the instructions executed by the processor 601 further include, after the animation list associated with the target view information is displayed:
in response to a deletion operation on the target view information, deleting the target view information together with the animation list and the animation details associated with it.
In a possible implementation manner, the instructions executed by the processor 601, after displaying the extracted animation parameter information through the visualization tool, further include:
in response to an animation parameter information storage instruction, storing the animation parameter information in a curve preset library; or,
in response to an animation parameter information adjustment instruction, adjusting the animation parameter information and storing the adjusted animation parameter information in the curve preset library.
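The store-or-adjust flow for the curve preset library can be sketched as follows (the class, method, and preset names are illustrative assumptions, not from the disclosure):

```python
class CurvePresetLibrary:
    """Holds named presets of animation parameter information."""

    def __init__(self):
        self._presets: dict = {}

    def store(self, name: str, params: dict) -> None:
        """Storage instruction: save the parameter info as-is."""
        self._presets[name] = dict(params)

    def adjust_and_store(self, name: str, params: dict, **overrides) -> dict:
        """Adjustment instruction: apply adjustments, then save the result."""
        adjusted = {**params, **overrides}
        self._presets[name] = adjusted
        return adjusted

    def get(self, name: str) -> dict:
        return self._presets[name]

library = CurvePresetLibrary()
# Store extracted curve parameters unchanged, or adjust before storing.
library.store("ease_out", {"timing_curve": (0.0, 0.0, 0.58, 1.0), "duration": 0.3})
library.adjust_and_store("slow_ease_out", library.get("ease_out"), duration=0.6)
```

Storing adjusted presets under a new name, as above, is one way to keep the original extraction intact while building up a reusable library of tuned curves.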
The embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps of the animation processing method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure also provide a computer program product for the animation processing method, including a computer-readable storage medium storing program code; the instructions included in the program code may be used to perform the steps of the animation processing method in the above method embodiments. For details, reference may be made to the above method embodiments, which are not described herein again.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only one logical division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed.
In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical, or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are merely specific embodiments of the present disclosure, used to illustrate the technical solutions of the present disclosure rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may, within the technical scope disclosed by the present disclosure, still modify the technical solutions described in the foregoing embodiments, easily conceive of changes, or make equivalent substitutions for some of the technical features therein; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present disclosure, and shall all be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.