CN111951355A - Animation processing method and device, computer equipment and storage medium

Publication number
CN111951355A
CN111951355A
Authority
CN
China
Prior art keywords
animation
target
parameter information
type
view
Prior art date
Legal status
Pending
Application number
CN202010773901.XA
Other languages
Chinese (zh)
Inventor
赵轶凡
Current Assignee
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010773901.XA
Publication of CN111951355A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces


Abstract

The present disclosure provides an animation processing method, apparatus, computer device, and storage medium. The method comprises: acquiring a target animation object that is being executed and determining the animation type corresponding to it; based on the parameter type to be analyzed for that animation type, using a visualization tool to extract the corresponding animation parameter information from the target animation object; and displaying the extracted animation parameter information through the visualization tool. In this way, embodiments of the disclosure can automatically analyze the relevant animation parameters of an executing animation object. The analyzed parameters can be presented directly to the motion designer and used by the engineer when writing the animation code, which saves communication cost between the motion designer and the engineer and improves communication efficiency.

Description

Animation processing method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of animation technologies, and in particular, to an animation processing method, an animation processing apparatus, a computer device, and a storage medium.
Background
A dynamic effect, i.e., an animation effect, in an application refers to the dynamic rendering of an associated view, e.g., an icon moving from position A to position B. Animation effects must be designed, and different animation parameters produce different effects.
Because the dynamic characteristics of such effects are more complex and harder to describe than static characteristics, animation effects are more expensive to develop and to accept. Typically, after the motion designer produces an animation preview, the preview is handed to an engineer who writes the animation code. However, the engineer may not understand the motion-design terminology in the designer's annotations, so the implemented animation may differ from the preview the designer provided.
Disclosure of Invention
The embodiment of the disclosure at least provides an animation processing method, an animation processing device, computer equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides an animation processing method, including:
acquiring a target animation object in execution, and determining an animation type corresponding to the target animation object;
extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type;
and displaying the extracted animation parameter information through the visualization tool.
In an alternative embodiment, obtaining a target animation object in execution includes:
and acquiring the target animation object in execution by monitoring the animation adding event of the view layer of the view.
In an optional implementation manner, based on the parameter type to be analyzed corresponding to the animation type, extracting, by using a visualization tool, animation parameter information corresponding to the parameter type to be analyzed from the target animation object, includes:
if the animation type is basic animation or key frame animation, extracting curve parameter information of the animation from the target animation object by adopting the visualization tool;
and if the animation type is physical animation, extracting the physical parameter information of the animation from the target animation object by adopting the visualization tool.
In an alternative embodiment, obtaining a target animation object in execution includes:
acquiring at least one target animation object corresponding to at least one view respectively;
the displaying the analyzed animation parameter information through the visualization tool comprises:
clustering the animation parameter information of the target animation object obtained by analysis according to the view information corresponding to the target animation object to generate a clustering result, wherein the clustering result comprises a view list, an animation list as the next level of the view list and animation details as the next level of the animation list; the animation details comprise animation parameter information;
and displaying the animation parameter information according to the clustering result.
In an optional embodiment, presenting the animation parameter information according to the clustering result includes:
displaying the view list;
responding to the selected operation aiming at the target view information in the view list, and displaying an animation list associated with the target view information;
and responding to the selected operation aiming at the target animation in the animation list, and displaying the animation parameter information of the target animation.
In an optional implementation manner, after presenting the animation list associated with the target view information, the method further includes:
and in response to the deletion operation aiming at the target view information, deleting the target view information and the animation list and the animation details associated with the target view information.
In an optional embodiment, after displaying the extracted animation parameter information through the visualization tool, the method further includes:
responding to an animation parameter information storage instruction, and storing the animation parameter information in a curve preset library; or,
responding to an animation parameter information adjusting instruction, adjusting the animation parameter information, and storing the adjusted animation parameter information in a curve preset library.
In a second aspect, an embodiment of the present disclosure further provides an animation processing apparatus, including:
the animation type identification module is used for acquiring a target animation object in execution and determining an animation type corresponding to the target animation object;
the analysis module is used for extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type;
and the display module is used for displaying the analyzed animation parameter information through the visualization tool.
In one possible embodiment, the animation type identification module, when acquiring the target animation object in execution, is configured to:
and acquiring the target animation object in execution by monitoring the animation adding event of the view layer of the view.
In a possible embodiment, when extracting, by using a visualization tool, animation parameter information corresponding to a parameter type to be analyzed from the target animation object based on the parameter type to be analyzed corresponding to the animation type, the analysis module is configured to:
if the animation type is basic animation or key frame animation, extracting curve parameter information of the animation from the target animation object by adopting the visualization tool;
and if the animation type is physical animation, extracting the physical parameter information of the animation from the target animation object by adopting the visualization tool.
In one possible embodiment, the animation type identification module, when acquiring the target animation object in execution, is configured to:
acquiring at least one target animation object corresponding to at least one view respectively;
the displaying the analyzed animation parameter information through the visualization tool comprises:
clustering the animation parameter information of the target animation object obtained by analysis according to the view information corresponding to the target animation object to generate a clustering result, wherein the clustering result comprises a view list, an animation list as the next level of the view list and animation details as the next level of the animation list; the animation details comprise animation parameter information;
and displaying the animation parameter information according to the clustering result.
In a possible implementation manner, the presentation module, when presenting the animation parameter information according to the clustering result, is configured to:
displaying the view list;
responding to the selected operation aiming at the target view information in the view list, and displaying an animation list associated with the target view information;
and responding to the selected operation aiming at the target animation in the animation list, and displaying the animation parameter information of the target animation.
In a possible implementation manner, the animation processing apparatus further includes a deletion module, configured to:
and in response to the deletion operation aiming at the target view information, deleting the target view information and the animation list and the animation details associated with the target view information.
In a possible implementation manner, the animation processing apparatus further includes a saving module, configured to:
responding to an animation parameter information storage instruction, and storing the animation parameter information in a curve preset library; or,
responding to an animation parameter information adjusting instruction, adjusting the animation parameter information, and storing the adjusted animation parameter information in a curve preset library.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of any one of the possible implementations of the first aspect or the first aspect as described above.
In a fourth aspect, the disclosed embodiments also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the animation processing method.
According to the animation processing method, apparatus, computer device, and storage medium described above, the target animation object being executed is first acquired and its corresponding animation type determined; then, based on the parameter type to be analyzed for that animation type, the corresponding animation parameter information is extracted from the target animation object with a visualization tool; finally, the analyzed animation parameter information is displayed through the visualization tool. In this way, the relevant animation parameters can be analyzed automatically for an executing animation object, and the analyzed parameters can be presented directly to the motion designer and used by the engineer when writing the animation code, which improves the accuracy of the code, saves communication cost between the motion designer and the engineer, and improves communication efficiency.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. The drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without creative effort.
FIG. 1 is a flow chart illustrating an animation processing method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating an inheritance relationship between several types of animations in an animation processing method provided by an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating an animation parsing result according to an embodiment of the disclosure;
FIG. 4a illustrates a view list diagram of an embodiment of the present disclosure;
FIG. 4b is a schematic diagram of an animation list associated with the target view information, according to an embodiment of the disclosure;
FIG. 4c is a diagram illustrating an animation group list presented by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of an animation processing apparatus 500 according to an embodiment of the disclosure;
fig. 6 shows a schematic diagram of a computer device 600 provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Research has shown that:
Because motion designers and engineers approach animation-related information with different professional backgrounds, both sides expend considerable effort on cycles of understanding, communicating, modifying, and verifying when developing a product that must reproduce the design with high fidelity.
Based on the above research, the embodiments of the present disclosure provide an animation processing method, an apparatus, a computer device, and a storage medium that can automatically analyze and visualize animation parameter information while an animation object is being executed. The motion designer can thus see the automatically analyzed animation parameters directly in a visualization tool and, when communicating with the engineer, point out animation problems in the engineer's own professional terms, greatly reducing the communication cost on both sides. In addition, the engineer can use the analyzed animation parameter information directly when writing the code, which greatly reduces the probability of error and better reproduces the designer's intended animation.
The discovery of the above problems and the solutions proposed below for them are the result of the inventor's practice and careful study, and should therefore be regarded as the inventor's contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, the animation processing method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method is generally a computer device with a certain computing capability, for example a terminal device, a server, or another processing device. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, or a wearable device. In some possible implementations, the animation processing method may be implemented by a processor calling computer-readable instructions stored in a memory.
The following describes an animation processing method provided by the embodiment of the present disclosure by taking an execution subject as a terminal device as an example.
Example one
Referring to fig. 1, which is a flowchart of an animation processing method provided in an embodiment of the present disclosure, the method includes steps S101 to S103, where:
s101: and acquiring a target animation object in execution, and determining an animation type corresponding to the target animation object.
In a specific implementation, a monitoring node may be added to the view layer (layer) of a view to monitor animation addition events, so as to obtain the target animation object being executed.
Here, in some operating systems, an animation is added to the view's layer when it is executed, so if animation addition events can be monitored on the layer, the executing target animation object can be acquired. Specifically, a preset category can be added to the view layer class. The category covers the method that adds an animation to the layer and, through method replacement (swizzling), substitutes a monitoring method for it; the monitoring method records the target animation object and triggers the animation parameter analysis process. In this way, the monitoring method in the added category both observes target animation objects and executes the analysis of their parameters.
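As an illustration, on an iOS-style view layer (CALayer), which matches the CA* class names used later in this description, the interception can be sketched as follows. This is a minimal sketch under that assumption: AnimationInspector and the hook method name are hypothetical, while class_getInstanceMethod and method_exchangeImplementations are real Objective-C runtime APIs.

import UIKit
import ObjectiveC

// Hypothetical collector that receives every intercepted animation object.
final class AnimationInspector {
    static let shared = AnimationInspector()
    func record(_ animation: CAAnimation, layer: CALayer, key: String?) {
        print("captured:", type(of: animation), "key:", key ?? "-")
    }
}

extension CALayer {
    // Evaluate once at startup (e.g., _ = CALayer.installAnimationHook) to install the monitor.
    static let installAnimationHook: Void = {
        guard
            let original = class_getInstanceMethod(CALayer.self, #selector(CALayer.add(_:forKey:))),
            let swizzled = class_getInstanceMethod(CALayer.self, #selector(CALayer.hooked_add(_:forKey:)))
        else { return }
        method_exchangeImplementations(original, swizzled)
    }()

    @objc private func hooked_add(_ animation: CAAnimation, forKey key: String?) {
        // Capture the in-flight target animation object before it starts running.
        AnimationInspector.shared.record(animation, layer: self, key: key)
        // Implementations were exchanged, so this call invokes the original add(_:forKey:).
        hooked_add(animation, forKey: key)
    }
}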
In a specific implementation, since there are many animation types and the animation parameter information to be parsed differs between them, the target animation object must first be identified to determine the animation type to which it belongs.
To further understand the types of animations, several possible types of animations are described below.
FIG. 2 is a schematic diagram of inheritance relationships among several types of animations, in which:
As shown in the left branch of FIG. 2, a physical animation (CASpringAnimation) has no corresponding animation curve; what generally needs to be analyzed are physical parameters such as mass, stiffness, damping, and initial velocity.
Physical animations inherit from the basic animation (CABasicAnimation). The basic animation and the keyframe animation (CAKeyframeAnimation) are animations with corresponding animation curves, and the curve is usually represented by a cubic Bezier curve. A cubic Bezier timing curve has two free control points, each with x and y coordinates, so four key parameters suffice to represent the animation curve. In addition, basic and keyframe animations carry animation parameter information such as start coordinates, end coordinates, animation duration, and the keyframes within the animation.
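Concretely, with the curve endpoints fixed at P_0 = (0, 0) and P_3 = (1, 1), as is standard for timing curves, the cubic Bezier is determined entirely by the control points P_1 = (x_1, y_1) and P_2 = (x_2, y_2), which are the four key parameters mentioned above:

B(t) = (1 - t)^3 P_0 + 3(1 - t)^2 t P_1 + 3(1 - t) t^2 P_2 + t^3 P_3,  t in [0, 1].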
Here, a keyframe animation differs from a basic animation in that a basic animation has only an initial state and an end state, whereas a keyframe animation can have multiple intermediate control states. In terms of display, a keyframe animation can therefore render back-and-forth effects such as vibrating, swinging, or repeatedly zooming in and out, while a basic animation cannot, because it lacks control states between the initial and end states.
Both the basic animation and the keyframe animation inherit from the abstract property animation (CAPropertyAnimation); that is, the abstract animation is the parent class of the basic animation and the keyframe animation. The abstract animation has no corresponding instance and cannot be used directly on its own in practice, so it requires no animation parameter analysis.
Besides the above animation types, there are other types, such as the transition animation (CATransition) used for transitions between pages; its processing is not involved in the embodiments of the present disclosure.
An animation group (CAAnimationGroup) is a combination of multiple animations: animations executed simultaneously are assembled into a group, and the animation parameters of each member can be analyzed separately.
Here, the animation group follows a composite design pattern and allows unified control of the animation behaviors it contains; that is, the animation effects in a group execute concurrently. When an animation group is analyzed, each animation object in the group is identified separately and its animation type confirmed, so that the animation parameter information can be extracted according to the parameter types to be analyzed for the determined type.
The common base class of the aforementioned CAPropertyAnimation, CATransition, and CAAnimationGroup is the abstract base class CAAnimation, which likewise has no corresponding instance, so no analysis is necessary for it.
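Putting the hierarchy together, the identification step can be sketched as a type switch over these classes. The AnimationKind enum is an illustrative assumption; the CA* classes and their inheritance are as described above, and the spring check must precede the basic check because CASpringAnimation inherits from CABasicAnimation.

import QuartzCore

enum AnimationKind {
    case physical(CASpringAnimation)
    case keyframe(CAKeyframeAnimation)
    case basic(CABasicAnimation)
    case group([AnimationKind])
    case skipped  // e.g. CATransition, or abstract classes with no instances
}

func classify(_ animation: CAAnimation) -> AnimationKind {
    switch animation {
    case let group as CAAnimationGroup:
        // Identify each member of the group separately.
        return .group((group.animations ?? []).map(classify))
    case is CATransition:
        return .skipped
    case let spring as CASpringAnimation:      // before CABasicAnimation
        return .physical(spring)
    case let keyframe as CAKeyframeAnimation:
        return .keyframe(keyframe)
    case let basic as CABasicAnimation:
        return .basic(basic)
    default:
        return .skipped
    }
}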
S102: and extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type.
The target animation object contains the various animation parameter information used to render the corresponding animation effect. Once the animation type is determined, the animation parameter information under the parameter type to be analyzed can be extracted from the target animation object based on the parameter type corresponding to that animation type.
As described above, if the animation type is a basic animation or a keyframe animation, the parameter type to be analyzed is the curve parameter type, and the visualization tool extracts the curve parameter information of the animation, such as the two control points described above, from the target animation object.
If the animation type is a physical animation, the parameter type to be analyzed is the physical parameter type, and the visualization tool extracts the physical parameter information of the animation, such as the mass, stiffness, damping, and initial velocity, from the target animation object.
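Under the same iOS Core Animation assumption, the extraction itself could look like the sketch below. The two record structs are hypothetical; timingFunction, getControlPoint(at:values:), and the CASpringAnimation properties are the real Core Animation accessors.

import QuartzCore

struct CurveParameters {
    let controlPoint1: CGPoint      // P1 of the cubic Bezier
    let controlPoint2: CGPoint      // P2 of the cubic Bezier
    let duration: CFTimeInterval
    let fromValue: Any?
    let toValue: Any?
}

struct PhysicalParameters {
    let mass: CGFloat
    let stiffness: CGFloat
    let damping: CGFloat
    let initialVelocity: CGFloat
}

func curveParameters(of animation: CABasicAnimation) -> CurveParameters {
    let timing = animation.timingFunction ?? CAMediaTimingFunction(name: .default)
    var c1: [Float] = [0, 0]
    var c2: [Float] = [0, 0]
    timing.getControlPoint(at: 1, values: &c1)  // first free control point
    timing.getControlPoint(at: 2, values: &c2)  // second free control point
    return CurveParameters(
        controlPoint1: CGPoint(x: CGFloat(c1[0]), y: CGFloat(c1[1])),
        controlPoint2: CGPoint(x: CGFloat(c2[0]), y: CGFloat(c2[1])),
        duration: animation.duration,
        fromValue: animation.fromValue,
        toValue: animation.toValue)
}

func physicalParameters(of animation: CASpringAnimation) -> PhysicalParameters {
    PhysicalParameters(mass: animation.mass,
                       stiffness: animation.stiffness,
                       damping: animation.damping,
                       initialVelocity: animation.initialVelocity)
}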
S103: and displaying the extracted animation parameter information through the visualization tool.
In a specific implementation, the analyzed animation parameter information can be displayed to the motion designer through the visualization tool. Based on the displayed information, the motion designer can judge whether each animation parameter meets expectations and can adjust the parameters.
In addition, the analyzed animation parameter information can be packaged on the engineer's side into a curve preset library. The engineer extracts the animation parameter information of the relevant animations from the preset library and writes the animation code based on it; the engineer can also interact with the motion designer on the basis of the parameter information in the library and modify the animation code according to the designer's feedback.
FIG. 3 is a schematic diagram of an animation parsing result according to an embodiment of the present disclosure. The type field distinguishes interpolation animation from physical animation. The attribute field names the property being animated, such as transparency, position, or height; for example, if an animation is applied to transparency, the corresponding graphic gradually fades in or out. The initial value (start) and end value (end) are the start and end values of that attribute. The animation duration (duration) is how long the animation lasts. The curve name (curve), here easeInOut, can be defined by the motion designer.
The animation parsing results are sent to the engineer's side, where they are packaged into a curve preset library; the engineer can then look up the corresponding animation parameter information directly by the curve name the motion designer provides, and write the corresponding animation code.
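A minimal sketch of such a preset library, reusing the curve record sketched above, follows; the type names are illustrative assumptions, while CAMediaTimingFunction(controlPoints:) is the real initializer an engineer would feed the looked-up points into.

import QuartzCore

struct CurvePreset {
    let name: String            // e.g. "easeInOut", chosen by the motion designer
    let controlPoint1: CGPoint
    let controlPoint2: CGPoint
}

final class CurvePresetLibrary {
    private var presets: [String: CurvePreset] = [:]

    func save(_ preset: CurvePreset) {
        presets[preset.name] = preset
    }

    // Look parameters up by the designer-provided curve name and build the
    // timing function to use directly in the animation code.
    func timingFunction(named name: String) -> CAMediaTimingFunction? {
        guard let p = presets[name] else { return nil }
        return CAMediaTimingFunction(controlPoints: Float(p.controlPoint1.x),
                                     Float(p.controlPoint1.y),
                                     Float(p.controlPoint2.x),
                                     Float(p.controlPoint2.y))
    }
}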
In addition, in practice an app's business scenarios are complex and involve many animations. Directly listing all parameter information of every animation produces a very large amount of data: on the one hand, the space of the display page is limited, and on the other hand, it is inconvenient for the user to browse.
Based on this, the embodiments of the present disclosure cluster the app's parsing results at multiple levels and display the animation parsing results hierarchically. The specific process is as follows:
After at least one target animation object corresponding to at least one view is obtained, the animation type of each target animation object is determined and its animation parameter information is analyzed based on that type. The analyzed animation parameter information is then clustered to generate a clustering result comprising a hierarchy of a view list, animation lists, and animation details: each piece of view information in the view list is associated with an animation list, each animation list is associated with the animation details of at least one animation, and the animation details contain the animation parameter information. The animation parameter information is then displayed according to this clustering result.
Here, a view is, for example, a function button in an app: the contact-list icon at the bottom of a social application is a view, and when the icon is tapped it turns green, i.e., an animation that turns the icon green is executed. An app has many views, and each view may have many animations. Embodiments of the present disclosure cluster all animation parsing results by view: each view corresponds to one animation list, and an animation list contains one or more animations. When displaying, the view list, containing one or more views, is shown first; after the user selects target view information from the view list, the animation list associated with it is shown, and the hierarchy is expanded level by level. This reduces the amount of information the user browses at one time, makes searching easier, and avoids the problem of insufficient display space on the page.
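The three-level result can be sketched as a small data model plus a grouping pass; all type names here are illustrative assumptions, since the disclosure does not prescribe concrete structures.

// View list -> animation list -> animation details.
struct AnimationDetail {
    let type: String                 // e.g. "basic", "keyframe", "physical"
    let parameters: [String: String] // the analyzed animation parameter information
}

struct AnimationEntry {
    let name: String
    let detail: AnimationDetail
}

struct ViewEntry {
    let viewInfo: String             // identifies the view the animations belong to
    var animations: [AnimationEntry]
}

// Cluster parsed details by the view each target animation object belongs to.
func cluster(_ records: [(viewInfo: String, animation: AnimationEntry)]) -> [ViewEntry] {
    var byView: [String: [AnimationEntry]] = [:]
    for record in records {
        byView[record.viewInfo, default: []].append(record.animation)
    }
    return byView
        .map { ViewEntry(viewInfo: $0.key, animations: $0.value) }
        .sorted { $0.viewInfo < $1.viewInfo }
}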
Specifically, according to the clustering result, the process of displaying the animation parameter information may be:
First, the view list is presented, as shown in FIG. 4a.
here, in order to avoid a problem that long-time listening causes a large amount of view list data, the embodiment of the present disclosure adds a flush function. As shown in fig. 4a, a deletion symbol "x" is added to the upper right corner. Specifically, the target view information and the animation list and the animation details associated with the target view information may be deleted in response to a deletion operation for the target view information.
Then, a user (generally the motion designer) can select one piece of target view information from the view list in order to view and process the related animation parameters.
In response to the selection of the target view information in the view list, the animation list associated with it is displayed, as shown in FIG. 4b.
Then, in response to the selection of a target animation in the animation list, the animation parameter information of that animation is displayed; for example, if the user selects target animation 2, the animation parsing result page shown in FIG. 3 is displayed. If the target animation is an animation group, selecting it displays the interface shown in FIG. 4c, which lists each animation in the group and the execution relationship between them (for example, simultaneous execution); further selecting any animation in the group displays the parsing result page for that animation.
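As a sketch of this drill-down, built on the hypothetical ViewEntry/AnimationEntry/AnimationDetail types from the clustering sketch above (an assumption about structure, not the disclosed UI), the selection and deletion handlers can be expressed as:

// Hypothetical view-model backing the three-level drill-down of FIGs. 4a-4c.
final class InspectorViewModel {
    private(set) var views: [ViewEntry]

    init(views: [ViewEntry]) { self.views = views }

    // FIG. 4a -> FIG. 4b: selecting target view information shows its animation list.
    func animations(forViewAt index: Int) -> [AnimationEntry] {
        views[index].animations
    }

    // FIG. 4b -> FIG. 3: selecting a target animation shows its parameter details.
    func detail(forViewAt viewIndex: Int, animationAt animationIndex: Int) -> AnimationDetail {
        views[viewIndex].animations[animationIndex].detail
    }

    // The "x" control of FIG. 4a: deleting target view information removes its
    // associated animation list and animation details as well.
    func deleteView(at index: Int) {
        views.remove(at: index)
    }
}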
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict execution order or any limitation on the implementation; the specific execution order of the steps should be determined by their function and possible internal logic.
The animation processing method provided by the embodiments of the present disclosure can automatically analyze the relevant animation parameters of an executing animation object. The analyzed parameters can be presented directly to the motion designer and used by the engineer when writing the animation code, which improves the accuracy of the code, saves communication cost between the motion designer and the engineer, and improves communication efficiency.
Based on the same inventive concept, the embodiment of the present disclosure further provides an animation processing apparatus corresponding to the animation processing method, and as the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the animation processing method in the embodiment of the present disclosure, the implementation of the animation processing apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 5, a schematic diagram of an animation processing apparatus 500 according to an embodiment of the disclosure is shown, where:
an animation type identification module 501, configured to obtain a target animation object in execution, and determine an animation type corresponding to the target animation object;
an analysis module 502, configured to extract, based on the parameter type to be analyzed corresponding to the animation type, animation parameter information corresponding to the parameter type to be analyzed from the target animation object by using a visualization tool;
a displaying module 503, configured to display the analyzed animation parameter information through the visualization tool.
In a possible implementation, the animation type recognition module 501 is specifically configured to:
and acquiring the target animation object in execution by monitoring the animation adding event of the view layer of the view.
In a possible implementation, the parsing module 502 is specifically configured to:
if the animation type is basic animation or key frame animation, extracting curve parameter information of the animation from the target animation object by adopting the visualization tool;
and if the animation type is physical animation, extracting the physical parameter information of the animation from the target animation object by adopting the visualization tool.
In a possible implementation, the animation type recognition module 501 is specifically configured to:
acquiring at least one target animation object corresponding to at least one view respectively;
the display module 503 is specifically configured to:
clustering the animation parameter information of the target animation object obtained by analysis according to the view information corresponding to the target animation object to generate a clustering result, wherein the clustering result comprises a view list, an animation list as the next level of the view list and animation details as the next level of the animation list; the animation details comprise animation parameter information;
and displaying the animation parameter information according to the clustering result.
In a possible implementation, the presentation module 503 is specifically configured to:
displaying the view list;
responding to the selected operation aiming at the target view information in the view list, and displaying an animation list associated with the target view information;
and responding to the selected operation aiming at the target animation in the animation list, and displaying the animation parameter information of the target animation.
In a possible implementation, the animation processing apparatus 500 further includes:
a deleting module 504, configured to, after the displaying module 503 displays the animation list associated with the target view information, respond to a deleting operation for the target view information, and delete the target view information and the animation list and the animation details associated with the target view information.
In a possible implementation, the animation processing apparatus 500 further includes:
a saving module 505, configured to respond to an animation parameter information saving instruction, and save the animation parameter information in a curve preset library; or,
responding to an animation parameter information adjusting instruction, adjusting the animation parameter information, and storing the adjusted animation parameter information in a curve preset library.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, an embodiment of the present disclosure also provides a computer device. Referring to FIG. 6, a schematic structural diagram of a computer device 600 provided by an embodiment of the present disclosure includes a processor 601, a memory 602, and a bus 603. The memory 602 stores execution instructions and includes an internal memory 6021 and an external memory 6022. The internal memory 6021 temporarily stores operation data for the processor 601 and data exchanged with the external memory 6022 (such as a hard disk); the processor 601 exchanges data with the external memory 6022 through the internal memory 6021. When the computer device 600 runs, the processor 601 communicates with the memory 602 through the bus 603, causing the processor 601 to execute the following instructions:
acquiring a target animation object in execution, and determining an animation type corresponding to the target animation object;
extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type;
and displaying the extracted animation parameter information through the visualization tool.
In one possible embodiment, the processor 601 executes instructions to obtain a target animation object in execution, including:
and acquiring the target animation object in execution by monitoring the animation adding event of the view layer of the view.
In a possible implementation manner, the extracting, by the processor 601, animation parameter information corresponding to a parameter type to be analyzed from the target animation object by using a visualization tool based on the parameter type to be analyzed corresponding to the animation type includes:
if the animation type is basic animation or key frame animation, extracting curve parameter information of the animation from the target animation object by adopting the visualization tool;
and if the animation type is physical animation, extracting the physical parameter information of the animation from the target animation object by adopting the visualization tool.
In one possible embodiment, the processor 601 executes instructions to obtain a target animation object in execution, including:
acquiring at least one target animation object corresponding to at least one view respectively;
the displaying the analyzed animation parameter information through the visualization tool comprises:
clustering the animation parameter information of the target animation object obtained by analysis according to the view information corresponding to the target animation object to generate a clustering result, wherein the clustering result comprises a view list, an animation list as the next level of the view list and animation details as the next level of the animation list; the animation details comprise animation parameter information;
and displaying the animation parameter information according to the clustering result.
In a possible implementation manner, the instructions executed by the processor 601 to present the animation parameter information according to the clustering result include:
displaying the view list;
responding to the selected operation aiming at the target view information in the view list, and displaying an animation list associated with the target view information;
and responding to the selected operation aiming at the target animation in the animation list, and displaying the animation parameter information of the target animation.
In a possible implementation, the instructions executed by the processor 601, after presenting the animation list associated with the target view information, further include:
and in response to the deletion operation aiming at the target view information, deleting the target view information and the animation list and the animation details associated with the target view information.
In a possible implementation manner, the instructions executed by the processor 601, after displaying the extracted animation parameter information through the visualization tool, further include:
responding to an animation parameter information storage instruction, and storing the animation parameter information in a curve preset library; or,
responding to an animation parameter information adjusting instruction, adjusting the animation parameter information, and storing the adjusted animation parameter information in a curve preset library.
The embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the animation processing method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the animation processing method provided in the embodiment of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute steps of the animation processing method in the above method embodiment, which may be referred to specifically for the above method embodiment, and are not described herein again.
The embodiments of the present disclosure also provide a computer program which, when executed by a processor, implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, it is embodied as a software product, such as a software development kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on this understanding, the technical solution of the present disclosure may be embodied in the form of a software product that is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. An animation processing method, comprising:
acquiring a target animation object in execution, and determining an animation type corresponding to the target animation object;
extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type;
and displaying the extracted animation parameter information through the visualization tool.
2. The method of claim 1, wherein obtaining a target animated object in execution comprises:
and acquiring the target animation object in execution by monitoring the animation adding event of the view layer of the view.
3. The method of claim 1, wherein extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by using a visualization tool based on the parameter type to be analyzed corresponding to the animation type comprises:
if the animation type is basic animation or key frame animation, extracting curve parameter information of the animation from the target animation object by adopting the visualization tool;
and if the animation type is physical animation, extracting the physical parameter information of the animation from the target animation object by adopting the visualization tool.
4. The method according to any one of claims 1 to 3, wherein obtaining the target animation object in execution comprises:
acquiring at least one target animation object corresponding to at least one view respectively;
the displaying the analyzed animation parameter information through the visualization tool comprises:
clustering the animation parameter information of the target animation object obtained by analysis according to the view information corresponding to the target animation object to generate a clustering result, wherein the clustering result comprises a view list, an animation list as the next level of the view list and animation details as the next level of the animation list; the animation details comprise animation parameter information;
and displaying the animation parameter information according to the clustering result.
5. The method of claim 4, wherein presenting the animation parameter information according to the clustering result comprises:
displaying the view list;
responding to the selected operation aiming at the target view information in the view list, and displaying an animation list associated with the target view information;
and responding to the selected operation aiming at the target animation in the animation list, and displaying the animation parameter information of the target animation.
6. The method of claim 5, wherein after presenting the animation list associated with the target view information, further comprising:
and in response to the deletion operation aiming at the target view information, deleting the target view information and the animation list and the animation details associated with the target view information.
7. The method of claim 1, wherein after displaying the extracted animation parameter information through the visualization tool, further comprising:
responding to an animation parameter information storage instruction, and storing the animation parameter information in a curve preset library; or,
responding to an animation parameter information adjusting instruction, adjusting the animation parameter information, and storing the adjusted animation parameter information in a curve preset library.
8. An animation processing apparatus, comprising:
the animation type identification module is used for acquiring a target animation object in execution and determining an animation type corresponding to the target animation object;
the analysis module is used for extracting animation parameter information corresponding to the parameter type to be analyzed from the target animation object by adopting a visualization tool based on the parameter type to be analyzed corresponding to the animation type;
and the display module is used for displaying the analyzed animation parameter information through the visualization tool.
9. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the animation processing method as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, performs the steps of the animation processing method as claimed in any one of claims 1 to 7.
Application CN202010773901.XA, filed 2020-08-04 (priority date 2020-08-04): Animation processing method and device, computer equipment and storage medium. Status: Pending.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202010773901.XA | 2020-08-04 | 2020-08-04 | Animation processing method and device, computer equipment and storage medium

Publications (1)

Publication Number | Publication Date
CN111951355A | 2020-11-17

Family ID: 73339426

Family Applications (1)

Application Number | Status | Publication
CN202010773901.XA | Pending | CN111951355A (en)

Country Status (1)

Country | Link
CN | CN111951355A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party

Publication | Priority date | Publication date | Assignee | Title
CN104050579A * | 2013-03-12 | 2014-09-17 | Alibaba Group Holding Ltd. | Method and apparatus for realizing animation carousel effect
CN103942050A * | 2014-04-15 | 2014-07-23 | TCL Corp. | Implementation method and system for applying animation to Android platform
CN107608993A * | 2016-07-12 | 2018-01-19 | Tencent Technology (Shenzhen) Co., Ltd. | The method and apparatus of web animation generation
CN109242934A * | 2017-07-06 | 2019-01-18 | Alibaba Group Holding Ltd. | A kind of generation method and equipment of animation code
CN108038894A * | 2017-12-11 | 2018-05-15 | Wuhan Douyu Network Technology Co., Ltd. | Animation creation method, device, electronic equipment and computer-readable recording medium
CN109064527A * | 2018-07-02 | 2018-12-21 | Wuhan Douyu Network Technology Co., Ltd. | Implementation method, device, storage medium and the android terminal of dynamic configuration animation
CN109117137A * | 2018-08-07 | 2019-01-01 | Wuhan Douyu Network Technology Co., Ltd. | Advertisement moving picture executes method, apparatus, terminal and readable medium
CN110007907A * | 2019-01-04 | 2019-07-12 | Alibaba Group Holding Ltd. | A kind of animation execution method and device
CN110136230A * | 2019-03-29 | 2019-08-16 | Beijing Dajia Internet Information Technology Co., Ltd. | Cartoon display method, device, electronic equipment and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party

Publication | Priority date | Publication date | Assignee | Title
CN112634409A * | 2020-12-28 | 2021-04-09 | Gaoding (Xiamen) Technology Co., Ltd. | Custom animation curve generation method and device

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
CB02 | Change of applicant information

Change of applicant information (CB02):

Address after: 100041 B-0035, 2nd Floor, Building 3, 30 Shixing Street, Shijingshan District, Beijing
Applicant after: Douyin Vision Co., Ltd.
Address before: 100041 B-0035, 2nd Floor, Building 3, 30 Shixing Street, Shijingshan District, Beijing
Applicant before: Tiktok Vision (Beijing) Co., Ltd.

Address after: 100041 B-0035, 2nd Floor, Building 3, 30 Shixing Street, Shijingshan District, Beijing
Applicant after: Tiktok Vision (Beijing) Co., Ltd.
Address before: 100041 B-0035, 2nd Floor, Building 3, 30 Shixing Street, Shijingshan District, Beijing
Applicant before: Beijing ByteDance Network Technology Co., Ltd.