CN117193720A - Animation realization method and device, electronic equipment and storage medium - Google Patents

Animation realization method and device, electronic equipment and storage medium

Info

Publication number
CN117193720A
Authority
CN
China
Prior art keywords: animation, management, class, type, target
Legal status: Pending
Application number
CN202311107359.4A
Other languages
Chinese (zh)
Inventor
李静
吴成旭
Current Assignee: Great Wall Motor Co Ltd
Original Assignee: Great Wall Motor Co Ltd
Priority date: 2023-08-30
Filing date: 2023-08-30
Publication date: 2023-12-08
Application filed by Great Wall Motor Co Ltd
Priority to CN202311107359.4A
Publication of CN117193720A


Abstract

The application provides an animation implementation method, an animation implementation device, an electronic device, and a storage medium. The method comprises the following steps: obtaining the animation type of a target animation; constructing an animation management object corresponding to the animation type through an animation management factory class; and determining the target animation based on the animation management object and the animation parameter information of the target animation. The method can reduce the coupling between business logic and the specific animation implementation, making the code easier to maintain and extend.

Description

Animation realization method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technology, and more particularly, to an animation implementation method, an animation implementation device, an electronic device, and a storage medium.
Background
In the Android system, many UI (User Interface) effects can be expressed through animation, which visually conveys complex states and interaction information and enhances the user experience; examples in an in-vehicle infotainment system include the air-conditioning blowing effect, the energy-flow status, and the action presentation of an avatar. However, there are many ways to implement animation, such as frame animation, Lottie animation, and three-dimensional animation; each typically requires different code and resources, and different vehicle models may adopt different animation implementations. In the conventional development process, although the business logic for the same animation effect is the same or similar, each animation mode usually corresponds to a separate set of code, so any change required during development brings a large code maintenance workload.
Disclosure of Invention
The application provides an animation implementation method, an animation implementation device, an electronic device, and a storage medium. The technical solution is as follows:
in a first aspect, there is provided an animation implementation method, the method comprising:
obtaining the animation type of the target animation;
constructing an animation management object corresponding to the animation type through an animation management factory class;
and determining the target animation based on the animation management object and the animation parameter information of the target animation.
With reference to the first aspect, in some possible implementations, the constructing, by the animation management factory class, an animation management object corresponding to the animation type includes:
determining an animation management class corresponding to the animation type through an animation management factory class;
and instantiating the animation management class through the animation management factory class to construct an animation management object corresponding to the animation management class.
With reference to the first aspect and the foregoing implementation manner, in some possible implementation manners, before the determining, by the animation management factory class, of the animation management class corresponding to the animation type, the method includes:
encapsulating the animation management factory class and defining the animation management class corresponding to the animation type;
wherein, the animation management factory class is packaged with a method for constructing an animation management object; the animation management class defines an animation type and animation management attributes corresponding to the animation type.
With reference to the first aspect and the foregoing implementation manner, in some possible implementation manners, the animation management classes corresponding to different animation types inherit from the same animation management base class.
With reference to the first aspect and the foregoing implementation manner, in some possible implementation manners, before the determining, by the animation management factory class, an animation management class corresponding to the animation type, the method further includes:
defining an animation management base class;
wherein, the animation management base class defines animation playing logic and an animation view acquisition method.
With reference to the first aspect and the foregoing implementation manner, in some possible implementation manners, the animation parameter information includes animation playing control information and animation playing content information.
With reference to the first aspect and the foregoing implementation manner, in some possible implementation manners, after the determining of the target animation based on the animation management object and the animation parameter information of the target animation, the method further includes:
And outputting the view component bearing the target animation to a user interface layer for display.
In a second aspect, there is provided an animation realization device comprising:
the acquisition module is used for acquiring the animation type of the target animation;
the building module is used for building the animation management object corresponding to the animation type through the animation management factory class;
and the determining module is used for determining the target animation based on the animation management object and the animation parameter information of the target animation.
With reference to the second aspect, in some possible implementations, the building module includes:
the identification unit is used for determining an animation management class corresponding to the animation type through an animation management factory class;
and the construction unit is used for instantiating the animation management class through the animation management factory class and constructing an animation management object corresponding to the animation management class.
With reference to the second aspect and the foregoing implementation manner, in some possible implementation manners, the apparatus further includes:
the first definition module is used for packaging the animation management factory class and defining the animation management class corresponding to the animation type;
wherein, the animation management factory class is packaged with a method for constructing an animation management object; the animation management class defines an animation type and animation management attributes corresponding to the animation type.
With reference to the second aspect and the foregoing implementation manner, in some possible implementation manners, the animation management classes corresponding to different animation types inherit from the same animation management base class.
With reference to the second aspect and the foregoing implementation manner, in some possible implementation manners, the apparatus further includes:
the second definition module is used for defining an animation management base class;
wherein, the animation management base class defines animation playing logic and an animation view acquisition method.
With reference to the second aspect and the foregoing implementation manner, in some possible implementation manners, the foregoing animation parameter information includes animation playing control information and animation playing content information.
With reference to the second aspect and the foregoing implementation manner, in some possible implementation manners, the apparatus further includes:
and the display module is used for outputting the view component bearing the target animation to the user interface layer for display.
In a third aspect, an electronic device is provided that includes a memory and a processor. The memory is for storing executable program code and the processor is for calling and running the executable program code from the memory for causing the electronic device to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, there is provided a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, a computer readable storage medium is provided, the computer readable storage medium storing computer program code which, when run on a computer, causes the computer to perform the method of the first aspect or any one of the possible implementations of the first aspect.
In summary, in the technical solution of the application, the animation type of the target animation is obtained; an animation management object corresponding to the animation type is constructed through an animation management factory class; and the target animation is determined based on the animation management object and the animation parameter information of the target animation. The method uses the animation management factory class to separate the creation and use of animation management objects, thereby reducing the coupling between business logic and the specific animation implementation and enabling flexible switching and reuse of different animation types. Whether an existing animation is changed or a new animation type is added, the change can be made quickly and flexibly, which enhances the flexibility and maintainability of the system, makes it easier to adapt and extend, and satisfies animation display for different scenarios and requirements.
Drawings
FIG. 1 is an exemplary system architecture diagram of an animation implementation system provided by an embodiment of the present application;
FIG. 2 is a schematic flow chart of an animation implementation method provided by an embodiment of the present application;
FIG. 3 is a schematic interactive flow chart of an animation implementation method provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of class dependencies in an animation control layer according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an animation implementation device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solution of the application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between the associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate the three cases where A exists alone, A and B exist together, or B exists alone. Furthermore, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature.
The embodiment of the application provides an animation implementation method. The method may be implemented by a computer program and may run on an animation implementation device or system based on the von Neumann architecture. The computer program may be integrated in an application or may run as a stand-alone tool-class application. The animation implementation device may be a terminal device including, but not limited to: personal computers, tablet computers, handheld devices, vehicle-mounted devices, wearable devices, computing devices, or other processing devices connected to a wireless modem, and the like.
In the in-vehicle infotainment UI of the Android system, animation effects can clearly enrich the visual expression and enhance the user experience. A typical animation implementation may include the following steps: first, determine the specific animation effect requirement, such as the air-conditioning blowing effect in a vehicle; then write the specific logic code according to the requirement, determining when the animation plays, when it stops, what content it plays, and so on; then select an animation implementation (such as frame animation, Lottie animation, etc.) according to the specific requirement and write the specific implementation code of the animation; and finally render the implemented animation on the screen and display it to the user.
However, the code for different types of animation implementations is different; for example, the implementation code for a frame animation and a Lottie animation is different. Even though the business logic of some animations is very similar or identical, different animation implementations must be written and maintained separately. Here, the business logic of an animation is the upper-level logic that is independent of the concrete form and implementation of the animation: it concerns the trigger condition of the animation, when the animation starts, when it ends, how it responds to user interactions, and so on, rather than the details of the animation frames or the specific implementation of the animation.
Taking a car navigation system as an example, when the driver is about to reach a turn, the navigation system will exhibit an animation effect to indicate the direction and distance of the turn. Business logic may include: the navigation system detects that the vehicle is 500 meters away from the next turning point; the system decides that an animation prompt needs to be displayed to the driver; after the driver finishes turning successfully according to the animation prompt, the animation disappears.
The business logic described above specifies when and how the animation is triggered, independent of the specific visual effect and implementation of the "turn prompt" animation. When different animation types are written and maintained independently, a large amount of business logic code must be duplicated; and if the business logic of an animation is modified, or a new animation type is added, code must be changed at multiple levels, or an entirely new set of animation implementation code must be written, which increases the complexity of development and maintenance.
Because in-vehicle devices of different vehicle models may use different animation implementations, and in order to better adapt to the diverse requirements of different vehicle models and scenarios, the application provides an animation implementation method that separates the creation and use of animation management objects by encapsulating an animation management factory class, thereby reducing code redundancy, lowering coupling, and making the code easier to maintain and extend.
As shown in fig. 1, an exemplary system architecture diagram of an animation implementation system includes a business logic layer 110, an animation control layer 120, and a UI presentation layer 130.
The business logic layer 110 is the starting point of the whole architecture and is responsible for determining key parameter information of the animation to be displayed according to specific application scenes and requirements. Specifically, the business logic layer 110 may determine the type of animation to be specifically presented by analyzing the business requirements. The animation type may be different implementations of frame animation, lottie animation, three-dimensional animation (such as Unity), etc.; the business logic layer 110 may pass the animation type to the animation control layer 120, requesting the animation control layer 120 to construct a corresponding animation management object; the business logic layer 110 may also provide animation parameter information of a specific animation, such as animation content, play timing, play speed, repetition number, etc., so that the animation control layer 120 applies the animation parameter information to the animation management object to generate a specific animation instance, i.e., a target animation.
The animation control layer 120 is located between the business logic layer 110 and the UI presentation layer 130, and serves as an intermediate layer to manage and control creation, configuration, and playback of animations. Specifically, the animation control layer 120 may construct a corresponding animation management object according to the animation type received from the business logic layer 110 through an animation management factory class; the animation management factory class encapsulates the construction logic of the animation management object, so that different types of animation management objects can be constructed through a uniform interface; the animation control layer 120 is also responsible for applying the animation parameter information provided by the business logic layer 110 to a specific animation management object to generate a corresponding animation.
The UI presentation layer 130 is used as the forefront end of the whole architecture and is responsible for presenting the animation provided by the animation control layer 120 to the user, so as to realize the interaction between the user and the animation. Specifically, the UI presentation layer 130 may manage a view portion of the animation, render the animation view component returned by the animation control layer 120 onto a screen, and may control visual properties such as a position, a size, and transparency of the animation; the UI presentation layer 130 may be responsible for handling user interactions with the animation, such as clicking, sliding, etc., which may trigger play, pause, or other effects of the animation; the UI presentation layer 130 may also feed back the status or progress of the animation to the business logic layer 110 to enable further logic processing.
For example, in a vehicle system, an air conditioner blowing effect needs to be displayed, and a frame animation is selected to display the effect according to the specific requirement of the effect. At this point, the business logic layer will determine the animation type as "frame animation". The business logic layer will pass the "frame animation" type to the animation control layer requesting to build a corresponding frame animation management object. After the frame animation is determined to be used for displaying the air-conditioning blowing effect, the business logic layer also provides animation parameter information such as animation resources, playing speed, playing time and the like, and the animation control layer constructs a corresponding frame animation management object, and the frame animation management object generates the air-conditioning blowing effect animation by loading and configuring the animation parameter information. Finally, the animation control layer returns a View component (View) containing the air-conditioning blowing effect animation to the UI presentation layer, and the UI presentation layer adds the View component into the UI layout and adjusts the size or transparency of the View component according to the requirement so as to be rendered to a specific position of the user interface when in use.
Next, referring to fig. 1, an animation implementation method provided by an embodiment of the present application is described with an animation control layer as an execution body. Referring specifically to fig. 2, a schematic flow chart of an animation implementation method according to an embodiment of the present application is shown. Illustratively, as shown in FIG. 2, the animation implementation method includes the following steps:
S201, obtaining the animation type of the target animation.
Specifically, different animation types are suitable for different scenarios and requirements. If the system supports multiple animation types, the most suitable animation implementation (animation type) can be selected for the specific requirement and environment, so that the selected animation matches the desired visual effect and functional requirement. In addition, different animation types may have different resource and performance requirements; for example, 3D animation may require more graphics processing power, while frame animation may require more memory. Selecting an appropriate animation type makes it easier to write targeted code, eases later maintenance, and can also enhance the user experience by making the animation appear smoother, more natural, and more interactive.
Target animation refers to animation that is required to achieve a particular visual effect or functional purpose. For example, the visual effect of an air conditioner blower or the feedback effect of a button click may be a target animation that represents a specific effect that is desired to be presented to the user on the screen.
An animation type refers to the specific technique or implementation method used to realize the target animation. Each animation type has its own presentation form, technical implementation, and resource requirements. Embodiments of the present application may obtain animation types including, but not limited to, the following (a minimal type enumeration is sketched after the list):
Frame Animation (Frame Animation): consists of a series of individual images or frames that are played in a particular order and rate, and can be used for complex animation effects.
Lottie animation: lottie is a library that can parse Adobe After Effects animations and render them on mobile devices and web pages, suitable for complex animation effects.
Unity animation: unity is an engine widely used for game development, provides powerful 3D rendering capability and physical engine, and can be used for creating various interactive 3D contents and virtual reality applications to realize realistic three-dimensional animation effects.
Kanzi animation: kanzi is a development tool for creating automotive and embedded device user interfaces that provides a visual editing environment for designing and building interfaces with complex animations and interactions.
In this embodiment, by acquiring the animation type of the target animation, the subsequent construction of the animation management object can be performed based on the animation type, so that when the implementation of the target animation needs to change, only the animation type needs to be modified.
S202, constructing an animation management object corresponding to the animation type through an animation management factory class.
Specifically, the animation management factory class adopts the factory design pattern. According to the input animation type, it can flexibly construct the animation management object corresponding to that type through its internally encapsulated construction logic. The animation management object is the interface for managing and controlling the corresponding specific type of animation (such as frame animation, Lottie animation, etc.): through its internally encapsulated methods and attributes it controls operations such as playing and pausing, and it can also manage attributes such as the animation's resources and speed. In this way, the specific implementation of the animation is decoupled from the business logic layer (the main program): the business logic layer can control different types of animations through a unified interface, and adding or modifying animation types does not affect other logic code. This achieves modular, reusable code and improves maintainability and extensibility.
For example, when developing an in-vehicle infotainment system, the system may need to display animation effects such as: a brand logo rendered as a frame animation when the vehicle starts, and a spinning-record effect rendered as a Lottie animation when the music player switches songs.
When the vehicle starts, the business logic layer requests the "frame animation" type and provides the brand logo resources. The animation control layer then uses the animation management factory class to create the animation management object corresponding to the frame animation and configures the animation parameters from the provided resources. After the configuration is completed, the business logic layer controls playback of the frame animation by calling a designated method of the animation management object, such as the play method.
When the music player switches songs, the business logic layer likewise requests the "Lottie animation" type and provides the resources for the album animation. The animation control layer then uses the animation management factory class to create the animation management object corresponding to the Lottie animation and configures the animation parameters from the provided resources. After the configuration is completed, the business logic layer controls playback of the Lottie animation by calling a designated method of the animation management object, such as the play method.
In both scenarios, the call style of the business logic layer is identical; only the requested animation type and resources differ. Thus, even if more animation types are added in the future, only the animation management factory class needs to be extended and the corresponding animation management classes added. For example, if a 3D animated turn arrow is to be presented when a navigation instruction is received, the call logic of the business logic layer remains the same, and the business logic layer can support the new animation type without any modification, because it relies solely on the unified interface provided by the animation management factory class.
In some embodiments, the building the animation management object corresponding to the animation type through the animation management factory class includes: determining an animation management class corresponding to the animation type through an animation management factory class; and instantiating the animation management class through an animation management factory class to construct an animation management object corresponding to the animation management class.
Specifically, the animation management factory class determines an animation management class for implementing the animation type according to the input animation type. Wherein, the animation type and the animation management class can be associated by a mapping relation and the like. After the animation management class is determined, the animation management factory class will instantiate according to the definition of the animation management class to construct a specific animation management object. Because the creation and configuration of the animation management objects are encapsulated in the animation management factory class and the animation management class, the code that invokes them can remain compact and consistent, while also providing better flexibility and extensibility.
For example, if a frame animation is required to show an effect, the frame animation management class may be selected by the animation management factory class, then the frame animation management class may be instantiated, a constructed frame animation management object returned, and the frame animation may be played by invoking the play animation logic of the frame animation management object.
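A minimal Kotlin sketch of this factory step is given below, assuming the class names used in the FIG. 4 description later in this document (AnimationManagerFactory, BaseAnimationManager and the concrete manager classes) and the AnimationType enumeration sketched earlier. The Context parameter is an added assumption so the managers can construct Android views; FIG. 4 shows createManager taking only the animation type.

import android.content.Context

// Hypothetical factory: maps an animation type to the manager class that
// implements it and returns a newly constructed animation management object.
object AnimationManagerFactory {
    fun createManager(type: AnimationType, context: Context): BaseAnimationManager =
        when (type) {
            AnimationType.FRAME  -> FrameAnimationManager(context)
            AnimationType.LOTTIE -> LottieAnimationManager(context)
            AnimationType.UNITY  -> UnityAnimationManager(context)
            AnimationType.KANZI  -> KanziAnimationManager(context)
        }
}

Because the when branch is the only place that refers to concrete manager classes, adding a new animation type touches only the factory and the new manager class.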
The animation management class is used for defining the attribute and the behavior of the specific type of animation, encapsulates all the characteristics and the control logic of the specific type of animation, and provides templates for creating the animation management objects with the same attribute and method, thereby realizing the unified management of the specific type of animation.
In some embodiments, before determining the animation management class corresponding to the animation type by the animation management factory class, the method includes: encapsulating the animation management factory class and defining the animation management class corresponding to the animation type; wherein the animation management factory class encapsulates a method for constructing an animation management object, and the animation management class defines an animation type and the animation management attributes corresponding to that animation type.
Specifically, the animation management factory class encapsulates the method for building animation management objects, and all details associated with building an animation management object are hidden inside the factory class. This encapsulation separates the creation and use of animation management objects: other parts of the system (such as the business logic layer) do not need to know how to instantiate or configure a specific animation management class; they only need to know the required animation type, request the animation management factory class to create an animation management object of that type, and then call the created animation management object directly. Hiding the complexity of creating animation management objects keeps the code tidy and improves its extensibility and maintainability. Moreover, because the creation logic of animation management objects is concentrated in one place, it is easier to track and modify.
Illustratively, an in-vehicle infotainment system may contain a variety of animation effects, such as start-up animations, navigation path animations, and music playing animations, which may involve multiple animation types. With an animation management factory class, adding or modifying an animation type only requires adding or modifying the creation logic of the corresponding animation management object inside the factory class, without touching other code that uses the animation. This not only reduces the chance of error, but also makes it faster and more flexible for the system to support new animation types.
The animation management attributes may be a set of properties and parameters for managing and controlling an animation. They may include, but are not limited to:
play control attributes: such as play speed, number of plays, whether to loop, etc.
Animation resource properties: such as paths or references to resources such as pictures, audio, data files, etc. required for animation.
Animation effects attributes: such as the transparency, position, size, color, etc., of the animation.
Status attributes: such as the current state of the animation (play, pause, stop, etc.).
Defining animation management classes may provide a structured way to manage different types of animations. In this way, each type of animation will have its own management class in which the specific properties and methods of that type of animation are encapsulated. This avoids cluttering all animation logic together, facilitating the clarity and maintenance of code. When a function needs to be added or modified for a specific type of animation, the operation can be performed more flexibly and pertinently, thereby improving the development efficiency and the code quality.
In some embodiments, the animation management classes corresponding to different animation types inherit from the same animation management base class.
Specifically, inheritance means that a new class receives (inherits) the properties and methods of another class. Inheritance helps organize related classes and ensures code reusability and maintainability. When the animation management classes corresponding to different animation types all inherit from the same animation management base class, all animation management classes share a uniform interface and a common set of methods and attributes. That is, the calling code can interact with an animation management object in the same way regardless of the specific animation type. Furthermore, if there are general functions or attributes applicable to all animation types, they can be defined once in the animation management base class and shared by all animation management classes, eliminating duplicate code and reducing maintenance complexity. For example, when adding a new animation management class, it only needs to inherit the animation management base class, without rewriting existing code; and when a general function changes, it only needs to be modified in the animation management base class, and all animation management classes that inherit from it are updated.
In some embodiments, before the determining, by the animation management factory class, the animation management class corresponding to the animation type, the method further includes: defining an animation management base class; wherein, the animation management base class defines animation playing logic and an animation view acquisition method.
Specifically, the animation management base class, as the parent class of the various animation management classes, provides a common set of attributes and methods that ensure all animation management objects follow the same interface. In this embodiment, the animation management base class defines the animation playing logic and the animation view acquisition method. The animation playing logic is responsible for play control of the animation, such as playing, pausing, stopping, and looping, and controls the playback flow according to the animation's requirements and configuration. The animation view acquisition method returns the view component that carries the animation, which can be added to the UI layout so that the animation is rendered at a particular location of the user interface.
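A minimal sketch of such a base class follows, assuming Android's View as the return type of the view acquisition method and a hypothetical AnimationParams holder (sketched under step S203) as the argument of the play method; FIG. 4 names these methods playAnimation and getAnimationView.

import android.view.View

// Hypothetical animation management base class covering the two
// responsibilities named above: play control and view acquisition.
abstract class BaseAnimationManager {

    // Play the animation according to the supplied parameter information.
    // AnimationParams is the hypothetical parameter holder sketched under S203.
    abstract fun playAnimation(params: AnimationParams)

    // Return the view component that carries the animation so that the UI
    // presentation layer can add it to its layout.
    abstract fun getAnimationView(): View
}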
S203, determining the target animation based on the animation management object and the animation parameter information of the target animation.
Specifically, the animation management object constructed by the animation management factory class described above is used to manage and control the specific type of animation corresponding to it. The animation management object has animation management attributes associated with that particular animation type, which allow detailed configuration of the playback and presentation of the animation. The animation parameter information of the target animation determines how the animation is played and displayed; it includes animation play control information and animation play content information. The animation play control information describes the logical structure (i.e., the business logic) by which the animation is played according to specific rules and sequence, and may include the play order, play speed, loop control, trigger conditions, and so on. The animation play content information covers the animation resources required for the actual display of the animation, and may include image resources, sound resources, 3D models, text content, and so on. After loading and configuring the animation parameter information, the animation management object generates a target animation of the corresponding animation type and can control its playback. Through the animation view acquisition method, the view component carrying the target animation can be returned, so that the animation is rendered on the screen and interaction between the user and the animation is realized.
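As one assumption of what this parameter information could carry, the sketch below groups play control fields and play content fields into a single holder; the field names are illustrative only.

// Hypothetical parameter holder: play content information plus play control
// information, as described above. Field names are illustrative.
data class AnimationParams(
    val resource: String,        // play content: path or name of the image/Lottie/3D asset
    val speed: Float = 1.0f,     // play control: playback speed multiplier
    val repeatCount: Int = 0,    // play control: 0 means play once
    val loop: Boolean = false    // play control: restart after the last frame
)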
In some embodiments, after the determining of the target animation based on the animation management object and the animation parameter information of the target animation, the method further comprises: outputting the view component carrying the target animation to a user interface layer for display.
Specifically, after the creation and configuration of the target animation are completed, the system needs to display the generated animation to the user. The view component carrying the target animation can be a specific view or control; the animation management object generates the target animation by loading and configuring the animation parameter information and embeds the target animation into the view component. Once the target animation is embedded in the view component, that component can be added to the user interface layer.
The user interface layer (UI presentation layer) interacts directly with the user, being responsible for managing and displaying all visible components and controls. This layer may include various layout, buttons, text boxes, etc., of which the view component carrying the animation is a part. By integrating this component into the user interface layer, the target animation can be rendered on the interface and visually presented to the user. This approach not only makes the addition and configuration of animations flexible and convenient, but also helps to maintain consistency and maintainability throughout the user interface.
In the technical solution of the application, the animation type of the target animation is obtained; an animation management object corresponding to the animation type is constructed through an animation management factory class; and the target animation is determined based on the animation management object and the animation parameter information of the target animation. The method uses the animation management factory class to separate the creation and use of animation management objects, thereby reducing the coupling between the business logic and the specific animation implementation and enabling flexible switching and reuse of different animation types. Whether an existing animation is changed or a new animation type is added, the change can be made quickly and flexibly, which enhances the flexibility and maintainability of the system, makes it easier to adapt and extend, and satisfies animation display for different scenarios and requirements.
Reference is next made to fig. 3, which is a schematic interaction flow chart of an animation implementation method according to an embodiment of the present application. Illustratively, as shown in FIG. 3, the animation implementation method includes the following steps:
S301, the business logic layer sends the animation type to the animation control layer.
Specifically, the business logic layer determines the animation type (e.g., frame animation, lottie animation, etc.) of the desired animation according to the specific requirements and scene. And sending the animation type information to an animation control layer as a construction basis of an animation management object.
S302, the animation control layer constructs an animation management object according to the animation type.
Specifically, after the animation control layer receives the animation type, it constructs the corresponding animation management object through the animation management factory class. According to the animation type, the animation management object encapsulates general operations for that type of animation, such as play and pause, as well as type-specific attributes.
By way of example, FIG. 4 is a schematic diagram of the class dependencies in the animation control layer. As shown in FIG. 4:
AnimationManagerFactory is the animation management factory class;
createManager(animationType) is a factory method through which a corresponding animation management object can be created based on the provided animation type; the returned animation management object is based on BaseAnimationManager.
BaseAnimationManager is the animation management base class; it provides the basic framework and general functions of animation management and can be inherited by the animation management classes of different animation types;
playAnimation(animation) is a general method for playing an animation;
getAnimationView(): View is a general method for obtaining the view component that carries the animation.
FrameAnimationManager is the animation management class corresponding to the frame animation type;
LottieAnimationManager is the animation management class corresponding to the Lottie animation type;
UnityAnimationManager is the animation management class corresponding to the Unity animation type;
KanziAnimationManager is the animation management class corresponding to the Kanzi animation type;
wherein the field type is used to describe the specific frame animation type, and each animation management class includes logic and attributes specific to its corresponding animation type.
In the animation control layer, the animation management factory class AnimationManagerFactory uses the createManager method to create an animation management object of the corresponding animation management class according to the provided animation type. If the animation type is a frame animation, an instance of the animation management class FrameAnimationManager is created. These animation management classes inherit the animation management base class BaseAnimationManager, ensuring that they share a common interface and behavior, thereby enhancing the flexibility and maintainability of the code.
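For illustration, a concrete manager such as FrameAnimationManager could be sketched as follows, using the standard Android AnimationDrawable frame-animation API together with the BaseAnimationManager and AnimationParams assumed in the earlier sketches. The resource lookup and the Context constructor parameter are simplifying assumptions; speed and repeat count are ignored in this sketch.

import android.content.Context
import android.graphics.drawable.AnimationDrawable
import android.view.View
import android.widget.ImageView

// Hypothetical concrete manager for the frame animation type. It inherits
// BaseAnimationManager so the business logic layer drives it through the same
// interface as every other manager.
class FrameAnimationManager(private val context: Context) : BaseAnimationManager() {

    private val imageView = ImageView(context)

    override fun playAnimation(params: AnimationParams) {
        // Resolve the animation-list drawable named in the parameter information
        // (illustrative lookup; a real implementation might receive a resource id).
        val resId = context.resources.getIdentifier(
            params.resource, "drawable", context.packageName
        )
        imageView.setBackgroundResource(resId)
        val drawable = imageView.background as AnimationDrawable
        drawable.isOneShot = !params.loop
        drawable.start()
    }

    override fun getAnimationView(): View = imageView
}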
S303, the business logic layer sends the animation parameter information to the animation control layer.
Specifically, once the animation management object is created, the business logic layer determines the playing parameters of the animation according to specific requirements. The animation parameter information includes animation play control information and animation play content information such as play speed, play times, asset path, etc. The business logic layer sends these parameters to the animation control layer for configuring the target animation.
It should be noted that the business logic layer may also send the animation parameter information and the animation type to the animation control layer at the same time.
S304, the animation control layer configures a target animation according to the animation parameter information and the animation management object.
Specifically, the animation control layer may use the received animation parameters and the already constructed animation management objects to determine and configure the target animation. The configuration process of the target animation can comprise the steps of loading resources, setting playing attributes, applying visual effects and the like.
S305, the animation control layer returns the view component carrying the target animation to the UI presentation layer.
Specifically, after the target animation configuration is completed, the animation control layer embeds the target animation into a specific View component (View), and returns this component to the UI presentation layer. This allows the target animation to be presented as part of the interface.
S306, the UI presentation layer adds the view component to the UI layout.
Specifically, the UI presentation layer adds the received view component to the UI layout, so that the animation can be rendered and displayed on the user interface, consistency and coordination of the animation and other UI elements are ensured, and a foundation is provided for subsequent interaction.
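Step S306 can be as small as attaching the returned view to an existing container and adjusting its visual properties; the Kotlin sketch below is an assumption about how the UI presentation layer might do this with a standard ViewGroup.

import android.view.View
import android.view.ViewGroup

// Hypothetical UI presentation layer step: add the view component that carries
// the target animation to an existing layout container and adjust it.
fun attachAnimationView(container: ViewGroup, animationView: View) {
    animationView.alpha = 0.9f   // example transparency adjustment
    container.addView(
        animationView,
        ViewGroup.LayoutParams(
            ViewGroup.LayoutParams.WRAP_CONTENT,
            ViewGroup.LayoutParams.WRAP_CONTENT
        )
    )
}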
The whole animation implementation process covers the complete flow from defining the animation requirement in the business logic layer, through constructing and configuring the animation in the animation control layer, to displaying the animation in the UI presentation layer. Through this animation implementation method, the business logic layer and the animation control layer are decoupled, so that many different types of animations can be supported, with strong extensibility and maintainability. For example, to switch the current animation to another animation type, it is only necessary to modify the animation type that the business logic layer passes to the animation control layer, as sketched below. To add a new animation type, only the corresponding animation management class needs to be added in the animation control layer; the core logic code of the business logic layer and the UI layer code do not need to be modified.
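As a hedged usage sketch built on the classes assumed earlier, switching the air-conditioning effect from a frame animation to a Lottie animation changes only the type handed to the factory; the surrounding business logic call stays the same. The asset name "ac_blowing" is a hypothetical placeholder.

import android.content.Context
import android.view.View

// Hypothetical business logic layer call: only the AnimationType argument
// (and the matching asset) differs between animation implementations.
fun showAirConditionerEffect(context: Context): View {
    val manager = AnimationManagerFactory.createManager(AnimationType.FRAME, context)
    // Switching to a Lottie implementation would only change the line above to:
    // AnimationManagerFactory.createManager(AnimationType.LOTTIE, context)
    manager.playAnimation(AnimationParams(resource = "ac_blowing", loop = true))
    return manager.getAnimationView()   // handed to the UI presentation layer
}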
Fig. 5 is a schematic structural diagram of an animation implementation device according to an embodiment of the present application.
Illustratively, as shown in FIG. 5, the apparatus 500 includes:
an obtaining module 510, configured to obtain an animation type of the target animation;
a construction module 520, configured to construct an animation management object corresponding to the animation type through an animation management factory class;
a determining module 530, configured to determine the target animation based on the animation management object and the animation parameter information of the target animation.
In some possible embodiments, the building block 520 includes:
the identification unit is used for determining an animation management class corresponding to the animation type through an animation management factory class;
and the construction unit is used for instantiating the animation management class through the animation management factory class and constructing an animation management object corresponding to the animation management class.
In some possible embodiments, the apparatus 500 further comprises:
the first definition module is used for packaging the animation management factory class and defining the animation management class corresponding to the animation type;
wherein, the animation management factory class is packaged with a method for constructing an animation management object; the animation management class defines an animation type and animation management attributes corresponding to the animation type.
In some possible embodiments, the animation management classes corresponding to different animation types inherit from the same animation management base class.
In some possible embodiments, the apparatus 500 further comprises:
the second definition module is used for defining an animation management base class;
wherein, the animation management base class defines animation playing logic and an animation view acquisition method.
In some possible embodiments, the animation parameter information includes animation play control information and animation play content information.
In some possible embodiments, the apparatus 500 further comprises:
and the display module is used for outputting the view component bearing the target animation to the user interface layer for display.
In summary, in the technical solution of the application, the animation type of the target animation is obtained; an animation management object corresponding to the animation type is constructed through an animation management factory class; and the target animation is determined based on the animation management object and the animation parameter information of the target animation. The device uses the animation management factory class to separate the creation and use of animation management objects, thereby reducing the coupling between the business logic and the specific animation implementation and enabling flexible switching and reuse of different animation types. Whether an existing animation is changed or a new animation type is added, the change can be made quickly and flexibly, which enhances the flexibility and maintainability of the system, makes it easier to adapt and extend, and satisfies animation display for different scenarios and requirements.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Illustratively, as shown in FIG. 6, the electronic device 600 includes: a memory 610 and a processor 620, wherein the memory 610 stores executable program code 611, and the processor 620 is configured to call and execute the executable program code 611 to perform an animation implementation method.
In this embodiment, the electronic device may be divided into functional modules according to the above method example; for example, functional modules may be divided corresponding to the respective functions, or two or more functions may be integrated into one processing module, and the integrated module may be implemented in hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
In the case where each functional module is divided corresponding to each function, the electronic device may include: an acquisition module, a construction module, a determination module, and the like. It should be noted that all relevant contents of the steps in the above method embodiment may be referred to in the functional descriptions of the corresponding functional modules, which are not repeated here.
The electronic device provided in this embodiment is configured to execute the above-described animation implementation method, so that the same effects as the above-described implementation method can be achieved.
In case an integrated unit is employed, the electronic device may comprise a processing module, a memory module. The processing module can be used for controlling and managing the actions of the electronic equipment. The memory module may be used to support the electronic device in executing associated program code and data, etc.
The processing module may be a processor or a controller that can implement or execute the various exemplary logic blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor; the memory module may be a memory.
The present embodiment also provides a computer-readable storage medium having stored therein computer program code which, when run on a computer, causes the computer to perform the above-described related method steps to implement an animation implementation method in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described relevant steps to implement an animation implementation method in the above-described embodiments.
In addition, the electronic device provided by the embodiment of the application can be a chip, a component or a module, and the electronic device can comprise a processor and a memory which are connected; the memory is used for storing instructions, and when the electronic device runs, the processor can call and execute the instructions to enable the chip to execute the animation implementation method in the embodiment.
The electronic device, the computer readable storage medium, the computer program product or the chip provided in this embodiment are used to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative; for example, the division of modules or units is only one logic function division, and other division modes can be adopted in actual implementation; for example, multiple units or components may be combined or may be integrated into another device, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (10)

1. A method of implementing an animation, the method comprising:
obtaining the animation type of the target animation;
constructing an animation management object corresponding to the animation type through an animation management factory class;
the target animation is determined based on the animation management object and the animation parameter information of the target animation.
2. The method according to claim 1, wherein said constructing, by an animation management factory class, an animation management object corresponding to said animation type, comprises:
determining an animation management class corresponding to the animation type through an animation management factory class;
and instantiating the animation management class through the animation management factory class to construct an animation management object corresponding to the animation management class.
3. The method according to claim 2, wherein before determining the animation management class corresponding to the animation type by the animation management factory class, the method comprises:
Encapsulating the animation management factory class and defining the animation management class corresponding to the animation type;
wherein, the animation management factory class is packaged with a method for constructing an animation management object; the animation management class is defined with an animation type and an animation management attribute corresponding to the animation type.
4. The method of claim 2, wherein the animation management classes corresponding to different ones of the animation types inherit from a same animation management base class.
5. The method of claim 4, wherein prior to determining the animation management class corresponding to the animation type by the animation management factory class, further comprising:
defining an animation management base class;
wherein, the animation management base class defines animation playing logic and an animation view acquisition method.
6. The method of claim 1, wherein the animation parameter information comprises animation playback control information and animation playback content information.
7. The method according to claim 1, wherein after the determining of the target animation based on the animation management object and the animation parameter information of the target animation, the method further comprises:
and outputting the view component bearing the target animation to a user interface layer for display.
8. An animation realizing device, characterized in that the device comprises:
the acquisition module is used for acquiring the animation type of the target animation;
the building module is used for building an animation management object corresponding to the animation type through an animation management factory class;
and the determining module is used for determining the target animation based on the animation parameter information of the animation management object and the target animation.
9. An electronic device, the electronic device comprising:
a memory for storing executable program code;
a processor for calling and running the executable program code from the memory, causing the electronic device to perform the method of any one of claims 1 to 7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed, implements the method according to any of claims 1 to 7.
CN202311107359.4A 2023-08-30 2023-08-30 Animation realization method and device, electronic equipment and storage medium Pending CN117193720A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311107359.4A CN117193720A (en) 2023-08-30 2023-08-30 Animation realization method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311107359.4A CN117193720A (en) 2023-08-30 2023-08-30 Animation realization method and device, electronic equipment and storage medium

Publications (1)

Publication Number: CN117193720A; Publication Date: 2023-12-08

Family

ID=88997200

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311107359.4A Pending CN117193720A (en) 2023-08-30 2023-08-30 Animation realization method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117193720A (en)


Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination