CN110136230B - Animation display method, device, electronic equipment and storage medium


Info

Publication number
CN110136230B
CN110136230B
Authority
CN
China
Prior art keywords
animation
target
objects
information
atomic
Prior art date
Legal status
Active
Application number
CN201910253191.5A
Other languages
Chinese (zh)
Other versions
CN110136230A (en)
Inventor
张一磊
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN201910253191.5A priority Critical patent/CN110136230B/en
Publication of CN110136230A publication Critical patent/CN110136230A/en
Application granted granted Critical
Publication of CN110136230B publication Critical patent/CN110136230B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to an animation display method, an animation display device, an electronic device, and a storage medium. The animation display method includes: acquiring parameter information of a target animation to be displayed, where the target animation is an animation drawn by an animation application installed on a second terminal; obtaining model data and rendering information of the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types; converting the data types of the model data and rendering information, according to a preset mapping relationship between a preset programming interface and the programming interfaces of different development platforms, from a first data type supported by the preset programming interface to a second data type supported by a target programming interface of a first terminal; and rendering the converted model data according to the converted rendering information, and displaying the target animation. The method and device can display the target animation across platforms without modifying code.

Description

Animation display method, device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of graphics processing technologies, and in particular, to an animation display method, an animation display device, an electronic device, and a storage medium.
Background
Currently, many kinds of animation production software exist on the computer side, and a designer may use such software to design and produce various animation effects (for example, After Effects (AE for short), a graphics and video processing application from Adobe).
If an animation produced by computer-side animation software needs to be displayed on a mobile terminal, a developer must write a new animation APP (application) for the mobile terminal against the API (Application Programming Interface) supported by the mobile terminal's operating system.
However, different operating systems use different programming languages and support different APIs, so for the same animation effect, developers familiar with each operating system's development language must separately develop multiple sets of animation APPs, one per platform, in order to display the same effect on terminals of different platforms.
The animation implementation schemes in the related art therefore suffer from animation code that is difficult to reuse, difficult to maintain, and expensive to develop.
Disclosure of Invention
To overcome the problems in the related art that animation code is difficult to reuse, difficult to maintain, and expensive to develop, the present disclosure provides an animation display method, an animation display device, an electronic device, and a storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an animation display method applied to a first terminal having a target programming interface, the method including:
acquiring parameter information of a target animation to be displayed, wherein the target animation is an animation drawn by an animation application program installed on a second terminal;
obtaining model data and rendering information of the target animation according to the parameter information and a plurality of animation classes which are prestored and matched with different animation types;
according to a preset mapping relation between a preset programming interface and programming interfaces of different development platforms, converting the data types of the model data and the rendering information from a first data type supported by the preset programming interface to a second data type supported by a target programming interface of the first terminal;
and rendering the converted model data according to the converted rendering information, and displaying a target animation.
In one possible implementation manner, the obtaining parameter information of the target animation to be displayed includes:
acquiring a configuration file of a target animation, wherein the configuration file is a configuration file corresponding to the target animation of an animation application program installed on the second terminal;
and acquiring parameter information of the target animation according to the configuration file.
In one possible implementation manner, the obtaining the model data and the rendering information of the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types includes:
acquiring a plurality of animation objects matched with the target animation according to the parameter information and a plurality of animation classes which are prestored and matched with different animation types;
and according to the parameter information, obtaining model data, first rendering information and second rendering information for describing animation relations among different animation objects in the plurality of animation objects.
In one possible implementation manner, the target animation is composed of one or more atomic animations, and the parameter information includes the animation type of each atomic animation corresponding to the target animation, the animation parameters of each atomic animation, the model identifier of each atomic animation, and association parameters between different atomic animations;
The obtaining a plurality of animation objects matched with the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types, comprising:
according to a plurality of animation classes which are pre-stored and matched with different animation types, creating a plurality of animation objects matched with the target animation according to the animation type and the animation parameters of each atomic animation in the parameter information, wherein each animation object is respectively configured with the corresponding animation parameters;
the obtaining, according to the parameter information, model data of each of the plurality of animation objects, first rendering information, and second rendering information for describing animation relationships between different animation objects of the plurality of animation objects, includes:
according to the model identification of each atomic animation corresponding to the target animation in the parameter information, obtaining the model data of each animation object in the plurality of animation objects;
generating first rendering information of each animation object according to the animation parameters configured by each animation object;
creating association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
And generating second rendering information for describing the animation relation between different animation objects in the plurality of animation objects according to the association information.
In a possible implementation manner, the obtaining the model data of each animation object in the plurality of animation objects according to the model identifier of each atomic animation corresponding to the target animation in the parameter information includes:
obtaining a model file;
and obtaining the model data of each animation object in the plurality of animation objects from the model file according to the model identification of each atomic animation corresponding to the target animation in the parameter information.
In one possible implementation, the animation types corresponding to the plurality of animation classes are animation types supported by the animation application program.
According to a second aspect of the embodiments of the present disclosure, there is provided an animation display device applied to a first terminal having a target programming interface, including:
the first acquisition module is configured to acquire parameter information of a target animation to be displayed, wherein the target animation is an animation drawn by an animation application program installed on the second terminal;
the second acquisition module is configured to acquire model data and rendering information of the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types;
The mapping module is configured to convert the data types of the model data and the rendering information from a first data type supported by a preset programming interface to a second data type supported by a target programming interface of the first terminal according to a preset mapping relation between the preset programming interface and programming interfaces of different development platforms;
and the rendering module is configured to render the converted model data according to the converted rendering information and display a target animation.
In one possible implementation manner, the first obtaining module includes:
a first obtaining submodule configured to obtain a configuration file of a target animation, wherein the configuration file is a configuration file corresponding to the target animation of an animation application program installed on the second terminal;
and the second acquisition sub-module is configured to acquire the parameter information of the target animation according to the configuration file.
In one possible implementation manner, the second obtaining module includes:
a third obtaining sub-module configured to obtain a plurality of animation objects matched with the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types;
And a fourth obtaining sub-module configured to obtain, according to the parameter information, model data of each of the plurality of animation objects, first rendering information, and second rendering information for describing an animation relationship between different animation objects of the plurality of animation objects.
In one possible implementation manner, the target animation is composed of one or more atomic animations, and the parameter information includes the animation type of each atomic animation corresponding to the target animation, the animation parameters of each atomic animation, the model identifier of each atomic animation, and association parameters between different atomic animations;
the third acquisition submodule includes:
a first creating unit configured to create a plurality of animation objects matching the target animation according to the animation type and the animation parameters of each atomic animation in the parameter information according to a plurality of animation classes matching different animation types stored in advance, wherein each animation object is configured with the corresponding animation parameters;
the fourth acquisition submodule includes:
the first acquisition unit is configured to acquire model data of each animation object in the plurality of animation objects according to the model identification of each atomic animation corresponding to the target animation in the parameter information;
A first generation unit configured to generate first rendering information of each animation object according to the animation parameters configured by each animation object;
the second creation unit is configured to create association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
and a second generation unit configured to generate second rendering information for describing an animation relationship between different animation objects among the plurality of animation objects according to the association information.
In one possible embodiment, the first acquisition unit includes:
a first acquisition subunit configured to acquire a model file;
and the second acquisition subunit is configured to acquire the model data of each animation object in the plurality of animation objects from the model file according to the model identification of each atomic animation corresponding to the target animation in the parameter information.
In one possible implementation, the animation types corresponding to the plurality of animation classes are animation types supported by the animation application program.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising:
A processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the animation display method of any one of the above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, cause the electronic device to perform operations implementing the animation display method described in any one of the above.
According to a fifth aspect of embodiments of the present disclosure, there is provided an application program which, when executed by a processor of an electronic device, causes the electronic device to perform operations implementing the animation display method described in any one of the above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
In this way, the embodiments of the present disclosure obtain the parameter information of the target animation to be displayed and combine it with a plurality of pre-stored animation classes matched with different animation types to obtain the model data and rendering information of the target animation; the data types of the model data and rendering information are then converted, according to the preset mapping relationship between the preset programming interface and the programming interfaces of different development platforms, from the first data type supported by the preset programming interface to the second data type supported by the target programming interface of the first terminal, and the converted data are used to display the target animation. The method of the embodiments of the present disclosure can thus run on each first terminal whose programming interface differs from the preset programming interface without modifying the animation code, enabling reuse of the animation code, reducing its maintenance difficulty and development cost, and avoiding separate implementations of the same animation for different development platforms.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart illustrating an animation display method according to an exemplary embodiment;
FIG. 2 is a flowchart illustrating an animation display method according to an exemplary embodiment;
FIG. 3 is a flowchart illustrating an animation display method according to an exemplary embodiment;
FIG. 4 is a block diagram of an animation display device according to an exemplary embodiment;
FIG. 5 is a block diagram of an apparatus for animation display according to an exemplary embodiment;
FIG. 6 is a block diagram of an apparatus for animation display according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
FIG. 1 is a flowchart illustrating an animation display method according to an exemplary embodiment. The method is applied to a first terminal having a target programming interface and may include the following steps:
Step 101, acquiring parameter information of a target animation to be displayed;
The target animation is an animation drawn by an animation application (e.g., AE) installed on a second terminal (e.g., the computer side). To enable the target animation to be displayed on each terminal whose programming interface differs from that of the second terminal, the embodiments of the present disclosure develop a target APP (i.e., a target animation application) that is common to all programming interfaces. The target APP acquires the parameter information of the target animation to be displayed, i.e., the parameter information of the target animation in the existing animation application (e.g., AE installed on the computer side), which describes the target animation.
The present disclosure does not limit which existing animation application is installed on the second terminal; it may be any existing animation application that a designer can use to create the target animation.
The method of the embodiments of the present disclosure re-implements the animations supported by the existing animation application in a target animation APP suitable for installation and use on mobile terminals with any programming interface, so that every animation supported by the existing animation application can be drawn on mobile terminals with any programming interface, avoiding repeated development of the same set of animation code.
In one possible implementation, when step 101 is performed, the configuration file of the target animation may be obtained first; the parameter information of the target animation is then acquired from the configuration file.
The configuration file is a configuration file corresponding to the target animation of the animation application program installed on the second terminal.
The following description takes the second terminal as a computer and the first terminal as a mobile terminal as an example. The present disclosure does not limit the specific terminal types: depending on the application scenario, both may be mobile terminals, both may be computers, or one may be a computer and the other a mobile terminal. It should be noted, however, that the first terminal and the second terminal support different programming interfaces, or run different operating systems.
To obtain the parameter information of the target animation, the configuration file (for example, a JSON file) of the target animation in the computer-side animation application may be read into memory; a parser built into the target APP of the embodiments of the present disclosure then parses the data of the configuration file in memory to obtain the parameter information of the target animation.
In this way, the embodiments of the present disclosure only need to obtain the configuration file of the target animation from the animation application on the second terminal; the parameter information of the target animation can be acquired from that file, so little external data is required. Based on the parameter information and the plurality of pre-stored animation classes, the model data and rendering information of the target animation can be obtained and mapped, and the target animation is displayed on the first terminal using the mapped data. The operation is simple, requires little external data, avoids repeated development for different platforms, and saves development cost.
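For illustration only, the following is a minimal C++ sketch of such a parser, assuming the configuration file is JSON and using the nlohmann/json library; the field names ("atoms", "type", "model_id", "params") and struct names are assumptions, not the actual format used by AE or by this disclosure.

// Sketch only: parse a hypothetical animation configuration file into
// parameter information. All field names below are assumptions.
#include <nlohmann/json.hpp>
#include <fstream>
#include <string>
#include <vector>

struct AtomicAnimationInfo {
    std::string type;       // e.g. "rotation", "displacement"
    std::string modelId;    // identifies the model the atomic animation applies to
    nlohmann::json params;  // animation parameters, kept generic in this sketch
};

std::vector<AtomicAnimationInfo> loadParameterInfo(const std::string& path) {
    std::ifstream in(path);                           // configuration file read into memory
    nlohmann::json config = nlohmann::json::parse(in);
    std::vector<AtomicAnimationInfo> atoms;
    for (const auto& atom : config["atoms"]) {        // one entry per atomic animation
        atoms.push_back({atom["type"].get<std::string>(),
                         atom["model_id"].get<std::string>(),
                         atom["params"]});
    }
    return atoms;
}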
Step 102, obtaining model data and rendering information of the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types;
In one possible implementation, the animation types corresponding to the plurality of animation classes are animation types supported by the animation application installed at the second terminal.
Since AE can produce a wide variety of animation effects, and to enable the target APP installed on the first terminal of the embodiments of the present disclosure to display various types of animation, the target APP includes an animation system in which a plurality of animation classes are pre-stored, that is, a plurality of class (Class) files describing animations. Each class file describes one type of animation (the atomic animation described later) and is therefore called an animation class. Different animation classes correspond to different animation types, and these animation types may be, for example, the animation types supported by AE.
The animation types include, but are not limited to, rotation, displacement, and so on. Taking a rotation-type animation class as an example, the animation class may describe the animation parameters, rendering information, and so on involved in a rotation animation. In this way, no matter which AE animation types are combined into the target animation, the target APP of the embodiments of the present disclosure can implement the animation function, so the animations achievable by the target APP installed on the first terminal are identical to those achievable by the animation application installed on the second terminal.
Default values of the animation parameters of each animation class may be stored in the pre-stored animation classes. For example, the animation class of a rotation animation may store default animation parameters of rotating 30 degrees to the left with a rotation duration of 2 seconds.
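As a minimal C++ sketch of such pre-stored animation classes with default parameter values (here the 30-degrees-left, 2-second rotation default mentioned above); the class names, fields, and sign conventions are assumptions:

// Sketch only: one "animation class" per atomic animation type, each storing
// default parameter values. Names, fields, and sign conventions are assumptions.
#include <string>

struct Animation {                        // common base for all animation classes
    float durationSec = 2.0f;             // default duration: 2 seconds
    virtual ~Animation() = default;
    virtual std::string type() const = 0;
};

struct RotationAnimation : Animation {    // rotation-type animation class
    float angleDeg = -30.0f;              // default: 30 degrees to the left
    std::string type() const override { return "rotation"; }
};

struct DisplacementAnimation : Animation {  // displacement-type animation class
    float initialVelocity = 0.0f;
    float acceleration = 0.0f;
    std::string type() const override { return "displacement"; }
};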
Then in this step, the animation system in the target APP according to the embodiment of the present disclosure may acquire the model data and rendering information of the target animation according to the parameter information of the target animation and a plurality of animation classes stored in advance.
Model data is the object to which an animation is applied. Taking a rotation-type animation as an example, the rotation action must have an object to act on; that object is the model data, for example model 1, and the generated animation is model 1 rotating.
The rendering information may be understood as rendering instructions, which are submitted to hardware (for example, a GPU (Graphics Processing Unit)); the GPU then renders the model data on the screen of the first terminal according to those rendering instructions.
In one possible implementation, when step 102 is performed, referring to FIG. 2, it may be implemented through S201 and S202:
S201, acquiring a plurality of animation objects matched with the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types;
The plurality of animation classes pre-stored in the target APP define the animation parameters of each animation type, their default values, and so on. However, the parameter values of the target animation are not necessarily those defaults, and the target animation may combine animations of several animation types. This step therefore creates a plurality of animation objects from the pre-stored animation classes and the parameter information, where the animation types of the created objects are exactly the animation types contained in the parameter information of the target animation, and each created animation object is configured with the custom parameter values taken from the parameter information.
S202, obtaining, according to the parameter information, model data of each of the plurality of animation objects, first rendering information, and second rendering information for describing animation relationships between different animation objects of the plurality of animation objects.
The parameter information of the target animation can carry data indicating the animation models involved in the target animation, so the model data of each of the plurality of animation objects can be obtained from the parameter information; in addition, the first rendering information of each animation object corresponding to the target animation and the second rendering information describing the animation relationships between different animation objects can also be obtained from the parameter information of the target animation.
In this way, when the model data and rendering information of the target animation are acquired, the parameter information of the target animation and the pre-stored animation classes are used to create a plurality of animation objects matched with the target animation; the parameter information then yields the model data and first rendering information of each animation object, as well as the second rendering information describing the animation relationships between different animation objects, which improves the efficiency of generating the target animation on the first terminal.
In one possible implementation, the target animation is composed of one or more atomic animations, and the parameter information includes the animation type of each atomic animation corresponding to the target animation, the animation parameters of each atomic animation, the model identifier of each atomic animation, and association parameters between different atomic animations.
Any designed target animation is composed of sub-animations that cannot be decomposed further; such indivisible constituent animations are referred to here as atomic animations. One target animation may include one or more atomic animations, whose types include, but are not limited to, rotation, displacement, scaling, and so on.
The configuration file of the target animation records which types of atomic animations compose the target animation, the animation parameters of each atomic animation, and the association parameters between different atomic animations; in addition, it records the object each atomic animation applies to, namely the model identifier.
Therefore, the method of the embodiments of the present disclosure can obtain the parameter information listed above by parsing the configuration file of the target animation.
As for the animation parameters, a rotation-type atomic animation, for example, may include but is not limited to: rotational angular velocity, rotational linear velocity, the axis of rotation, rotation duration, rotation direction, and so on; a displacement-type atomic animation may include but is not limited to: initial velocity, acceleration, displacement trajectory, and so on.
The association parameters between different atomic animations describe, taking a rotation animation and a displacement animation as an example, the display order of the two animations, how the two animations are combined (for example, displacing while rotating, or displacing after rotating), the interval between different animations, and so on. The association parameters between different atomic animations are embodied in the second rendering information described above.
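For illustration, a minimal C++ sketch of what this parameter information might look like, including the association parameters; the Combination values and all field names are assumptions:

// Sketch only: parameter information for a target animation composed of
// atomic animations. The Combination enum and field names are assumptions.
#include <string>
#include <vector>

enum class Combination { Sequential, Parallel };  // e.g. displace after / while rotating

struct AtomicAnimationParams {
    std::string type;       // "rotation", "displacement", ...
    std::string modelId;    // model identifier of the atomic animation
    float durationSec = 0;  // one representative per-type animation parameter
};

struct AssociationParams {           // association between two atomic animations
    int firstIndex = 0;              // indices into the atoms vector below
    int secondIndex = 0;
    Combination combination = Combination::Sequential;
    float intervalSec = 0;           // interval between the two animations
};

struct TargetAnimationParams {
    std::vector<AtomicAnimationParams> atoms;
    std::vector<AssociationParams> associations;
};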
In the embodiments of the present disclosure, referring to FIG. 3, S201 may be implemented through S301:
S301, creating, according to a plurality of pre-stored animation classes matched with different animation types, a plurality of animation objects matched with the target animation according to the animation type and animation parameters of each atomic animation in the parameter information, wherein each animation object is configured with its corresponding animation parameters.
Specifically, the pre-stored animation classes may cover all animation types supported by the animation application, while the target animation may be composed of atomic animations of only some of those types. Animation objects are therefore created only from the animation classes matching the animation types of the atomic animations in the parameter information of the target animation (for example, only from the rotation animation class and the displacement animation class), and the corresponding animation parameters are configured for each created animation object according to the animation parameters of each atomic animation in the parameter information. The animation type of each pre-stored animation class corresponds to the animation type of one atomic animation, and different animation classes correspond to different atomic animation types.
For example, suppose the parameter information of the target animation includes three atomic animations: displacement animation 1 with animation parameters 1, displacement animation 2 with animation parameters 2, and rotation animation 3 with animation parameters 3. This step then creates, from the pre-stored animation classes, two animation objects from the displacement animation class, namely displacement animation object 1 configured with animation parameters 1 and displacement animation object 2 configured with animation parameters 2, and one rotation animation object 3 from the rotation animation class, configured with animation parameters 3.
In this way, the animation parameters configured on each created animation object, and their values, match the target animation rather than the default values of the animation parameters in the pre-stored animation classes.
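A minimal C++ sketch of this creation step, following the displacement-1/2 and rotation-3 example above; the factory shape is an assumption, and the animation types are redeclared here (abbreviated) from the earlier sketch so the snippet stands alone:

// Sketch only: create configured animation objects from pre-stored animation
// classes; defaults are overridden by the custom values from the parameter info.
#include <memory>
#include <stdexcept>
#include <string>
#include <vector>

struct Animation { float durationSec = 2.0f; virtual ~Animation() = default; };
struct RotationAnimation : Animation { float angleDeg = -30.0f; };
struct DisplacementAnimation : Animation { float initialVelocity = 0.0f; };

struct AtomParams { std::string type; float durationSec; };  // from parameter info

std::unique_ptr<Animation> createAnimationObject(const AtomParams& p) {
    std::unique_ptr<Animation> obj;
    if (p.type == "rotation")          obj = std::make_unique<RotationAnimation>();
    else if (p.type == "displacement") obj = std::make_unique<DisplacementAnimation>();
    else throw std::runtime_error("unsupported animation type: " + p.type);
    obj->durationSec = p.durationSec;  // configure the custom parameter value
    return obj;
}

std::vector<std::unique_ptr<Animation>> createAll(const std::vector<AtomParams>& atoms) {
    std::vector<std::unique_ptr<Animation>> objects;
    for (const auto& a : atoms) objects.push_back(createAnimationObject(a));
    return objects;  // e.g. {displacement object 1, displacement object 2, rotation object 3}
}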
Then, when S202 is performed, referring to FIG. 3, it may be implemented through S302 to S305:
S302, obtaining the model data of each animation object in the plurality of animation objects according to the model identifier of each atomic animation corresponding to the target animation in the parameter information;
The method of the embodiments of the present disclosure can use the model identifier to acquire the matching model data and take it as the model data of the atomic animation corresponding to that identifier. The atomic animations correspond one-to-one to the animation objects described in the above embodiments, so the model data of each animation object corresponding to the target animation can be obtained here. For example, the model data of displacement animation object 1 is model 1, the model data of displacement animation object 2 is model 2, and the model data of rotation animation object 3 is model 1.
It should be noted that the model data corresponding to different animation objects may be the same or different, which is not limited in this disclosure.
In one possible implementation, when executing S302, a model file may first be acquired; then, according to the model identifier of each atomic animation corresponding to the target animation in the parameter information, the model data of each animation object in the plurality of animation objects is obtained from the model file.
Specifically, the model file may be obtained directly as the file exported from the animation application, i.e., the model file corresponding to the target animation produced by that application; it may also be obtained via the configuration file: for example, the configuration file may record the path of the model file, and the model file can then be read from that path. The model data in the model file may include two-dimensional model data, three-dimensional model data, and so on, and the model file stores the correspondence between model identifiers and model data.
Since the parameter information of the target animation includes the model identifier of each atomic animation corresponding to the target animation, and the model file stores the correspondence between model identifiers and model data, the model data corresponding to each atomic animation's model identifier can be read from the model file; and since a corresponding animation object has been created for each atomic animation, the model data of each animation object corresponding to the target animation is thereby obtained.
In this way, the model file for the target animation in the animation application is also imported into the target APP of the embodiments of the present disclosure, so the model data corresponding to each atomic animation can be obtained from the model file by means of the model identifiers in the parameter information of the target animation. Because a corresponding animation object has already been created for each atomic animation, the model data of each animation object is obtained, and the target animation displayed by the target APP of the embodiments of the present disclosure is consistent with the effect of the target animation displayed on the computer side.
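As an illustration, a minimal C++ sketch of this lookup, assuming the model file has already been parsed into an in-memory map from model identifier to model data; the Model contents are an assumption:

// Sketch only: resolve each atomic animation's model identifier to model data
// loaded from the model file. The Model contents are an assumption.
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

struct Model {                    // e.g. 2D or 3D mesh data from the model file
    std::vector<float> vertices;
};

using ModelFile = std::map<std::string, Model>;  // model identifier -> model data

const Model& lookupModel(const ModelFile& file, const std::string& modelId) {
    auto it = file.find(modelId);
    if (it == file.end())
        throw std::runtime_error("model not found: " + modelId);
    return it->second;  // different animation objects may share one model,
}                       // e.g. objects 1 and 3 both mapping to model 1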
S303, generating first rendering information of each animation object according to the animation parameters configured by each animation object;
Taking rotation animation object 3 as an example, its animation parameters may include: rotating from 45 degrees to 90 degrees over 2 s, in the clockwise direction, 60 times around the Y axis.
The animation system of the embodiments of the present disclosure may analyze and compute these animation parameters to determine the rendering instructions of rotation animation object 3. For example, one rendering instruction may be the information for drawing a point in the upper-left corner of the screen, 10 cm from the horizontal axis and 2 cm from the vertical axis.
Since the rendering instruction is an instruction to be executed by the GPU, the rendering instruction describes only information on how to draw graphics on the screen of the first terminal. The animation system of the embodiment of the disclosure calculates the drawing information, that is, the rendering instruction, according to the animation parameters of the animation object.
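For illustration, a minimal C++ sketch of deriving such per-frame drawing information from the rotation parameters in the example above (45 to 90 degrees over 2 s); the RenderCommand format is hypothetical:

// Sketch only: derive per-frame "first rendering information" from a rotation
// animation object's configured parameters. The command format is an assumption.
#include <vector>

struct RotationParams {
    float startDeg = 45.0f;   // values from the example above
    float endDeg = 90.0f;
    float durationSec = 2.0f;
};

struct RenderCommand {        // hypothetical unit of first rendering information
    float angleDeg;           // rotation about the Y axis for this frame
};

std::vector<RenderCommand> firstRenderingInfo(const RotationParams& p, float fps) {
    std::vector<RenderCommand> cmds;
    int frames = static_cast<int>(p.durationSec * fps);
    if (frames < 1) frames = 1;                     // guard against empty ranges
    for (int f = 0; f <= frames; ++f) {
        float t = static_cast<float>(f) / frames;   // progress in [0, 1]
        cmds.push_back({p.startDeg + t * (p.endDeg - p.startDeg)});
    }
    return cmds;
}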
S304, creating association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
Since the parameter information of the target animation may include the association parameters between the different atomic animations corresponding to the target animation, and the preceding steps have created a corresponding animation object for each atomic animation, the association information between the different animation objects corresponding to the target animation also needs to be established from those association parameters.
The essential content of the association parameters and the association information may be the same, but their data structures differ: the association information is an attribute of an animation object and must satisfy the attribute definition in the animation class, whereas the association parameters are merely part of the parameter information of the target animation.
And S305, generating second rendering information for describing the animation relation between different animation objects in the plurality of animation objects according to the association information.
Step S303 generates first rendering information for each animation object corresponding to the target animation. However, one target animation may be formed from a plurality of atomic animations that have relationships among them, so the animation system of the embodiments of the present disclosure also processes and computes the relationships between the different animation objects corresponding to the target animation, thereby generating the rendering instructions between different animation objects of the target animation.
Since the rendering instruction is an instruction to be executed by the GPU, the rendering instruction describes only information on how to draw graphics on the screen of the first terminal.
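As one possible illustration, a C++ sketch of turning association information into second rendering information, modeled here as start-time offsets on a shared timeline; the Sequential/Parallel distinction and all names are assumptions:

// Sketch only: derive timeline offsets ("second rendering information") from
// association information between adjacent animation objects. assoc must hold
// exactly one entry per adjacent pair, i.e. durations.size() - 1 entries.
#include <vector>

enum class Combination { Sequential, Parallel };

struct Association {          // association info between object i and object i+1
    Combination combination;
    float intervalSec;        // gap inserted when the combination is sequential
};

struct Schedule { float startSec; };  // when each animation object begins

std::vector<Schedule> secondRenderingInfo(const std::vector<float>& durations,
                                          const std::vector<Association>& assoc) {
    std::vector<Schedule> out{{0.0f}};  // the first object starts at t = 0
    for (size_t i = 0; i + 1 < durations.size(); ++i) {
        float start = out[i].startSec;  // parallel: same start as the previous object
        if (assoc[i].combination == Combination::Sequential)
            start += durations[i] + assoc[i].intervalSec;  // play after the previous one
        out.push_back({start});
    }
    return out;
}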
In this way, the embodiments of the present disclosure acquire, for each atomic animation corresponding to the target animation, parameter information such as the animation type, the animation parameters, the model identifier, and the association parameters between different atomic animations, and use the pre-stored animation classes to create a plurality of animation objects matched with the target animation; even when two atomic animations share an animation type but differ in animation parameters, two different animation objects are created. The model identifiers in the parameter information yield the model data corresponding to each animation object; the first rendering information of each animation object is generated from its configured animation parameters; and the second rendering information describing the animation relationships between different animation objects is generated from the association relationships between them. This produces the rendering information of each atomic animation of the target animation, the rendering information between different atomic animations, and the object to which each atomic animation applies, namely the model data, without developing multiple sets of code.
Step 103, converting the data types of the model data and the rendering information from a first data type supported by a preset programming interface to a second data type supported by a target programming interface of the first terminal according to a preset mapping relation between the preset programming interface and programming interfaces of different development platforms;
Specifically, the target APP of the embodiments of the present disclosure is written against a preset programming interface (for example, OpenGL (Open Graphics Library)), where the preset programming interface is a graphics API; any graphics API may be chosen when the target APP is developed. The target APP includes an intermediate layer storing a preset mapping relationship between the preset programming interface and the programming interfaces of different development platforms (for example, Vulkan (a cross-platform 2D and 3D drawing API), DirectX (DX for short), and Metal (a low-level rendering application programming interface)), so that the target APP, once written, can run on each development platform without changing its code.
In the preset mapping relationship, there is a one-to-one mapping between the preset programming interface and the programming interface of each development platform. For any one of these mappings, the mapped content may include, but is not limited to, at least one of the following: memory information (corresponding to the byte lengths in the examples below), coordinate axis information, and clipping space information (i.e., the range of depth values). Mapping this content achieves the data type conversion.
For example, if the preset programming interface used by the target APP is OpenGL and the target programming interface supported by the first terminal on which the target APP is installed is Vulkan, the intermediate layer of the embodiments of the present disclosure may convert the data types of the model data and the rendering information from the data types required by OpenGL to the data types supported by Vulkan, according to the OpenGL-to-Vulkan mapping in the preset mapping relationship, so that the driver and GPU of the first terminal can recognize the converted model data and rendering information and complete the display of the target animation.
In other words, different programming interfaces support different data types, so the model data and rendering information must be converted from the data types of the preset programming interface into the data types supported by the programming interface of the first terminal on which the target APP is currently installed. For example, if rendering function 1 of programming interface 1 accepts type-A rendering information while rendering function 2 of programming interface 2 accepts type-B rendering information, the data type of the rendering information must be converted from type A to type B, for example converting its byte length from the 1 KB supported by type A to the 0.5 KB supported by type B.
Specifically, the data type conversion may apply memory space conversion, coordinate value conversion, and clipping space conversion to the rendering information, thereby converting it from type-A data to type-B data, so that the converted rendering information can be called by rendering function 2 of programming interface 2.
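For illustration, a C++ sketch of one concrete piece of such a mapping: correcting a projection matrix from OpenGL clip-space conventions (Y up, depth in [-1, 1]) to Vulkan conventions (Y down, depth in [0, 1]). The column-major matrix layout is an assumption, and only the clip-space part of the conversion is shown:

// Sketch only: adjust OpenGL-style clip space to Vulkan clip space.
// Vulkan flips Y and uses depth in [0, 1] instead of OpenGL's [-1, 1].
#include <array>

using Mat4 = std::array<float, 16>;  // column-major 4x4 matrix (assumption)

Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int c = 0; c < 4; ++c)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                r[c * 4 + row] += a[k * 4 + row] * b[c * 4 + k];
    return r;
}

Mat4 openglToVulkanClip(const Mat4& proj) {
    // Pre-multiply by the standard correction matrix:
    //   1   0   0    0
    //   0  -1   0    0     (flip Y)
    //   0   0   1/2  1/2   (map z from [-1, 1] to [0, 1])
    //   0   0   0    1
    const Mat4 fix{1, 0, 0, 0,
                   0, -1, 0, 0,
                   0, 0, 0.5f, 0,
                   0, 0, 0.5f, 1};
    return multiply(fix, proj);
}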
Step 104, rendering the converted model data according to the converted rendering information, and displaying the target animation.
The intermediate layer of the embodiments of the present disclosure issues both the converted rendering information and the converted model data to the hardware driver layer, so that the GPU can render the converted model data according to the converted rendering information, thereby displaying the target animation on the screen of the first terminal.
In one possible implementation, real-time display of the target animation can be achieved by rendering the model data corresponding to each animation object according to the converted first rendering information, and rendering the animation relationships between the model data of different animation objects according to the converted second rendering information.
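A final sketch under the same assumptions: each frame, the converted per-object commands (first rendering information) are submitted at the times dictated by the converted schedule (second rendering information). submitToGpu stands in for the hypothetical hand-off to the target API's driver layer:

// Sketch only: per-frame display loop combining converted first and second
// rendering information. submitToGpu is hypothetical, not a real driver call.
#include <cstddef>
#include <vector>

struct RenderCommand { float angleDeg; };  // converted per-frame command
struct Schedule { float startSec; };       // converted timeline offset per object

void submitToGpu(const RenderCommand&);    // assumed hand-off to the driver layer

void drawFrame(float nowSec, float fps,
               const std::vector<std::vector<RenderCommand>>& perObject,
               const std::vector<Schedule>& schedule) {
    for (std::size_t i = 0; i < perObject.size(); ++i) {
        float local = nowSec - schedule[i].startSec;  // time within object i's animation
        if (local < 0) continue;                      // this object has not started yet
        std::size_t frame = static_cast<std::size_t>(local * fps);
        if (frame < perObject[i].size())
            submitToGpu(perObject[i][frame]);         // draw this object's current frame
    }
}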
In this way, the embodiments of the present disclosure obtain the parameter information of the target animation to be displayed and combine it with a plurality of pre-stored animation classes matched with different animation types to obtain the model data and rendering information of the target animation; the data types of the model data and rendering information are then converted, according to the preset mapping relationship between the preset programming interface and the programming interfaces of different development platforms, from the first data type supported by the preset programming interface to the second data type supported by the target programming interface of the first terminal, and the converted data are used to display the target animation. The method of the embodiments of the present disclosure can thus run on each first terminal whose programming interface differs from the preset programming interface without modifying the animation code, enabling reuse of the animation code, reducing its maintenance difficulty and development cost, and avoiding separate implementations of the same animation for different development platforms.
FIG. 4 is a block diagram illustrating the structure of an animation display device applied to a first terminal having a target programming interface, according to an exemplary embodiment. Referring to FIG. 4, the apparatus includes:
A first acquisition module 41 configured to acquire parameter information of a target animation to be displayed, wherein the target animation is an animation drawn by an animation application installed on a second terminal;
a second obtaining module 42 configured to obtain model data and rendering information of the target animation according to the parameter information and a plurality of animation classes that are pre-stored and matched with different animation types;
the mapping module 43 is configured to convert the data types of the model data and the rendering information from a first data type supported by the preset programming interface to a second data type supported by a target programming interface of the first terminal according to a preset mapping relationship between the preset programming interface and programming interfaces of different development platforms;
and a rendering module 44 configured to render the converted model data according to the converted rendering information, and display a target animation.
In one possible implementation, the first obtaining module 41 includes:
a first obtaining submodule configured to obtain a configuration file of a target animation, wherein the configuration file is a configuration file corresponding to the target animation of an animation application program installed on the second terminal;
And the second acquisition sub-module is configured to acquire the parameter information of the target animation according to the configuration file.
In one possible implementation, the second obtaining module 42 includes:
a third obtaining sub-module configured to obtain a plurality of animation objects matched with the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types;
and a fourth obtaining sub-module configured to obtain, according to the parameter information, model data of each of the plurality of animation objects, first rendering information, and second rendering information for describing an animation relationship between different animation objects of the plurality of animation objects.
In one possible implementation manner, the target animation is composed of one or more atomic animations, and the parameter information includes the animation type of each atomic animation corresponding to the target animation, the animation parameters of each atomic animation, the model identifier of each atomic animation, and association parameters between different atomic animations;
the third acquisition submodule includes:
a first creating unit configured to create a plurality of animation objects matching the target animation according to the animation type and the animation parameters of each atomic animation in the parameter information according to a plurality of animation classes matching different animation types stored in advance, wherein each animation object is configured with the corresponding animation parameters;
The fourth acquisition submodule includes:
the first acquisition unit is configured to acquire model data of each animation object in the plurality of animation objects according to the model identification of each atomic animation corresponding to the target animation in the parameter information;
a first generation unit configured to generate first rendering information of each animation object according to the animation parameters configured by each animation object;
the second creation unit is configured to create association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
and a second generation unit configured to generate second rendering information for describing an animation relationship between different animation objects among the plurality of animation objects according to the association information.
In one possible embodiment, the first acquisition unit includes:
a first acquisition subunit configured to acquire a model file;
and the second acquisition subunit is configured to acquire the model data of each animation object in the plurality of animation objects from the model file according to the model identification of each atomic animation corresponding to the target animation in the parameter information.
In one possible implementation, the animation types corresponding to the plurality of animation classes are animation types supported by the animation application program.
The specific manner in which the various modules of the apparatus in the above embodiments perform their operations has been described in detail in the embodiments of the method and will not be elaborated here.
FIG. 5 is a block diagram illustrating an apparatus 800 for animation display according to an exemplary embodiment. For example, the apparatus 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 5, apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the device 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect an on/off state of the apparatus 800 and the relative positioning of components, such as the display and keypad of the apparatus 800. The sensor assembly 814 may also detect a change in position of the apparatus 800 or of one of its components, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication, wired or wireless, between the apparatus 800 and other devices. The apparatus 800 may access a wireless network based on a communication standard, such as WiFi, an operator network (e.g., 2G, 3G, 4G, or 5G), or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 804 including instructions executable by the processor 820 of the apparatus 800 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, an application program is also provided that includes instructions, such as the instructions in the memory 804, executable by the processor 820 of the apparatus 800 to perform the above-described methods.
Fig. 6 is a block diagram illustrating an apparatus 1900 for animation display according to an exemplary embodiment. For example, the apparatus 1900 may be provided as a server. Referring to Fig. 6, the apparatus 1900 includes a processing component 1922 that further includes one or more processors, and memory resources represented by a memory 1932 for storing instructions executable by the processing component 1922, such as application programs. The application programs stored in the memory 1932 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1922 is configured to execute the instructions to perform the methods described above.
The apparatus 1900 may further include a power component 1926 configured to perform power management of the apparatus 1900, a wired or wireless network interface 1950 configured to connect the apparatus 1900 to a network, and an input/output (I/O) interface 1958. The apparatus 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It should be noted that the execution subject of the present disclosure may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like; or it may be a server. When the electronic device is a terminal such as a mobile phone or a computer, its structure is as shown in Fig. 5; when the electronic device is a server, its structure is as shown in Fig. 6.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An animation display method applied to a first terminal having a target programming interface, the method comprising:
acquiring parameter information of a target animation to be displayed, wherein the target animation is an animation drawn by an animation application program installed on a second terminal;
obtaining model data and rendering information of the target animation according to the parameter information and a plurality of prestored animation classes matched with different animation types, wherein the plurality of animation classes are a plurality of class files for describing the animation, and the animation classes are used for describing animation parameters and rendering information of corresponding animation types;
according to a preset mapping relation between a preset programming interface and programming interfaces of different development platforms, converting the data types of the model data and the rendering information from a first data type supported by the preset programming interface to a second data type supported by a target programming interface of the first terminal;
and rendering the converted model data according to the converted rendering information, and displaying the target animation.
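By way of non-limiting illustration only, the following minimal Kotlin sketch traces the four steps recited in claim 1; it is not an implementation from the disclosure, and every name in it (ParamInfo, animationClasses, convert, display) is hypothetical, with the final rendering step stubbed as a print.

```kotlin
// Hypothetical sketch of the claimed pipeline; all names are illustrative.
data class ParamInfo(val type: String, val params: Map<String, Float>)
data class ModelData(val vertices: List<Float>)     // first data type (preset interface)
data class RenderInfo(val passes: List<String>)
data class NativeModel(val buffer: FloatArray)      // second data type (target interface)
data class NativeRenderInfo(val passes: List<String>)

// Pre-stored "animation classes": one builder per supported animation type.
val animationClasses: Map<String, (ParamInfo) -> Pair<ModelData, RenderInfo>> = mapOf(
    "translate" to { p ->
        ModelData(listOf(p.params["dx"] ?: 0f, p.params["dy"] ?: 0f)) to
            RenderInfo(listOf("pass:translate"))
    }
)

// Mapping from preset-interface data types to the first terminal's
// target-interface data types (trivialized here to a type conversion).
fun convert(model: ModelData, info: RenderInfo): Pair<NativeModel, NativeRenderInfo> =
    NativeModel(model.vertices.toFloatArray()) to NativeRenderInfo(info.passes)

fun display(param: ParamInfo) {
    val build = animationClasses[param.type] ?: error("unsupported animation type")
    val (model, info) = build(param)               // obtain model data + rendering info
    val (nModel, nInfo) = convert(model, info)     // convert data types via the mapping
    println("render ${nModel.buffer.toList()} with ${nInfo.passes}")  // render (stubbed)
}

fun main() = display(ParamInfo("translate", mapOf("dx" to 1f, "dy" to 2f)))
```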
2. The animation display method according to claim 1, wherein the acquiring parameter information of the target animation to be displayed comprises:
acquiring a configuration file of the target animation, wherein the configuration file is a configuration file corresponding to the target animation in the animation application program installed on the second terminal;
and acquiring parameter information of the target animation according to the configuration file.
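Claim 2 leaves the configuration file's layout open. As one hedged illustration, the Kotlin sketch below assumes a simple line-based format (one atomic animation per line); the format and all names (AtomicAnimParams, parseConfig) are assumptions, not part of the claim.

```kotlin
// Hypothetical config format: each non-blank line is "type;modelId;k=v,k=v".
data class AtomicAnimParams(val type: String, val modelId: String, val params: Map<String, Float>)

fun parseConfig(config: String): List<AtomicAnimParams> =
    config.lines().filter { it.isNotBlank() }.map { line ->
        val (type, modelId, kv) = line.split(";")
        val params = kv.split(",")
            .map { it.split("=") }
            .associate { (k, v) -> k to v.toFloat() }   // parameter info per atomic animation
        AtomicAnimParams(type.trim(), modelId.trim(), params)
    }

fun main() {
    val config = """
        translate;heart;dx=1.0,dy=2.0
        rotate;star;angle=90.0
    """.trimIndent()
    println(parseConfig(config))
}
```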
3. The animation display method according to claim 1, wherein the obtaining model data and rendering information of the target animation according to the parameter information and the plurality of pre-stored animation classes matched with different animation types comprises:
acquiring a plurality of animation objects matched with the target animation according to the parameter information and the plurality of pre-stored animation classes matched with different animation types;
and obtaining, according to the parameter information, model data of each animation object in the plurality of animation objects, first rendering information, and second rendering information for describing animation relationships between different animation objects in the plurality of animation objects.
4. The animation display method according to claim 3, wherein the target animation is composed of one or more atomic animations, and the parameter information includes an animation type of each atomic animation corresponding to the target animation, an animation parameter of each atomic animation, a model identification of each atomic animation, and association parameters between different atomic animations;
the obtaining a plurality of animation objects matched with the target animation according to the parameter information and the plurality of pre-stored animation classes matched with different animation types comprises:
creating, according to the plurality of pre-stored animation classes matched with different animation types, a plurality of animation objects matched with the target animation based on the animation type and the animation parameters of each atomic animation in the parameter information, wherein each animation object is configured with its corresponding animation parameters;
the obtaining, according to the parameter information, model data of each animation object in the plurality of animation objects, first rendering information, and second rendering information for describing animation relationships between different animation objects in the plurality of animation objects comprises:
according to the model identification of each atomic animation corresponding to the target animation in the parameter information, obtaining the model data of each animation object in the plurality of animation objects;
generating first rendering information of each animation object according to the animation parameters configured by each animation object;
creating association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
and generating second rendering information for describing the animation relationship between different animation objects in the plurality of animation objects according to the association information.
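A minimal sketch of claim 4's flow, under stated assumptions: the two animation classes, the Association record, and all field names below are invented for illustration. One animation object is created per atomic animation, first rendering information comes from each object's own parameters, and second rendering information is generated from the association information.

```kotlin
// Hypothetical sketch of claim 4; every class and field name is invented.
data class AtomicAnim(val type: String, val params: Map<String, Float>)
data class Association(val from: Int, val to: Int, val delayMs: Long)   // association parameter

abstract class AnimationObject(val params: Map<String, Float>) {
    abstract fun firstRenderInfo(): String          // per-object rendering info
}
class TranslateAnimation(p: Map<String, Float>) : AnimationObject(p) {
    override fun firstRenderInfo() = "translate by (${params["dx"]}, ${params["dy"]})"
}
class RotateAnimation(p: Map<String, Float>) : AnimationObject(p) {
    override fun firstRenderInfo() = "rotate by ${params["angle"]} deg"
}

// Pre-stored animation classes, keyed by animation type.
val preStoredClasses: Map<String, (Map<String, Float>) -> AnimationObject> = mapOf(
    "translate" to ::TranslateAnimation,
    "rotate" to ::RotateAnimation
)

fun main() {
    val atoms = listOf(
        AtomicAnim("translate", mapOf("dx" to 1f, "dy" to 2f)),
        AtomicAnim("rotate", mapOf("angle" to 90f))
    )
    val links = listOf(Association(from = 0, to = 1, delayMs = 200))

    val objects = atoms.map { preStoredClasses.getValue(it.type)(it.params) }  // animation objects
    val firstInfo = objects.map { it.firstRenderInfo() }
    // Association info between objects -> second rendering information.
    val secondInfo = links.map { "object #${it.to} starts ${it.delayMs} ms after object #${it.from}" }
    println(firstInfo)
    println(secondInfo)
}
```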
5. The animation display method according to claim 4, wherein the obtaining model data of each animation object in the plurality of animation objects according to the model identification of each atomic animation corresponding to the target animation in the parameter information comprises:
obtaining a model file;
and obtaining the model data of each animation object in the plurality of animation objects from the model file according to the model identification of each atomic animation corresponding to the target animation in the parameter information.
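Claim 5's model-file lookup might look like the sketch below, where an in-memory map stands in for the parsed model file and the model identifications key the model data; the file format and all names (ModelFile, modelDataFor) are hypothetical.

```kotlin
// Hypothetical sketch of claim 5: model data looked up by model identification.
class ModelFile(private val entries: Map<String, FloatArray>) {
    fun modelDataFor(modelId: String): FloatArray =
        entries[modelId] ?: error("model file has no entry for id '$modelId'")
}

fun main() {
    val modelFile = ModelFile(mapOf("heart" to floatArrayOf(0f, 0.5f, 1f)))
    val modelIds = listOf("heart")                        // one id per atomic animation
    val modelData = modelIds.map { modelFile.modelDataFor(it) }
    println(modelData.map { it.toList() })
}
```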
6. The animation display method according to claim 1, wherein the animation types corresponding to the plurality of animation classes are animation types supported by the animation application.
7. An animation display device, for use with a first terminal having a target programming interface, comprising:
the first acquisition module is configured to acquire parameter information of a target animation to be displayed, wherein the target animation is an animation drawn by an animation application program installed on a second terminal;
the second acquisition module is configured to acquire model data and rendering information of the target animation according to the parameter information and a plurality of pre-stored animation classes matched with different animation types, wherein the plurality of animation classes are a plurality of class files for describing the animation, and the animation classes are used for describing animation parameters and rendering information of the corresponding animation types;
the mapping module is configured to convert the data types of the model data and the rendering information from a first data type supported by a preset programming interface to a second data type supported by a target programming interface of the first terminal according to a preset mapping relation between the preset programming interface and programming interfaces of different development platforms;
and the rendering module is configured to render the converted model data according to the converted rendering information and display the target animation.
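The mapping module recited in claim 7 can be pictured as a lookup table from preset-interface data types to each development platform's native types. The Kotlin sketch below is one hedged reading of that idea; the platform set and the native type names (e.g. MTLBuffer for Metal, GLuint handles for OpenGL ES) are illustrative stand-ins, not a mapping taken from the disclosure.

```kotlin
// Hypothetical preset-to-target type mapping table; values are illustrative only.
enum class Platform { ANDROID_GLES, IOS_METAL, WINDOWS_D3D }

val typeMapping: Map<String, Map<Platform, String>> = mapOf(
    "PresetVertexBuffer" to mapOf(
        Platform.ANDROID_GLES to "GLuint (glGenBuffers handle)",
        Platform.IOS_METAL to "MTLBuffer",
        Platform.WINDOWS_D3D to "ID3D11Buffer"
    )
)

fun targetTypeFor(presetType: String, platform: Platform): String =
    typeMapping[presetType]?.get(platform)
        ?: error("no mapping for $presetType on $platform")

fun main() = println(targetTypeFor("PresetVertexBuffer", Platform.IOS_METAL))
```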
8. The animation display device according to claim 7, wherein the first acquisition module comprises:
a first acquisition submodule configured to acquire a configuration file of the target animation, wherein the configuration file is a configuration file corresponding to the target animation in the animation application program installed on the second terminal;
and a second acquisition submodule configured to acquire the parameter information of the target animation according to the configuration file.
9. The animation display device of claim 7, wherein the second acquisition module comprises:
a third acquisition submodule configured to acquire a plurality of animation objects matched with the target animation according to the parameter information and the plurality of pre-stored animation classes matched with different animation types;
and a fourth acquisition submodule configured to acquire, according to the parameter information, model data of each animation object in the plurality of animation objects, first rendering information, and second rendering information for describing animation relationships between different animation objects in the plurality of animation objects.
10. The animation display device according to claim 9, wherein the target animation is composed of one or more atomic animations, and the parameter information includes an animation type of each atomic animation corresponding to the target animation, an animation parameter of each atomic animation, a model identification of each atomic animation, and association parameters between different atomic animations;
the third acquisition submodule includes:
a first creation unit configured to create, according to the plurality of pre-stored animation classes matched with different animation types, a plurality of animation objects matched with the target animation based on the animation type and the animation parameters of each atomic animation in the parameter information, wherein each animation object is configured with its corresponding animation parameters;
the fourth acquisition submodule includes:
the first acquisition unit is configured to acquire model data of each animation object in the plurality of animation objects according to the model identification of each atomic animation corresponding to the target animation in the parameter information;
a first generation unit configured to generate first rendering information of each animation object according to the animation parameters configured by each animation object;
the second creation unit is configured to create association information between different animation objects in the plurality of animation objects according to association parameters between different atomic animations corresponding to the target animation;
and a second generation unit configured to generate second rendering information for describing an animation relationship between different animation objects among the plurality of animation objects according to the association information.
11. The animation display device according to claim 10, wherein the first acquisition unit comprises:
a first acquisition subunit configured to acquire a model file;
and a second acquisition subunit configured to acquire the model data of each animation object in the plurality of animation objects from the model file according to the model identification of each atomic animation corresponding to the target animation in the parameter information.
12. The animation display device of claim 7, wherein the animation types corresponding to the plurality of animation classes are animation types supported by the animation application.
13. An electronic device, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to perform operations to implement the animation display method of any one of claims 1 to 6.
14. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the animation display method of any one of claims 1 to 6.
CN201910253191.5A 2019-03-29 2019-03-29 Animation display method, device, electronic equipment and storage medium Active CN110136230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910253191.5A CN110136230B (en) 2019-03-29 2019-03-29 Animation display method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910253191.5A CN110136230B (en) 2019-03-29 2019-03-29 Animation display method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110136230A CN110136230A (en) 2019-08-16
CN110136230B true CN110136230B (en) 2023-09-05

Family

ID=67568848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910253191.5A Active CN110136230B (en) 2019-03-29 2019-03-29 Animation display method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110136230B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110659024B (en) * 2019-08-21 2023-12-26 北京达佳互联信息技术有限公司 Graphics resource conversion method and device, electronic equipment and storage medium
CN110751592A (en) * 2019-08-21 2020-02-04 北京达佳互联信息技术有限公司 Graphic resource conversion method, apparatus, electronic device and storage medium
CN110647325A (en) * 2019-08-21 2020-01-03 北京达佳互联信息技术有限公司 Graphic resource conversion method, apparatus, electronic device and storage medium
CN110865800B (en) * 2019-11-01 2021-03-09 浙江大学 Full-platform three-dimensional reconstruction code processing method based on engine modularization
CN111951355A (en) * 2020-08-04 2020-11-17 北京字节跳动网络技术有限公司 Animation processing method and device, computer equipment and storage medium
CN112560397B (en) * 2020-12-24 2022-02-25 成都极米科技股份有限公司 Drawing method, drawing device, terminal equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000011199A (en) * 1998-06-18 2000-01-14 Sony Corp Automatic generating method for animation
CN101484921A (en) * 2006-03-28 2009-07-15 斯特里米泽公司 Method for calculating animation parameters of objects of a multimedia scene
CN109359262A (en) * 2018-10-11 2019-02-19 广州酷狗计算机科技有限公司 Animation playing method, device, terminal and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9977566B2 (en) * 2014-06-24 2018-05-22 Google Llc Computerized systems and methods for rendering an animation of an object in response to user input

Also Published As

Publication number Publication date
CN110136230A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN110136230B (en) Animation display method, device, electronic equipment and storage medium
US11741583B2 (en) Face image processing method and apparatus, electronic device, and storage medium
CN110989901B (en) Interactive display method and device for image positioning, electronic equipment and storage medium
CN110874217A (en) Interface display method and device for fast application and storage medium
CN112785672B (en) Image processing method and device, electronic equipment and storage medium
CN111459586A (en) Remote assistance method, device, storage medium and terminal
WO2022134475A1 (en) Point cloud map construction method and apparatus, electronic device, storage medium and program
CN110989905A (en) Information processing method and device, electronic equipment and storage medium
CN110929616B (en) Human hand identification method and device, electronic equipment and storage medium
CN110865863B (en) Interface display method and device for fast application and storage medium
CN112433724A (en) Target component style generation method and device, electronic equipment and storage medium
CN112950712B (en) Positioning method and device, electronic equipment and storage medium
CN114067085A (en) Virtual object display method and device, electronic equipment and storage medium
CN112882784A (en) Application interface display method and device, intelligent equipment and medium
CN112508020A (en) Labeling method and device, electronic equipment and storage medium
CN110865864A (en) Interface display method, device and equipment for fast application and storage medium
CN112437090B (en) Resource loading method and device, electronic equipment and storage medium
CN111610856B (en) Vibration feedback method, vibration feedback device and storage medium
CN110312117B (en) Data refreshing method and device
CN114827721A (en) Video special effect processing method and device, storage medium and electronic equipment
CN114266305A (en) Object identification method and device, electronic equipment and storage medium
CN113419650A (en) Data moving method and device, storage medium and electronic equipment
CN109407942B (en) Model processing method and device, control client and storage medium
CN113031781A (en) Augmented reality resource display method and device, electronic equipment and storage medium
CN109389547B (en) Image display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant