CN106611435B - Animation processing method and device - Google Patents


Publication number
CN106611435B
CN106611435B (application number CN201611201647.6A)
Authority
CN
China
Prior art keywords
animation
frame
image
elements
contained
Prior art date
Legal status
Active
Application number
CN201611201647.6A
Other languages
Chinese (zh)
Other versions
CN106611435A (en)
Inventor
崔明辉 (Cui Minghui)
Current Assignee
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd filed Critical Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202211667831.5A (published as CN115908644A)
Priority to CN202211688081.XA (published as CN115830190A)
Priority to CN201611201647.6A (published as CN106611435B)
Publication of CN106611435A
Application granted
Publication of CN106611435B
Legal status: Active


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 — Animation
    • G06T 15/00 — 3D [Three Dimensional] image rendering
    • G06T 2213/00 — Indexing scheme for animation
    • G06T 2213/04 — Animation description language

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an animation processing method and device. The method comprises the following steps: acquiring sequence frames in SVG format of an animation to be played; dividing the animation object contained in each frame image of the sequence frames into a plurality of animation elements; acquiring a bitmap image of each animation element as the element bitmap corresponding to that animation element; generating an animation description file from the animation elements contained in each frame image and the animation parameters of those elements in that frame image, wherein the animation description file comprises the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image; and determining the animation description file, together with the element bitmap corresponding to each animation element, as the animation source file of the animation to be played. With the method and device, an animation source file occupying less space can be obtained, and processing this source file enables fast animation playback while effectively reducing processor and memory consumption.

Description

Animation processing method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to an animation processing method and apparatus.
Background
In current computer technology, animation is gradually becoming a hot spot of Internet applications. In particular, the rise of online live-streaming services has further increased the demand for high-performance, low-consumption animation composition technology. In the related art, technologies such as APNG (Animated Portable Network Graphics), Flash (an interactive vector graphics and Web animation standard), and SVG (Scalable Vector Graphics) are generally used to produce animations.
However, loading and playing animations produced with these related technologies consumes considerable processor and memory resources, and animation processing efficiency is low.
Disclosure of Invention
The application provides an animation processing method and device, which can reduce processor and memory consumption when an animation is loaded and played, and improve animation processing efficiency.
According to a first aspect of embodiments of the present application, there is provided an animation processing method, including:
acquiring sequence frames in SVG format of an animation to be played;
dividing animation objects contained in each frame of image in the sequence frame into a plurality of animation elements;
acquiring a bitmap image of each animation element as an element bitmap corresponding to the animation element;
generating an animation description file according to animation elements contained in each frame of image and animation parameters of the contained animation elements in the frame of image, wherein the animation description file comprises animation attributes, the animation elements contained in each frame of image and the animation parameters of the contained animation elements in each frame of image;
and determining the animation description file, together with the element bitmap corresponding to each animation element, as the animation source file of the animation to be played.
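The source-file structure produced by the steps above can be sketched as a minimal data model. This is an illustrative Python sketch, not the patent's implementation; all class and field names are assumptions chosen to mirror the animation attributes and animation parameters named in the text:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ElementRef:
    """One animation element's appearance in a frame, with its animation parameters."""
    element_id: str
    x: float = 0.0        # position parameter
    y: float = 0.0
    alpha: float = 1.0    # transparency parameter
    scale: float = 1.0    # size parameter
    z_order: int = 0      # layer-sequence parameter

@dataclass
class AnimationDescription:
    """The animation description file: animation attributes plus per-frame element references."""
    total_frames: int
    fps: int
    width: int            # animation size
    height: int
    frames: List[List[ElementRef]] = field(default_factory=list)

@dataclass
class AnimationSourceFile:
    """Description file plus exactly one bitmap per unique animation element."""
    description: AnimationDescription
    element_bitmaps: Dict[str, bytes]
```

Because `element_bitmaps` stores one entry per unique element rather than one full image per frame, the source file stays small even for long animations.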
In one embodiment, the acquiring of the sequence frames in SVG format of the animation to be played includes:
converting the animation to be played in the Flash format into a sequence frame in the SVG format through a Flash editor;
or,
and converting the animation to be played in the AE format into the sequence frames in the SVG format through the BodyMovin.
In one embodiment, the dividing the animation object included in each frame image in the sequence frame into multiple animation elements includes:
comparing each sequence frame by frame to obtain comparison information between each two adjacent frames, wherein the comparison information comprises the same animation objects between the adjacent frames, the change parameters between the same animation objects, different animation objects and the position relationship between the different animation objects;
based on the comparison information, treating the parts of each animation object whose vectors have not changed and the parts whose vectors have changed as different animation elements, respectively.
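As a rough illustration of this comparison step, the sketch below treats an animation object as a mapping from part names to vector data and partitions the parts of two adjacent frames into unchanged and changed sets. The function name and data representation are hypothetical simplifications, not from the patent:

```python
def split_into_elements(prev_frame: dict, curr_frame: dict):
    """Compare two adjacent frames of one animation object and partition its
    parts into vector-unchanged and vector-changed sets.
    Each frame maps part-name -> vector data (any comparable value)."""
    unchanged = {p for p in prev_frame
                 if p in curr_frame and prev_frame[p] == curr_frame[p]}
    # Changed = parts present in only one frame, plus parts whose vectors differ.
    changed = (set(prev_frame) ^ set(curr_frame)) | {
        p for p in prev_frame
        if p in curr_frame and prev_frame[p] != curr_frame[p]}
    return unchanged, changed
```

Each resulting part would then become its own animation element, so only the changed parts need fresh bitmaps.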
In one embodiment, the animation parameters include a position parameter, a transparency parameter, a size parameter, and a layer sequence parameter.
In one embodiment, the animation attributes include the total frame number of the animation, the identification of each frame image, the FPS, and the animation size.
In one embodiment, the method further comprises the steps of:
performing file compression on the animation source file;
and transmitting the compressed animation source file to a specified address so that the animation playing end can obtain the animation source file from the specified address.
According to a second aspect of embodiments of the present application, there is provided an animation processing method, including:
acquiring an animation source file of an animation to be played, wherein the animation source file comprises an animation description file and element bitmaps, the element bitmaps are bitmap images of animation elements contained in the animation to be played, and the animation description file comprises animation attributes, animation elements contained in each frame image and animation parameters of the contained animation elements in each frame image;
rendering all element bitmaps to generate animation layers of all animation elements;
acquiring animation playing information from the animation description file, wherein the animation playing information comprises animation attributes, animation elements contained in each frame of image and animation parameters of the contained animation elements in the frame of image;
and combining the animation layers of the animation elements contained in each frame of image according to the animation parameters based on the animation playing information to realize the playing of the animation to be played.
In one embodiment, the combining, based on the animation playing information, the animation layers of the animation elements included in each frame of image according to the animation parameters to realize the playing of the animation to be played includes:
comparing animation elements contained in each frame image and animation parameters of the contained animation elements in each frame image frame by frame to obtain comparison information between each two adjacent frame images, wherein the comparison information comprises the same animation elements between the adjacent frame images, change parameters between the same animation elements, different animation elements and hierarchical relations among the animation elements;
and based on the acquired comparison information, playing the next frame image by adjusting the animation layer types of the previous frame image, the superposition order of the animation layers, the superposition positions of the animation layers, and other animation parameters of the animation layers.
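A minimal sketch of this frame-to-frame adjustment, assuming each frame is described as a mapping from layer identifier to animation parameters (a hypothetical simplification of the comparison information described above):

```python
def advance_frame(stage: dict, next_frame: dict):
    """Adjust the previous frame's on-screen layers to produce the next frame:
    drop layers that disappear, attach new ones, and update the animation
    parameters (position, order, ...) of layers that persist. The rendered
    bitmaps of persisting layers are reused, never re-rendered."""
    removed = [lid for lid in stage if lid not in next_frame]
    added = [lid for lid in next_frame if lid not in stage]
    reused = [lid for lid in next_frame if lid in stage]
    for lid in removed:
        del stage[lid]
    stage.update(next_frame)  # reused layers only get new parameters
    return reused, added, removed
```

The returned lists correspond to the comparison information: same elements between adjacent frames, new elements, and elements that vanish.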
According to a third aspect of embodiments of the present application, there is provided an animation processing apparatus including:
the sequence frame acquisition module is used for acquiring sequence frames of the to-be-played animation in the SVG format;
the element dividing module is used for dividing animation objects contained in each frame image in the sequence frame into a plurality of animation elements;
the bitmap acquisition module is used for acquiring a bitmap image of each animation element as an element bitmap corresponding to the animation element;
the file generation module is used for generating an animation description file according to animation elements contained in each frame of image and animation parameters of the contained animation elements in the frame of image, wherein the animation description file comprises animation attributes, the animation elements contained in each frame of image and the animation parameters of the contained animation elements in each frame of image;
a source file determining module, configured to determine the animation description file and the element bitmap corresponding to each animation element as the animation source file of the animation to be played.
In one embodiment, the sequence frame acquisition module comprises:
the first acquisition module is used for converting the animation to be played in the Flash format into a sequence frame in the SVG format through a Flash editor;
or,
and the second acquisition module is used for converting the animation to be played in the AE format into the sequence frame in the SVG format through the BodyMovin.
In one embodiment, the element partitioning module comprises:
the frame-by-frame comparison module is used for comparing each sequence frame by frame to obtain comparison information between each two adjacent frames, wherein the comparison information comprises the same animation objects between the adjacent frames, change parameters between the same animation objects, different animation objects and position relations between the different animation objects;
and the element division submodule is used for treating, based on the comparison information, the unchanged vector part and the changed vector part of each animation object as different animation elements, respectively.
In one embodiment, the animation parameters comprise a position parameter, a transparency parameter, a size parameter and a layer sequence parameter.
In one embodiment, the animation attributes include a total frame number of the animation, an identification of each frame image, an FPS, and an animation size.
In one embodiment, the apparatus further comprises:
the file compression module is used for carrying out file compression on the animation source file;
and the file transmission module is used for transmitting the compressed animation source file to a specified address so that the animation playing end can obtain the animation source file from the specified address.
According to a fourth aspect of embodiments of the present application, there is provided an animation processing apparatus including:
the source file acquisition module is used for acquiring an animation source file of an animation to be played, wherein the animation source file comprises an animation description file and element bitmaps, the element bitmaps are bitmap images of the animation elements contained in the animation to be played, and the animation description file comprises the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image;
the bitmap rendering module is used for rendering all element bitmaps to generate animation layers of various animation elements;
the information acquisition module is used for acquiring animation playing information from the animation description file, wherein the animation playing information comprises animation attributes, animation elements contained in each frame of image and animation parameters of the contained animation elements in the frame of image;
and the animation playing module is used for combining the animation layers of the animation elements contained in each frame of image according to the animation parameters based on the animation playing information to realize the playing of the animation to be played.
In one embodiment, the animation playback module includes:
the comparison information acquisition module is used for acquiring comparison information between every two adjacent frame images by comparing animation elements contained in every frame image and animation parameters of the contained animation elements in every frame image frame by frame, wherein the comparison information comprises the same animation elements between the adjacent frame images, change parameters between the same animation elements, different animation elements and hierarchical relations among all the animation elements;
and the layer adjusting module is used for playing the next frame image, based on the acquired comparison information, by adjusting the animation layer types of the previous frame image, the superposition order of the animation layers, the superposition positions of the animation layers, and other animation parameters of the animation layers.
In the embodiment of the application, sequence frames in SVG format of an animation to be played are obtained, the animation object contained in each frame image is split into a plurality of animation elements, a bitmap image of each animation element is obtained, an animation description file describing the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image is generated, and finally the animation description file together with the bitmap images of the animation elements is determined as the animation source file of the animation to be played. Because the animation source file comprises only the bitmap images of the animation elements and the animation description file, its size is far smaller than that of the original sequence frames, in which each frame image repeatedly contains the bitmap images of the animation elements. The volume of the animation file can therefore be effectively reduced, overcoming the defect of oversized animation files in the prior art.
In addition, when the animation source file is processed to play the animation, the animation source file comprising the animation description file and the bitmap images of the animation elements is obtained, the bitmap images are rendered to generate the animation layers of the animation elements, animation playing information is obtained from the animation description file, and the animation layers of the animation elements contained in each frame image are combined according to the animation parameters based on the animation playing information, so that the animation to be played is played. Therefore, when different frames of the animation are played, the bitmap image of the same animation element does not need to be re-rendered, so that animation playback is achieved quickly while processor and memory consumption is effectively reduced.
Drawings
FIG. 1 is a flow chart of one embodiment of an animation processing method of the present application;
FIG. 2a is a flow chart of another embodiment of the animation processing method of the present application;
FIG. 2b is a first schematic diagram of an animation element shown herein according to an exemplary embodiment;
FIG. 2c is a second schematic diagram of an animation element shown herein according to an exemplary embodiment;
FIG. 3 is a block diagram of an embodiment of an animation processing apparatus according to the present application;
FIG. 4 is a block diagram of another embodiment of an animation processing apparatus according to the present application;
FIG. 5 is a hardware configuration diagram of the animation processing device according to the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
With the development of network technologies, static text and pictures can no longer meet the content-display requirements of network applications, and animation technologies capable of dynamically displaying scene information have emerged. In particular, the rise of online live-streaming services has driven the industry's demand for high-performance, low-consumption animation construction technology. For example, in an online live-streaming scenario, when a user purchases a virtual article, a celebration animation can be loaded in the live page to give the user a sufficient sense of achievement. Constructing and loading such a celebration animation consumes a certain amount of server-side resources, and when a large amount of similar content needs to be loaded, a high-performance, low-consumption animation construction technology becomes particularly important. In a network application environment, the computing resources of servers and clients are limited, as is network transmission bandwidth. Therefore, when animation technology is applied, the composition and playback processes should not be too complicated, to avoid imposing high hardware requirements on the server and the client; and the animation file should not be too bulky, to avoid high bandwidth consumption. Animation composition technology extends to web pages, desktop applications, and mobile applications on intelligent terminals; every application scenario using animation content needs high-performance, low-consumption animation composition technology.
The present application provides an animation processing method in which, in the sequence frames in SVG (Scalable Vector Graphics) format of an animation to be played, the animation object contained in each frame image is divided into a plurality of animation elements, a bitmap image of each animation element is obtained, an animation description file describing the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image is generated, and finally the animation description file together with the bitmap images of the animation elements is determined as the animation source file of the animation to be played. The animation source file comprises only the bitmap images of the animation elements and the animation description file, so its size is far smaller than that of the original sequence frames, in which each frame image repeatedly contains the bitmap images of the animation elements. This effectively reduces the volume of the animation file and solves the defect of oversized animation files in the prior art.
In addition, when the animation source file is processed to play the animation, the animation source file comprising the animation description file and the bitmap images of the animation elements is obtained, the bitmap images are rendered to generate the animation layers of the animation elements, animation playing information is obtained from the animation description file, and the animation layers of the animation elements contained in each frame image are combined according to the animation parameters based on the animation playing information, so that the animation to be played is played. Therefore, when different frames of the animation are played, the bitmap image of the same animation element does not need to be re-rendered, so that animation playback is achieved quickly while processor and memory consumption is effectively reduced.
The animation according to the present application has the concept of a frame, representing the smallest unit of a single picture in the animation. For example, an animation at 20 frames per second is composed of 20 pictures in every second; one picture may occupy a single frame, or the same picture may be repeated over several frames. Each frame represents a moment in the movement or change of a character or object, and the character or object in each frame corresponds to the concept of an "animation object" in the present application.
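As a small worked example of the frame concept, assuming a fixed frame rate (the function names are illustrative, not from the patent):

```python
def frame_count(duration_s: float, fps: int) -> int:
    """Number of frames in an animation segment at a fixed frame rate."""
    return int(duration_s * fps)

def frame_timestamps(total_frames: int, fps: int) -> list:
    """Start time, in seconds, of each frame."""
    return [i / fps for i in range(total_frames)]
```

At 20 FPS, one second of animation spans 20 frames, matching the example in the paragraph above; a picture held for several frames simply repeats across consecutive timestamps.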
The present application is described below with reference to specific embodiments and specific application scenarios.
Fig. 1 is a flowchart of an embodiment of an animation processing method according to the present application, which can be used in a terminal, and includes the following steps 101-105:
step 101: and acquiring the sequence frame of the animation to be played in the SVG format.
The terminal in this embodiment of the present application may be a terminal having an animation editing function. The animation involved may be a gift animation, a special-effect animation, a bullet-screen animation, or the like in a live-streaming application, or may be another type of animation in the field; the present application is not limited in this respect.
When the animation to be played is a gift animation in a live-streaming application, it is typically an animation manuscript edited by an animation designer, in a format such as Photoshop, Flash, or AE (After Effects).
For animations to be played in different formats, the animations can be converted into sequence frames in the SVG format through different conversion means, and in an optional implementation manner, the sequence frames in the SVG format of the animations to be played can be obtained through the following operations:
converting the animation to be played in the Flash format into a sequence frame in the SVG format through a Flash editor;
or,
and converting the animation to be played in the AE format into the sequence frames in the SVG format through the BodyMovin.
In other embodiments of the present application, for animations to be played in other formats, a corresponding animation editor may be used to convert the sequence frames into the SVG format.
Step 102: and dividing animation objects contained in each frame of image in the sequence frame into a plurality of animation elements.
In adjacent sequence frames of the animation to be played, an animation object may change only partially between two consecutive frames while the rest remains unchanged. If the whole animation object is re-rendered in a subsequent frame just to render a local change, the unchanged part is obviously rendered repeatedly, causing extra growth in animation file volume and wasted loading resources. To reduce the animation volume and the waste of loading resources, the animation to be played can be decomposed into individual frame images, and the animation object in each frame image can be split into a plurality of animation elements according to rules that conform to the way the animation object is composed, so that each frame image is decomposed into animation elements plus the animation parameters of each animation element in that frame image. For example, if the animation object contained in each frame image of the animation to be played is a character, the character can be split into elements such as the head, the torso, the hands, and the feet.
When an image is decomposed into animation elements, the frames need to be compared one by one to find the parts that are the same and the parts that differ between adjacent frames before the decomposition is performed. In one example, the animation object contained in each frame image of the sequence frames can be divided into a plurality of animation elements through the following operations:
and comparing the sequence frames frame by frame to obtain comparison information between every two adjacent frames, wherein the comparison information comprises the same animation objects between the adjacent frames, the change parameters between the same animation objects, different animation objects and the position relationship between the different animation objects.
Based on the comparison information, the parts of each animation object whose vectors have not changed and the parts whose vectors have changed are treated as different animation elements, respectively.
In this example, after each frame image is divided, identical animation elements are identified and only one of them is retained as the representative animation element. Which frame image each animation element belongs to, the combination relationship of the animation elements in each frame image, the animation parameters, and so on are recorded. The combination relationship describes how to combine different animation elements back into the animation object originally contained in each frame image; the animation parameters describe the characteristic information of each animation element in each frame image and may include a position parameter, a transparency parameter, a size parameter, a layer-sequence parameter, and the like.
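The deduplication and bookkeeping described here can be sketched as follows. Identifying identical elements by hashing their bitmap content is an assumption made for illustration, not a mechanism stated in the patent:

```python
import hashlib

def deduplicate_elements(frames):
    """frames: per-frame lists of (bitmap_bytes, params_dict) pairs.
    Keeps a single bitmap per distinct element (keyed by content hash) and
    records, for every frame, which element appears and with what animation
    parameters."""
    bitmaps = {}              # element_id -> bitmap (one per unique element)
    description_frames = []   # per-frame element references for the description file
    for frame in frames:
        refs = []
        for bitmap, params in frame:
            eid = hashlib.sha1(bitmap).hexdigest()[:8]
            bitmaps.setdefault(eid, bitmap)
            refs.append({"element": eid, **params})
        description_frames.append(refs)
    return bitmaps, description_frames
```

An element appearing in many frames contributes its bitmap only once; only its per-frame parameters are recorded repeatedly, which is what keeps the source file small.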
Step 103: and acquiring the bitmap image of each animation element as the element bitmap corresponding to the animation element.
In the present embodiment, a bitmap image, also called a raster or dot-matrix image, is composed of individual points called pixels, which can be arranged and colored differently to form a pattern. The bitmap image of each animation element may be obtained by techniques known in the art. For identical animation elements, only one bitmap image is acquired.
Step 104: and generating an animation description file according to the animation elements contained in each frame image and the animation parameters of the contained animation elements in the frame image, wherein the animation description file comprises animation attributes, the animation elements contained in each frame image and the animation parameters of the contained animation elements in each frame image.
In this embodiment of the application, the animation parameters may include a position parameter, a transparency parameter, a size parameter, a layer-sequence parameter, and the like. The animation attributes describe the overall characteristics of the animation to be played and may include the total frame number of the animation, the identifier of each frame image, the FPS (frames per second, i.e., the number of pictures displayed per second), the animation size, and the like. The identifier of each frame image may include the name, playing order, etc. of the frame image.
Step 105: determining the animation description file, together with the element bitmap corresponding to each animation element, as the animation source file of the animation to be played.
In the embodiment of the application, in order to reduce the volume of the animation source file, among identical animation elements only one element bitmap is retained, so the element bitmaps included in the source file are all distinct from one another.
After the animation source file of the animation to be played is determined, file compression can be carried out on the animation source file; and transmitting the compressed animation source file to a specified address so that an animation playing end can obtain the animation source file from the specified address. The designated address mentioned here may be an address of an animation database, an address of an animation server, or an address of an animation sharing network disk, or the like.
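A hedged sketch of the compression step, using a ZIP archive as a stand-in for whatever container format an implementation might choose; the archive layout and file names (`animation.json`, `bitmaps/`) are assumptions, not specified by the patent:

```python
import io
import zipfile

def pack_source_file(description_json: bytes, bitmaps: dict) -> bytes:
    """Bundle the animation description file and the element bitmaps into one
    deflate-compressed archive, ready to upload to the designated address."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("animation.json", description_json)
        for eid, bitmap in bitmaps.items():
            zf.writestr(f"bitmaps/{eid}.png", bitmap)
    return buf.getvalue()
```

The resulting bytes can then be uploaded to an animation server, database, or shared storage, from which the playing end downloads them.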
Fig. 2a is a flowchart of another embodiment of the animation processing method of the present application, which can be used in a terminal, and includes the following steps 201-204:
step 201: the method comprises the steps of obtaining an animation source file of an animation to be played, wherein the animation source file comprises an animation description file and bitmap images of all elements, the bitmap images are bitmap images of animation elements contained in the animation to be played, and the animation description file comprises animation attributes, animation elements contained in all frame images and animation parameters of the contained animation elements in all frame images.
The terminal in this embodiment of the present application may be any terminal having an animation playing function. The animation involved may be a gift animation, a special-effect animation, or a bullet-screen animation in a live-broadcast application, or may be another type of animation in the field; the present application is not limited in this respect.
A terminal with the animation playing function can be provided with an SVGA player, and before an animation needs to be played, the animation source file of the animation to be played can be downloaded from the designated address. The downloaded animation source file is generated by the animation processing method corresponding to fig. 1. After the animation source file is downloaded, if it is a compressed file, the animation description file and each element bitmap contained in it can be obtained by decompression.
Step 202: rendering all element bitmaps to generate the animation layers of all animation elements.
In this embodiment of the application, after the animation layers of the various animation elements are generated, the same layer can be reused across different frame images of the animation to be played.
Step 203: acquiring animation playing information from the animation description file, wherein the animation playing information includes the animation attributes, the animation elements contained in each frame image, and the animation parameters of the contained animation elements in the frame image.
In this embodiment of the present application, the animation attributes and animation parameters correspond to those described in the animation processing method shown in fig. 1. The animation playing information can be read directly from the animation description file.
Step 204: combining, based on the animation playing information, the animation layers of the animation elements contained in each frame image according to the animation parameters, so as to play the animation to be played.
In this embodiment of the application, from the animation playing information one can obtain the playing order of the frame images, the animation layers contained in each frame image, the loading order of those layers, their positional relationships, and the animation parameters of each layer in each frame image. Based on this information, the layers are loaded in order and their animation parameters are set, each frame image is rendered, and the animation is played.
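The playback loop just described can be sketched as follows; the data layout and helper names are assumptions for illustration rather than the actual API of this application:

```python
# Minimal sketch of the playback loop: for each frame, load the pre-rendered
# layers in their recorded order and apply that frame's parameters to them.
def play(description, layers):
    """Render each frame by applying per-frame parameters to shared layers."""
    rendered = []
    for frame in description["frames"]:              # frames stored in playing order
        frame_state = []
        # load this frame's layers in their recorded loading (layer) order
        for elem in sorted(frame["elements"], key=lambda e: e["layer"]):
            layer = layers[elem["id"]]               # reuse the pre-rendered layer
            frame_state.append((layer, tuple(elem["position"]), elem["alpha"]))
        rendered.append(frame_state)
    return rendered

layers = {"star": "<star layer>", "ribbon": "<ribbon layer>"}
description = {"frames": [
    {"elements": [{"id": "star", "layer": 0, "position": [0, 0], "alpha": 1.0}]},
    {"elements": [{"id": "ribbon", "layer": 1, "position": [5, 5], "alpha": 0.5},
                  {"id": "star", "layer": 0, "position": [10, 0], "alpha": 1.0}]},
]}
frames = play(description, layers)
print(len(frames))      # -> 2
print(frames[1][0][0])  # -> <star layer>  (layer 0 loads before layer 1)
```

The key point the sketch illustrates is that the "star" layer object is created once and merely repositioned per frame, rather than being re-rendered from its bitmap for every frame.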
In addition, when rendering each frame image, the differences and commonalities between adjacent frame images can be obtained by comparing the playing order of the frame images, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image. Based on this comparison, only the animation elements that have changed need to be rendered, while unchanged elements are left untouched, thereby saving loading-resource overhead.
In order to save animation volume and processor overhead, a description-file mechanism is introduced: the loading order (playing order) of the frame images is recorded centrally, and for each animation element the change parameters (animation parameters) of its rendered form in the current key frame image are preset. By reading the description file and executing its content, the changing form of each animation element in each key frame image can be rendered in turn, and the frame images are invoked in their loading order to form the animation.
Because the change parameters of each animation element are recorded centrally in the description file, the resources needed to render each element's form changes can be preloaded before the first frame image is rendered. This avoids computing in real time, as each frame is rendered, which animation elements need rendering, and avoids dynamically querying and loading the resources their form changes require; the computational cost of forming the animation through real-time dynamic querying, loading, and computation is thereby eliminated. Introducing the description-file mechanism thus optimizes the process of forming the animation and reduces the computational resources it consumes.
In an optional implementation, based on the animation playing information, the animation layers of the animation elements contained in each frame image are combined according to the animation parameters through the following operations, so as to play the animation to be played:
comparing, frame by frame, the animation elements contained in each frame image and the animation parameters of those elements, to obtain comparison information between every two adjacent frame images, wherein the comparison information includes the animation elements shared by the two adjacent frame images, the change parameters between those shared elements, the elements that differ, and the hierarchical relationships among the animation elements.
And, based on the acquired comparison information, playing the next frame image by adjusting the animation-layer types of the previous frame image, the stacking order of the animation layers, the stacking position of each layer, and the other animation parameters of each layer.
Here, each type of animation layer corresponds to one type of animation element, and the stacking order and stacking position of each layer are set by the designer, according to the intended presentation effect, when the animation is designed. The other animation parameters may include the transparency, deformation amount, displacement, container size, and so on of the animation layers.
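The frame-by-frame comparison described in this optional implementation can be sketched as follows; the element identifiers and parameter names are illustrative assumptions:

```python
# Sketch of comparing two adjacent frames: classify each element as changed,
# newly added, or removed, so only the changed parts need adjusting.
def compare_frames(prev, curr):
    """Return element ids that changed, appeared, or disappeared between frames."""
    prev_ids, curr_ids = set(prev), set(curr)
    return {
        "changed": sorted(i for i in prev_ids & curr_ids if prev[i] != curr[i]),
        "added": sorted(curr_ids - prev_ids),
        "removed": sorted(prev_ids - curr_ids),
    }

frame1 = {"star": {"position": [0, 0], "alpha": 1.0}}
frame2 = {"star": {"position": [10, 0], "alpha": 1.0},
          "ribbon": {"position": [5, 5], "alpha": 0.5}}
diff = compare_frames(frame1, frame2)
print(diff["added"])    # -> ['ribbon']
print(diff["changed"])  # -> ['star']
```

Playing the next frame then amounts to applying only these deltas — adding the "ribbon" layer and moving the "star" layer — rather than rebuilding every layer.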
In addition, during the loading process (rendering process) of the animation, some animation elements are displayed for the first time in the first frame image, while others do not appear until, for example, the tenth frame image. As shown in fig. 2b, for an animation element A1 that must be displayed in the first frame image, its initial state in the first frame image is visible; for animation elements B1 and C1 that are first rendered in images after the first frame image, their initial form in the first frame image can be invisible. When loading reaches a frame image in which such an element is preset to be displayed for the first time, the element's animation parameters are set so that its form becomes visible. As shown in fig. 2c, animation element A1 (shown as A2) remains in a visible state; animation element B1 is made visible in the image where it first appears (shown as B2) by setting its animation parameters; and animation element C1 (shown as C2) remains in an invisible state. In this way, all animation elements are pre-rendered before the first frame image of the animation is rendered, and the elements not displayed in the first frame image are simply set to invisible.
Considering only the resource overhead of rendering animation elements, statically pre-rendering them before the first frame image is displayed costs significantly less than dynamically rendering them while each frame image is rendered. Likewise, compared with re-rendering an animation element, merely changing its visible state markedly reduces the loading overhead of the animation.
In general, the visible state of an animation element can be changed by setting parameters of the element such as the position parameter, transparency, container size (size parameter), deformation amount, displacement amount, and layer order:
Setting the position parameter of an animation element controls the initial position at which the element appears in each frame image.
Setting the transparency of an animation element controls whether the element is visible in the sequence frames: at a transparency of 100% the element is completely visible; at 0% it is completely invisible; between 100% and 0% the element shows varying degrees of a see-through effect.
Setting the transparency can also control the color of an animation element: changing the transparency of the element's red, yellow, and blue primary colors realizes a process similar to color mixing and achieves the effect of changing the element's color.
Setting the deformation amount of an animation element controls the change in the element's form.
Setting the container size of an animation element controls the element's display range in the sequence frames.
Setting the displacement amount of an animation element controls the element's moving distance relative to its initial position.
Setting the layer order of animation elements makes it possible to distinguish and control identical animation elements within the same frame image (that is, the same animation element placed on different layers in one sequence frame), and to control the mutual occlusion relationships of all animation elements in each frame image.
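The pre-rendering and visibility toggling described above can be sketched as follows; the class and function names, and the convention that transparency is expressed as an `alpha` value, are illustrative assumptions:

```python
# Sketch of pre-rendering all elements before the first frame and toggling
# visibility via the transparency (alpha) parameter. Element names A1/B1/C1
# follow fig. 2b/2c of the text; everything else is an assumption.
class Element:
    def __init__(self, name, first_frame):
        self.name = name
        self.first_frame = first_frame  # frame index where it first appears
        self.alpha = 0.0                # pre-rendered, initially invisible

def prerender(elements):
    """Before frame 0, render every element; hide those appearing later."""
    for e in elements:
        e.alpha = 1.0 if e.first_frame == 0 else 0.0

def show_frame(elements, frame_index):
    """When playback reaches an element's first frame, make it visible."""
    for e in elements:
        if e.first_frame == frame_index:
            e.alpha = 1.0

elems = [Element("A1", 0), Element("B1", 3), Element("C1", 9)]
prerender(elems)
print([e.alpha for e in elems])  # -> [1.0, 0.0, 0.0]
show_frame(elems, 3)
print([e.alpha for e in elems])  # -> [1.0, 1.0, 0.0]
```

Revealing B1 at frame 3 is a single parameter write on an already-rendered layer, which is the cheap operation the preceding paragraphs contrast with re-rendering the element.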
From the above embodiment it can be seen that: sequence frames in the SVG format of the animation to be played are acquired; the animation object contained in each frame image is split into a plurality of animation elements; a bitmap image of each animation element is acquired; an animation description file is generated that describes the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image; and finally the animation description file and the bitmap images of the animation elements are determined as the animation source file of the animation to be played. Because the animation source file contains only the bitmap images of the animation elements and the animation description file, it occupies far less space than the original sequence frames, in which the bitmap image of every animation element is repeated in each frame image. The volume of the animation file can therefore be effectively reduced, overcoming the prior-art defect of oversized animation files.
In addition, when the animation source file is processed to play the animation, the animation source file comprising the animation description file and the bitmap images of the animation elements is acquired; all the bitmap images are rendered to generate the animation layers of the animation elements; animation playing information is acquired from the animation description file; and, based on that information, the animation layers of the elements contained in each frame image are combined according to the animation parameters to play the animation. Thus, when different frames of the animation are played, the bitmap image of the same animation element need not be re-rendered repeatedly, so the animation can be played quickly while the consumption of processor and memory is effectively reduced.
When the method is applied in the live-broadcast field, the CPU occupancy while playing an animation is only half that of the original playing scheme, the GPU occupancy does not increase significantly, and the memory occupancy is likewise only half that of the original scheme. The volume of the animation source file is only 10% of that of the original playing scheme.
Corresponding to the embodiment of the animation processing method, the application also provides an embodiment of the animation processing device.
Referring to fig. 3, fig. 3 is a block diagram of an embodiment of an animation processing apparatus according to the present application, which may include: a sequence frame acquisition module 310, an element division module 320, a bitmap acquisition module 330, a file generation module 340, and a source file determination module 350.
The sequence frame acquiring module 310 is configured to acquire a sequence frame in an SVG format of a to-be-played animation.
An element dividing module 320, configured to divide the animation object included in each frame image in the sequence frame into multiple animation elements.
The bitmap obtaining module 330 is configured to obtain a bitmap image of each animation element as an element bitmap corresponding to the animation element.
The file generating module 340 is configured to generate an animation description file according to the animation elements included in each frame of image and the animation parameters of the included animation elements in the frame of image, where the animation description file includes animation attributes, the animation elements included in each frame of image, and the animation parameters of the included animation elements in each frame of image.
A source file determining module 350, configured to determine the animation description file and the element bitmap corresponding to each animation element as the animation source file of the animation to be played.
In an optional implementation, the sequence frame acquiring module 310 may further include (not shown in fig. 3):
the first acquisition module is used for converting the animation to be played in the Flash format into the sequence frame in the SVG format through the Flash editor.
Or,
and the second acquisition module is used for converting the animation to be played in the AE format into the sequence frame in the SVG format through the BodyMovin.
In another alternative implementation, the element division module 320 may further include (not shown in fig. 3):
and the frame-by-frame comparison module is used for comparing each sequence frame by frame to obtain comparison information between each two adjacent frames, wherein the comparison information comprises the same animation object between the adjacent frames, the change parameter between the same animation object, different animation objects and the position relation between the different animation objects.
And the element division submodule is configured to take, based on the comparison information, the unchanged vector part and the changed vector part of each animation object as different animation elements, respectively.
In another optional implementation manner, the animation parameters include a position parameter, a transparency parameter, a size parameter, and a layer sequence parameter.
In another optional implementation manner, the animation attributes include a total frame number of the animation, an identification of each frame image, an FPS, and an animation size.
In another optional implementation manner, the animation processing apparatus according to this embodiment may further include (not shown in fig. 3):
and the file compression module is used for carrying out file compression on the animation source file.
And the file transmission module is used for transmitting the compressed animation source file to a specified address so that the animation playing end can obtain the animation source file from the specified address.
Referring to fig. 4, fig. 4 is a block diagram of another embodiment of an animation processing apparatus according to the present application, which may include: a source file obtaining module 410, a bitmap rendering module 420, an information obtaining module 430, and an animation playing module 440.
The source file obtaining module 410 is configured to obtain an animation source file of an animation to be played, where the animation source file includes an animation description file and each element bitmap, the element bitmap is a bitmap image of an animation element included in the animation to be played, and the animation description file includes an animation attribute, an animation element included in each frame image, and an animation parameter of an animation element included in each frame image.
And the bitmap rendering module 420 is configured to render all the element bitmaps to generate animation layers of various animation elements.
The information obtaining module 430 is configured to obtain animation playing information from the animation description file, wherein the animation playing information includes the animation attributes, the animation elements contained in each frame image, and the animation parameters of the contained animation elements in the frame image.
And the animation playing module 440 is configured to combine animation layers of animation elements included in each frame of image according to the animation parameters based on the animation playing information, so as to play the animation to be played.
In an alternative implementation, the animation playing module 440 may further include (not shown in fig. 4):
the comparison information acquisition module is used for comparing animation elements contained in each frame image and animation parameters of the contained animation elements in each frame image frame by frame to acquire comparison information between each two adjacent frame images, wherein the comparison information comprises the same animation elements between the adjacent frame images, change parameters between the same animation elements, different animation elements and the hierarchical relationship between each two animation elements.
And the layer adjusting module is used for realizing the playing of the next frame of image by adjusting the animation layer type of the previous frame of image, the superposition sequence of each animation layer, the superposition position of each animation layer and other animation parameters of each animation layer based on the acquired comparison information.
The implementation process of the functions and actions of each unit (or module) in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are only illustrative, and the units or modules described as separate parts may or may not be physically separate, and parts displayed as units or modules may or may not be physical units or modules, may be located in one position, or may be distributed on multiple network units or modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The embodiment of the animation processing device can be applied to electronic equipment. The implementation may be realized by a computer chip or entity, or by a product with a certain functionality. In a typical implementation, the electronic device is a computer, which may be embodied in the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, internet television, smart car, smart home device, or a combination of any of these devices.
The device embodiments may be implemented by software, by hardware, or by a combination of the two. Taking software implementation as an example, as a logical device, the processor of the electronic device reads the corresponding computer program instructions from a readable medium such as a non-volatile memory into internal memory and executes them. In terms of hardware, fig. 5 shows the hardware structure of an electronic device in which the animation processing apparatus is located; besides the processor, memory, network interface, and non-volatile memory shown in fig. 5, the electronic device in the embodiment may further include other hardware according to its actual function, which is not described again here. The memory of the electronic device may store executable instructions; the processor may be coupled to the memory so as to read the program instructions stored therein and, in response, perform the following operations: acquiring sequence frames in the SVG format of an animation to be played; dividing the animation object contained in each frame image of the sequence frames into a plurality of animation elements; acquiring a bitmap image of each animation element as the element bitmap corresponding to that element; generating an animation description file according to the animation elements contained in each frame image and the animation parameters of those elements in the frame image, the animation description file including the animation attributes, the animation elements contained in each frame image, and the animation parameters of the contained elements in each frame image; and determining the animation description file and the element bitmap corresponding to each animation element as the animation source file of the animation to be played.
In another embodiment, the memory of an electronic device may store executable instructions; the processor may be coupled to the memory so as to read the program instructions stored therein and, in response, perform the following operations: acquiring the animation source file of an animation to be played, the animation source file comprising an animation description file and element bitmaps, the element bitmaps being bitmap images of the animation elements contained in the animation, and the animation description file including the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in each frame image; rendering all the element bitmaps to generate the animation layers of the animation elements; acquiring animation playing information from the animation description file, the animation playing information including the animation attributes, the animation elements contained in each frame image, and the animation parameters of those elements in the frame image; and, based on the animation playing information, combining the animation layers of the animation elements contained in each frame image according to the animation parameters to play the animation to be played.
In other embodiments, the operations performed by the processor may refer to the description related to the above method embodiments, which is not repeated herein.
The above description is only a preferred embodiment of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. An animation processing method, characterized by comprising the steps of:
acquiring an animation source file of an animation to be played, wherein the animation source file comprises an animation description file and bitmap images of all elements, the bitmap images are bitmap images of animation elements contained in the animation to be played, and the animation description file comprises animation attributes, animation elements contained in all frame images and animation parameters of the animation elements contained in all frame images;
the animation element is determined based on the following steps: acquiring a sequence frame of the SVG format of the animation to be played; comparing the sequence frames to be played frame by frame to obtain comparison information between adjacent frames, wherein the comparison information comprises the same animation objects between the adjacent frames, the change parameters between the same animation objects, different animation objects and the position relationship between the different animation objects; and on the basis of the comparison information, respectively taking the part of each animation object, of which the vector has not changed, and the part of each animation object, of which the vector has changed, as different animation elements;
rendering all element bitmaps to generate animation layers of all animation elements;
acquiring animation playing information from the animation description file, wherein the animation playing information comprises a playing sequence of each frame of image, an animation layer contained in each frame of image, a loading sequence of the animation layer contained in each frame of image and animation parameters of the animation layer contained in each frame of image in the frame of image;
based on the animation playing information, combining animation layers of animation elements contained in each frame of image according to the animation parameters to realize playing of the animation to be played, and the method specifically comprises the following steps:
comparing animation elements contained in each frame image and animation parameters of the contained animation elements in each frame image frame by frame to obtain comparison information between each two adjacent frame images, wherein the comparison information comprises the same animation elements between the adjacent frame images, change parameters between the same animation elements, different animation elements and hierarchical relations among the animation elements;
based on the obtained comparison information, playing of the next frame of image is realized by adjusting the animation layer type of the previous frame of image, the superposition sequence of each animation layer, the superposition position of each animation layer and other animation parameters of each animation layer; wherein the other animation parameters at least comprise the transparency of the animation layer.
2. The method of claim 1, wherein the animation source file is determined based on:
acquiring a bitmap image of each animation element as an element bitmap corresponding to the animation element;
generating the animation description file according to the animation elements contained in each frame of image and the animation parameters of the contained animation elements in the frame of image;
determining the animation description file and the element bitmap corresponding to each animation element as the animation source file of the animation to be played.
3. The method of claim 1, wherein said obtaining a sequence of frames in SVG format for a motion picture to be played comprises:
converting the animation to be played in the Flash format into a sequence frame in the SVG format through a Flash editor;
or,
and converting the animation to be played in the AE format into the sequence frames in the SVG format through the BodyMovin.
4. The method of claim 1, wherein the animation parameters comprise a position parameter, a transparency parameter, a size parameter, and a layer order parameter.
5. The method of claim 1, wherein the animation properties comprise a total frame number of the animation, an identification of each frame image, an FPS, an animation size.
6. The method according to any one of claims 1 to 5, characterized in that the method further comprises the steps of:
performing file compression on the animation source file;
and transmitting the compressed animation source file to a specified address so that the animation playing end can obtain the animation source file from the specified address.
7. An animation processing apparatus, comprising:
the source file acquisition module is used for acquiring an animation source file of an animation to be played, wherein the animation source file comprises an animation description file and bitmap images of all elements, the bitmap images are bitmap images of animation elements contained in the animation to be played, and the animation description file comprises animation attributes, animation elements contained in all frame images and animation parameters of the contained animation elements in all frame images;
the source file acquisition module comprises:
the sequence frame acquisition module is used for acquiring the sequence frame of the animation to be played in the SVG format;
the element dividing module is used for comparing the sequence frames to be played frame by frame to obtain comparison information between every two adjacent frames, wherein the comparison information comprises the same animation objects between the adjacent frames, change parameters between the same animation objects, different animation objects and position relations between the different animation objects; and on the basis of the comparison information, respectively taking the part of each animation object, of which the vector has not changed, and the part of each animation object, of which the vector has changed, as different animation elements;
the bitmap rendering module is used for rendering all element bitmaps to generate animation layers of various animation elements;
the information acquisition module is used for acquiring animation playing information from the animation description file, wherein the animation playing information comprises a playing sequence of each frame of image, an animation layer contained in each frame of image, a loading sequence of the animation layer contained in each frame of image and animation parameters of the animation layer contained in each frame of image in the frame of image;
the animation playing module is used for combining the animation layers of the animation elements contained in each frame of image according to the animation parameters based on the animation playing information to realize the playing of the animation to be played;
the animation playing module comprises:
the comparison information acquisition module is used for acquiring comparison information between every two adjacent frame images by comparing animation elements contained in every frame image and animation parameters of the contained animation elements in every frame image frame by frame, wherein the comparison information comprises the same animation elements between the adjacent frame images, change parameters between the same animation elements, different animation elements and hierarchical relations among all the animation elements;
the layer adjusting module is used for realizing the playing of the next frame of image by adjusting, based on the acquired comparison information, the set of animation layers of the previous frame of image, the superposition sequence of the animation layers, the superposition position of each animation layer, and the other animation parameters of each animation layer; wherein the other animation parameters at least comprise the transparency of the animation layer.
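The frame-by-frame comparison and layer adjustment described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `Layer` fields and the `diff_frames`/`apply_diff` helpers are assumed names chosen to mirror the claim language (shared elements, change parameters, differing elements).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Layer:
    element_id: str   # which animation element's bitmap this layer draws
    x: float          # superposition position
    y: float
    z_order: int      # superposition (stacking) order among layers
    opacity: float    # transparency of the animation layer, 0.0 - 1.0

def diff_frames(prev, curr):
    """Compare two adjacent frames (dicts keyed by element id) and return
    the comparison information between them."""
    shared = prev.keys() & curr.keys()
    return {
        # shared elements whose animation parameters changed between frames
        "changed": {eid: curr[eid] for eid in shared if prev[eid] != curr[eid]},
        # elements that only appear in the next frame
        "added": {eid: curr[eid] for eid in curr.keys() - prev.keys()},
        # elements that disappear in the next frame
        "removed": sorted(prev.keys() - curr.keys()),
    }

def apply_diff(frame, diff):
    """Realize playing of the next frame by adjusting the previous frame's
    layer set and layer parameters instead of rebuilding it from scratch."""
    out = {eid: layer for eid, layer in frame.items()
           if eid not in diff["removed"]}
    out.update(diff["changed"])
    out.update(diff["added"])
    return out
```

By construction, `apply_diff(prev, diff_frames(prev, curr))` reproduces `curr`, which is the point of the claim: only the delta between adjacent frames has to be processed.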
8. The apparatus of claim 7, wherein the source file obtaining module comprises:
the bitmap acquisition module is used for acquiring a bitmap image of each animation element as an element bitmap corresponding to the animation element;
the file generation module is used for generating the animation description file according to the animation elements contained in each frame of image and the animation parameters of the contained animation elements in the frame of image;
a source file determining module, configured to take the animation description file and the element bitmap corresponding to each animation element as the animation source file of the animation to be played.
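The animation description file that claim 8's file generation module produces can be pictured as a small JSON document recording, per frame image, the contained animation elements and their parameters. The schema and the `build_description` helper below are assumptions for illustration only; the patent does not specify a concrete file format.

```python
import json

def build_description(frames, fps=25, size=(640, 360)):
    """frames: one dict per frame image, in playing order, mapping each
    contained animation element's id to its parameters in that frame."""
    return json.dumps({
        "total_frames": len(frames),   # total frame number of the animation
        "fps": fps,                    # frame rate
        "size": list(size),            # animation size
        "frames": [
            {
                "id": i,               # identification of the frame image
                # list order doubles as the layers' loading sequence
                "layers": [{"element": eid, **params}
                           for eid, params in layers.items()],
            }
            for i, layers in enumerate(frames)
        ],
    })
```

The playing end would parse this file back to recover the playing sequence, the layers per frame, their loading sequence, and their per-frame animation parameters.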
9. The apparatus of claim 7, wherein the sequence frame acquisition module comprises:
the first acquisition module is used for converting the animation to be played in the Flash format into a sequence frame in the SVG format through a Flash editor;
or,
and the second acquisition module is used for converting the animation to be played in the AE format into the sequence frame in the SVG format through the BodyMovin plug-in.
10. The apparatus of claim 7, wherein the animation parameters comprise a position parameter, a transparency parameter, a size parameter, and a layer order parameter.
11. The apparatus of claim 7, wherein the animation properties comprise a total frame number of the animation, an identification of each frame image, a frame rate (FPS), and an animation size.
12. The apparatus of any of claims 7 to 11, further comprising:
the file compression module is used for carrying out file compression on the animation source file;
and the file transmission module is used for transmitting the compressed animation source file to a specified address so that the animation playing end can obtain the animation source file from the specified address.
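A minimal sketch of claim 12's file compression step, using zlib as one possible codec; the helper names are hypothetical, and the transfer of the compressed file to the specified address (e.g. an HTTP upload) is omitted here.

```python
import zlib

def compress_source(data: bytes, level: int = 9) -> bytes:
    """Compress the packed animation source file before transfer."""
    return zlib.compress(data, level)

def decompress_source(blob: bytes) -> bytes:
    """The animation playing end restores the source file after fetching
    it from the specified address."""
    return zlib.decompress(blob)
```

Because the description file is text and element bitmaps repeat across frames only once, the compressed source file is typically much smaller than a per-frame video of the same animation.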
CN201611201647.6A 2016-12-22 2016-12-22 Animation processing method and device Active CN106611435B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202211667831.5A CN115908644A (en) 2016-12-22 2016-12-22 Animation processing method and device
CN202211688081.XA CN115830190A (en) 2016-12-22 2016-12-22 Animation processing method and device
CN201611201647.6A CN106611435B (en) 2016-12-22 2016-12-22 Animation processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611201647.6A CN106611435B (en) 2016-12-22 2016-12-22 Animation processing method and device

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202211667831.5A Division CN115908644A (en) 2016-12-22 2016-12-22 Animation processing method and device
CN202211688081.XA Division CN115830190A (en) 2016-12-22 2016-12-22 Animation processing method and device

Publications (2)

Publication Number Publication Date
CN106611435A CN106611435A (en) 2017-05-03
CN106611435B true CN106611435B (en) 2022-11-11

Family

ID=58636707

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202211688081.XA Pending CN115830190A (en) 2016-12-22 2016-12-22 Animation processing method and device
CN201611201647.6A Active CN106611435B (en) 2016-12-22 2016-12-22 Animation processing method and device
CN202211667831.5A Pending CN115908644A (en) 2016-12-22 2016-12-22 Animation processing method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202211688081.XA Pending CN115830190A (en) 2016-12-22 2016-12-22 Animation processing method and device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211667831.5A Pending CN115908644A (en) 2016-12-22 2016-12-22 Animation processing method and device

Country Status (1)

Country Link
CN (3) CN115830190A (en)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107230244A (en) * 2017-06-08 2017-10-03 深圳第七大道科技有限公司 The generation method and rendering system of a kind of animation file
CN109242934B (en) * 2017-07-06 2023-09-05 浙江天猫技术有限公司 Animation code generation method and equipment
CN107403460B (en) * 2017-07-11 2021-07-06 北京龙之心科技有限公司 Animation generation method and device
CN109389661B (en) * 2017-08-04 2024-03-01 阿里健康信息技术有限公司 Animation file conversion method and device
CN108242070A (en) * 2017-10-09 2018-07-03 北京车和家信息技术有限公司 A kind of image drawing method, image plotting device and computer equipment
CN108364335A (en) * 2018-01-23 2018-08-03 腾讯科技(深圳)有限公司 A kind of animation method for drafting and device
CN108520491A (en) * 2018-04-24 2018-09-11 上海仪电汽车电子系统有限公司 Full frame boot animation driving method based on QNX operating systems
CN110473275B (en) * 2018-05-09 2023-05-30 鸿合科技股份有限公司 Frame animation realization method and device under android system and electronic equipment
CN108810132B (en) * 2018-06-07 2022-02-11 腾讯科技(深圳)有限公司 Animation display method, device, terminal, server and storage medium
CN108881997A (en) * 2018-07-24 2018-11-23 北京奇艺世纪科技有限公司 Animation file generates and playback method, device and system
CN109147016A (en) * 2018-07-26 2019-01-04 乐蜜有限公司 The dynamic effect screen generating method of one kind, device, electronic equipment and storage medium
CN109636884A (en) * 2018-10-25 2019-04-16 阿里巴巴集团控股有限公司 Animation processing method, device and equipment
CN110090437A (en) * 2019-04-19 2019-08-06 腾讯科技(深圳)有限公司 Video acquiring method, device, electronic equipment and storage medium
CN111968197A (en) * 2019-05-20 2020-11-20 北京字节跳动网络技术有限公司 Dynamic image generation method, device, electronic equipment and computer readable storage medium
CN110213638B (en) * 2019-06-05 2021-10-08 北京达佳互联信息技术有限公司 Animation display method, device, terminal and storage medium
CN112069042B (en) * 2019-06-11 2023-04-14 腾讯科技(深圳)有限公司 Animation performance monitoring method and device, storage medium and computer equipment
CN110475147A (en) * 2019-07-29 2019-11-19 阿里巴巴集团控股有限公司 Animation playing method, device, terminal and server
CN110351599B (en) * 2019-07-29 2021-12-21 创新先进技术有限公司 Animation file playing method and device and terminal equipment
CN112435319A (en) * 2019-08-26 2021-03-02 上海卷石文化传媒有限公司 Two-dimensional animation generating system based on computer processing
CN110662105A (en) * 2019-10-16 2020-01-07 广州华多网络科技有限公司 Animation file generation method and device and storage medium
CN110784739A (en) * 2019-10-25 2020-02-11 稿定(厦门)科技有限公司 Video synthesis method and device based on AE
CN110990601A (en) * 2019-11-13 2020-04-10 珠海格力电器股份有限公司 Image processing method and device
CN111292387B (en) * 2020-01-16 2023-08-29 广州小鹏汽车科技有限公司 Dynamic picture loading method and device, storage medium and terminal equipment
CN111309227B (en) * 2020-02-03 2022-05-31 联想(北京)有限公司 Animation production method and equipment and computer readable storage medium
CN111932660A (en) * 2020-08-11 2020-11-13 深圳市前海手绘科技文化有限公司 Hand-drawn video production method based on AE (Enterprise edition) file
CN112312043A (en) * 2020-10-20 2021-02-02 深圳市前海手绘科技文化有限公司 Optimization method and device for deriving animation video
CN112689168A (en) * 2020-12-09 2021-04-20 四川金熊猫新媒体有限公司 Dynamic effect processing method, dynamic effect display method and dynamic effect processing device
CN113409427B (en) * 2021-07-21 2024-04-19 北京达佳互联信息技术有限公司 Animation playing method and device, electronic equipment and computer readable storage medium
CN113992995A (en) * 2021-10-22 2022-01-28 广州博冠信息科技有限公司 Virtual gift sending method and device, storage medium and electronic equipment
CN113885345B (en) * 2021-10-29 2024-03-19 广州市技师学院(广州市高级技工学校、广州市高级职业技术培训学院、广州市农业干部学校) Interaction method, device and equipment based on intelligent home simulation control system
CN113986438B (en) * 2021-10-30 2024-01-30 深圳市快易典教育科技有限公司 Animation loading method, system, device and computer readable storage medium
CN115484488B (en) * 2022-08-23 2023-08-04 惠州拓邦电气技术有限公司 Animation control method and device and electric appliance

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102368247A (en) * 2011-09-16 2012-03-07 杭州典能科技有限公司 Method for executing SWF (Small Web Format) file on handheld terminal
JP2012133640A (en) * 2010-12-22 2012-07-12 Avix Inc Execution file for creating moving image work file by editing favorite moving image while viewing a model moving image on user's computer, and usage thereof
CN105335410A (en) * 2014-07-31 2016-02-17 优视科技有限公司 Synthesis rendering acceleration based webpage updating method and apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100428279C (en) * 2006-11-10 2008-10-22 北京金山软件有限公司 Cartoon realizing method and cartoon drawing system thereof
CN101470893B (en) * 2007-12-26 2011-01-19 中国科学院声学研究所 Vector graphic display acceleration method based on bitmap caching
CN102117489A (en) * 2010-01-06 2011-07-06 深圳市网域计算机网络有限公司 Animation playing method and device
CN104392474B (en) * 2014-06-30 2018-04-24 贵阳朗玛信息技术股份有限公司 A kind of method and device for generating, showing animation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012133640A (en) * 2010-12-22 2012-07-12 Avix Inc Execution file for creating moving image work file by editing favorite moving image while viewing a model moving image on user's computer, and usage thereof
CN102368247A (en) * 2011-09-16 2012-03-07 杭州典能科技有限公司 Method for executing SWF (Small Web Format) file on handheld terminal
CN105335410A (en) * 2014-07-31 2016-02-17 优视科技有限公司 Synthesis rendering acceleration based webpage updating method and apparatus

Also Published As

Publication number Publication date
CN115830190A (en) 2023-03-21
CN115908644A (en) 2023-04-04
CN106611435A (en) 2017-05-03

Similar Documents

Publication Publication Date Title
CN106611435B (en) Animation processing method and device
CN111193876B (en) Method and device for adding special effect in video
CN108959392B (en) Method, device and equipment for displaying rich text on 3D model
US20100060652A1 (en) Graphics rendering system
CN113457160B (en) Data processing method, device, electronic equipment and computer readable storage medium
CN111899322B (en) Video processing method, animation rendering SDK, equipment and computer storage medium
CN111161392B (en) Video generation method and device and computer system
CN112804459A (en) Image display method and device based on virtual camera, storage medium and electronic equipment
WO2021135320A1 (en) Video generation method and apparatus, and computer system
CN111899155A (en) Video processing method, video processing device, computer equipment and storage medium
CN112073794B (en) Animation processing method, animation processing device, computer readable storage medium and computer equipment
CN114051734A (en) Method and device for decoding three-dimensional scene
WO2020258907A1 (en) Virtual article generation method, apparatus and device
CN107767437B (en) Multilayer mixed asynchronous rendering method
CN112804460A (en) Image processing method and device based on virtual camera, storage medium and electronic equipment
CN115082609A (en) Image rendering method and device, storage medium and electronic equipment
CN114491352A (en) Model loading method and device, electronic equipment and computer readable storage medium
CN111064986B (en) Animation data sending method with transparency, animation data playing method and computer equipment
CN111260760A (en) Image processing method, image processing device, electronic equipment and storage medium
WO2009011657A1 (en) Methods of providing graphics data and displaying graphics data
CN112954452B (en) Video generation method, device, terminal and storage medium
JP2004201004A (en) Three-dimensional video display device, program and recording medium
WO2024087971A1 (en) Method and apparatus for image processing, and storage medium
US20240009560A1 (en) 3D Image Implementation
WO2024051394A1 (en) Video processing method and apparatus, electronic device, computer-readable storage medium, and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210108

Address after: 511442 3108, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 511442 24 floors, B-1 Building, Wanda Commercial Square North District, Wanbo Business District, 79 Wanbo Second Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Applicant before: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant