CN113496537A - Animation playing method and device and server - Google Patents


Info

Publication number
CN113496537A
Authority
CN
China
Prior art keywords
frame
bitmap
difference
image
animation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110768889.8A
Other languages
Chinese (zh)
Other versions
CN113496537B (en)
Inventor
杨泽伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110768889.8A
Publication of CN113496537A
Application granted
Publication of CN113496537B
Legal status: Active (Current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/80: 2D [Two Dimensional] animation, e.g. using sprites

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an animation playing method, an animation playing device, and a server. The method includes: acquiring a frame-by-frame animation to be played, and determining a target frame bitmap from the plurality of frame bitmaps contained in the frame-by-frame animation; generating a background image and a difference image based on the portion of each frame bitmap that differs from the target frame bitmap, where the background image contains the non-difference portions shared by the frame bitmaps and the difference image contains each frame bitmap's difference portion relative to the target frame bitmap; and superimposing the background image and the difference image and controlling the difference image to move relative to the background image, so that as the difference image moves, the non-difference portion combines with each difference portion in turn, playing the animation frame by frame. The invention can effectively mitigate the front-end performance problems of playing Web frame-by-frame animations.

Description

Animation playing method and device and server
Technical Field
The invention relates to the technical field of rendering, and in particular to an animation playing method, an animation playing device, and a server.
Background
Web (World Wide Web) frame-by-frame animation has in recent years been widely used on large Web sites as a compact and elegant animation scheme: it is smaller than video and crisper than a GIF (Graphics Interchange Format), earning it a place in the visual effects of Web pages. However, when a Web frame-by-frame animation plays for a long time or covers too large an area, the number or size of the frame bitmaps it requires grows by orders of magnitude, causing many front-end performance problems, such as excessive consumption of network bandwidth, significant loss of page rendering performance, JS (JavaScript) thread blocking, and a large packaged size for the Web frame-by-frame animation.
Disclosure of Invention
In view of the above, the present invention provides an animation playing method, an animation playing device, and a server, which can effectively mitigate the front-end performance problems that arise when a Web frame-by-frame animation is played.
In a first aspect, an embodiment of the present invention provides an animation playing method, including: acquiring a frame-by-frame animation to be played, and determining a target frame bitmap from a plurality of frame bitmaps contained in the frame-by-frame animation; generating a background image and a difference image based on the difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation, wherein the background image comprises the non-difference portions between the frame bitmaps, and the difference image comprises the difference portion of each frame bitmap relative to the target frame bitmap; and superimposing the background image and the difference image, and controlling the difference image to move relative to the background image, so as to combine the non-difference portion with each difference portion in turn during the movement of the difference image, thereby playing the frame-by-frame animation.
In one embodiment, the step of generating a background image and a difference image based on a difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation respectively includes: comparing each frame bitmap in the frame-by-frame animation with the target frame bitmap, determining a non-difference part between each frame bitmap, and generating a background image based on the non-difference part; comparing each frame bitmap in the frame-by-frame animation with the background image, determining the difference part of each frame bitmap relative to the background image, and generating a difference image based on the difference part corresponding to each frame bitmap.
In one embodiment, the step of comparing each frame bitmap in the frame-by-frame animation with the target frame bitmap, determining a non-difference portion between each frame bitmap, and generating a background image based on the non-difference portion includes: comparing the first frame bitmap with the target frame bitmap in the frame-by-frame animation, and adjusting pixel data of the target frame bitmap based on a non-difference part of the first frame bitmap relative to the target frame bitmap to obtain an intermediate reference picture corresponding to the first frame bitmap; comparing the frame bitmap with an intermediate reference map corresponding to a previous frame bitmap based on the bitmap sequence of the frame-by-frame animation, and adjusting the pixel data of the intermediate reference map corresponding to the previous frame bitmap based on the non-difference part of the frame bitmap relative to the intermediate reference map corresponding to the previous frame bitmap to obtain the intermediate reference map corresponding to the frame bitmap; and determining the intermediate reference image corresponding to the last frame bitmap in the frame-by-frame animation as a background image.
In one embodiment, the step of adjusting the pixel data of the target frame bitmap based on the non-difference portion of the first frame bitmap relative to the target frame bitmap to obtain an intermediate reference map corresponding to the first frame bitmap includes: and replacing the pixel data of the pixel points except the non-difference part in the target frame bitmap by using the designated numerical value to obtain an intermediate reference picture corresponding to the first frame bitmap.
In one embodiment, the step of comparing each frame bitmap in the frame-by-frame animation with the background image and determining a difference part of each frame bitmap relative to the background image includes: for each frame bitmap in the frame-by-frame animation, converting the frame bitmap into a first file array, and converting the background image into a second file array; the first file array comprises pixel data of each pixel point in the frame bitmap, and the second file array comprises pixel data of each pixel point in the background image; judging whether the pixel data corresponding to the pixel points at the same position in the frame bitmap and the background image are consistent or not; if not, determining that the pixel point is positioned at the difference part of the frame bitmap relative to the background image.
In one embodiment, the step of converting the frame bitmap into a first file array and the step of converting the background image into a second file array comprises: converting the frame bitmap and the background image into text objects; and converting the text object corresponding to the frame bitmap into a first file array by using a first specified function, and converting the text object corresponding to the background image into a second file array by using the first specified function.
In one embodiment, the step of generating a difference image based on the difference portion corresponding to each frame bitmap includes: for each frame bitmap in the frame-by-frame animation, generating a difference sub-image corresponding to the frame bitmap based on a difference part corresponding to the frame bitmap; and splicing the difference sub-images corresponding to each frame bitmap according to the bitmap sequence of the frame-by-frame animation to obtain a difference image.
In one embodiment, the step of generating the difference sub-image corresponding to the frame bitmap based on the difference portion corresponding to the frame bitmap includes: and replacing the pixel data of the pixel points except the difference part in the frame bitmap by using the designated numerical value to obtain a difference sub-image corresponding to the frame bitmap.
In one embodiment, the step of controlling the difference image to move relative to the background image comprises: calling a second specified function; wherein the second specified function comprises a step function; controlling the difference image to move relative to the background image by the second specified function.
In one embodiment, the step of controlling the difference image to move relative to the background image further comprises: determining the image size of the difference sub-image as a moving unit; and controlling the difference image to move relative to the background image according to the movement unit.
In one embodiment, the image size of the difference sub-image corresponds to the image size of the background image.
In one embodiment, the step of determining a target frame bitmap from a plurality of frame bitmaps included in the frame-by-frame animation includes: and determining a first frame bitmap in the frame-by-frame animation as a target frame bitmap according to the bitmap sequence of the frame-by-frame animation.
In a second aspect, an embodiment of the present invention further provides an animation playing apparatus, including: a target determination module, configured to acquire the frame-by-frame animation to be played and determine a target frame bitmap from a plurality of frame bitmaps contained in the frame-by-frame animation; an image generation module, configured to generate a background image and a difference image based on the difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation, wherein the background image comprises the non-difference portions between the frame bitmaps, and the difference image comprises the difference portion of each frame bitmap relative to the target frame bitmap; and an animation playing module, configured to superimpose the background image and the difference image and control the difference image to move relative to the background image, so as to combine the non-difference portion with each difference portion in turn during the movement of the difference image, thereby playing the frame-by-frame animation.
In a third aspect, an embodiment of the present invention further provides a server, including a processor and a memory; the memory has stored thereon a computer program which, when executed by the processor, performs the method of any one of the aspects as provided in the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer storage medium for storing computer software instructions for use in any one of the methods provided in the first aspect.
The animation playing method, device, and server provided by the embodiments of the invention first acquire the frame-by-frame animation and determine a target frame bitmap from the plurality of frame bitmaps it contains; a difference image and a background image are then generated based on the difference portion of each frame bitmap relative to the target frame bitmap; finally, the difference image and the background image are superimposed, and the difference image is controlled to move relative to the background image so that the non-difference portion combines with each difference portion in turn, playing the animation frame by frame. In this method, the generated difference image contains each frame bitmap's difference portion relative to the target frame bitmap, and the generated background image contains the non-difference portions between the frame bitmaps; combining a given frame bitmap's difference portion with the non-difference portion in the background image reproduces that frame bitmap, so the frame-by-frame animation can be played by moving the difference image relative to the background image and combining the difference and non-difference portions of successive frame bitmaps. This relieves the JS thread blocking problem, reduces the packaged size of the frame-by-frame animation, and so on.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below illustrate some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of an animation playing method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a frame-by-frame animation according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating another frame-by-frame animation according to an embodiment of the invention;
fig. 4 is a schematic structural diagram of an animation playback device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a server according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described below clearly and completely with reference to the embodiments. The described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort fall within the protection scope of the present invention.
A Web frame-by-frame animation consists of multiple animation segments connected in series over time: different content is drawn on each frame of a timeline and played continuously to form the animation. The browser plays the frame bitmaps of a Web frame-by-frame animation in sequence, so a Web frame-by-frame animation can be regarded as similar to a GIF. When too many frame bitmaps are needed to build a Web frame-by-frame animation, many front-end performance problems inevitably arise, at both the network level and the rendering level:
(1) Too many frame bitmaps trigger many HTTP (Hypertext Transfer Protocol) requests, consuming excessive network bandwidth.
(2) Too many frame bitmaps degrade page rendering performance: the continuous reflow and repaint while the animation runs loads the browser's rendering threads, lowering the FPS (Frames Per Second) of the running animation and making the frame-by-frame animation stutter.
(3) In the related art, a setInterval() function drives the interval-based playing of the animation, that is, controls animation execution. From the perspective of the browser event loop, this means each animation creates a new macro task; as a blocking task, that macro task spawns multiple micro tasks during execution. If several Web frame-by-frame animations exist on the same page, the JS thread blocks, and the user then sees one Web frame-by-frame animation fully loaded while the others have not yet loaded.
(4) The more frame bitmaps a Web frame-by-frame animation contains, the larger its packaged size.
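The related-art approach criticized in point (3) can be sketched as follows. This is a hypothetical illustration, not code from the patent: a pure helper computes the background-position offset that a setInterval() callback would apply on each tick to step through a horizontal sprite strip.

```javascript
// Hypothetical sketch of the related-art setInterval() approach from point (3).
// A timer advances a frame counter and repositions a sprite strip on each tick;
// every callback is a macro task on the JS thread, which is why several such
// animations on one page can block one another.

// Compute the CSS background-position for frame `i` of a horizontal strip
// whose frames are `frameWidth` pixels wide.
function backgroundPositionForFrame(i, frameWidth) {
  return `${-i * frameWidth}px 0`;
}

// In a browser this would drive the playback (not runnable in Node):
// let i = 0;
// setInterval(() => {
//   el.style.backgroundPosition = backgroundPositionForFrame(i, 100);
//   i = (i + 1) % frameCount;
// }, 1000 / 24);
```

The method described below avoids this timer-driven design: the movement of the difference image is handed to a step function instead, keeping per-frame work off the JS thread.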
Based on the above, the invention provides an animation playing method, an animation playing device, and a server, which can effectively mitigate the front-end performance problems that arise when a Web frame-by-frame animation is played.
To facilitate understanding of the present embodiment, the animation playing method disclosed herein is first described in detail. Referring to the flow diagram of an animation playing method shown in Fig. 1, the method mainly includes the following steps S102 to S106:
Step S102: obtain the frame-by-frame animation to be played, and determine a target frame bitmap from the plurality of frame bitmaps it contains. A frame-by-frame animation draws different content on each frame of a timeline and plays it continuously to form the animation; each frame corresponds to an image, namely a frame bitmap, and all frame bitmaps have the same type and size. In one embodiment, any one of the frame bitmaps contained in the frame-by-frame animation may be selected as the target frame bitmap; for example, the first frame bitmap may be determined as the target frame bitmap. The target frame bitmap serves as the reference against which each frame bitmap in the frame-by-frame animation is compared.
Step S104: generate a background image and a difference image based on the difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation. The background image contains the non-difference portions between the frame bitmaps, and the difference image contains the difference portion of each frame bitmap relative to the target frame bitmap. A frame bitmap's difference portion relative to the target frame bitmap is the region whose content differs at the same position in the two bitmaps; its non-difference portion is the region whose content is the same at the same position. In one implementation, the pixel data of each pixel point in the frame bitmap and the target frame bitmap can be traversed to judge whether the pixel data are the same; if not, the pixel point is determined to lie in the difference portion. In addition, the regions outside the non-difference portions in the background image are transparent, and the regions outside each difference portion in the difference image are transparent.
For example, suppose the frame-by-frame animation contains three frame bitmaps A1, A2, and A3, and A1 is the target frame bitmap. A2 and A3 are each compared with A1 to determine the non-difference portions among A1, A2, and A3, and a background image is generated from those non-difference portions. A1, A2, and A3 are then each compared with the background image to determine the difference portion of each relative to the background image, and a difference image is generated from those difference portions.
Step S106: superimpose the background image and the difference image, and control the difference image to move relative to the background image, so that the non-difference portion combines with each difference portion in turn as the difference image moves, playing the animation frame by frame. In one embodiment, because the regions of the background image outside the non-difference portion and the regions of the difference image outside each difference portion are transparent, superimposing the two images combines the non-difference portion of the background image with a difference portion of the difference image, and the combined content is consistent with the original frame bitmap. For example, combining the difference portion of frame bitmap A1 relative to the background image with the non-difference portion of the background image reproduces the original frame bitmap A1.
Alternatively, the movement of the difference image relative to the background image may be controlled based on the bitmap order and image size of the frame-by-frame animation. For example, the difference portion of frame bitmap A1 relative to the background image is first combined with the non-difference portion of the background image; the difference image is then moved relative to the background image so that the difference portion of frame bitmap A2 combines with the non-difference portion; moving the difference image again combines the difference portion of frame bitmap A3 with the non-difference portion, thereby playing the frame-by-frame animation.
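The superposition in step S106 can be illustrated with a minimal sketch. This is an assumption for illustration only, since in the browser the two images are layered as elements rather than merged per pixel: logically, an opaque pixel of the difference sub-image wins, and transparent regions let the background show through. Pixel data use the flat [R, G, B, A, ...] layout produced by getImageData().

```javascript
// Minimal sketch (not the patent's implementation) of what superimposing the
// background image and one difference sub-image achieves: wherever the
// difference sub-image has a non-transparent pixel it wins, elsewhere the
// background shows through. Both inputs are flat [R,G,B,A, ...] arrays.
function composite(background, diffSub) {
  const out = background.slice();
  for (let p = 0; p < diffSub.length; p += 4) {
    if (diffSub[p + 3] !== 0) { // non-transparent difference pixel
      out[p] = diffSub[p];
      out[p + 1] = diffSub[p + 1];
      out[p + 2] = diffSub[p + 2];
      out[p + 3] = diffSub[p + 3];
    }
  }
  return out;
}
```

With a background whose second pixel was zeroed out and a difference sub-image carrying only that pixel, the composite reproduces the original frame bitmap, which is exactly the combination described for A1 above.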
In the animation playing method provided by the embodiments of the invention, the generated difference image contains the difference portion of each frame bitmap relative to the target frame bitmap, and the generated background image contains the non-difference portions between the frame bitmaps. Combining a given frame bitmap's difference portion in the difference image with the non-difference portion in the background image reproduces that frame bitmap, so the frame-by-frame animation can be played by controlling the difference image to move relative to the background image and combining the difference and non-difference portions of successive frame bitmaps. The method reduces the loss of page rendering performance, relieves the JS thread blocking problem, reduces the packaged size of the frame-by-frame animation, and so on.
In an embodiment, the present invention provides an implementation for determining a target frame bitmap from the plurality of frame bitmaps contained in a frame-by-frame animation: the first frame bitmap may be determined as the target frame bitmap according to the bitmap order of the frame-by-frame animation. The bitmap order is the playing order of the frame images in the frame-by-frame animation. Illustratively, referring to the schematic diagram of a frame-by-frame animation shown in Fig. 2, the frame-by-frame animation includes four frame bitmaps named in bitmap order, for example "time-1.png", "time-2.png", "time-3.png", and "time-4.png", all exactly the same size; the frame bitmap time-1.png is then determined as the target frame bitmap. Optionally, another frame bitmap may be selected as the target frame bitmap, such as time-2.png, as actual requirements dictate.
For convenience of understanding, an embodiment of the present invention further provides an implementation manner for generating a background image and a difference image respectively based on a difference portion of each frame bitmap in a frame-by-frame animation with respect to a target frame bitmap, where reference is made to the following steps 1 to 2:
step 1, comparing each frame bitmap in the frame-by-frame animation with a target frame bitmap, determining non-difference parts between each frame bitmap, and generating a background image based on the non-difference parts. In one embodiment, see step 1.1 through step 1.3 below:
Step 1.1: compare the first frame bitmap in the frame-by-frame animation with the target frame bitmap, and adjust the pixel data of the target frame bitmap based on the non-difference portion of the first frame bitmap relative to the target frame bitmap, obtaining the intermediate reference map corresponding to the first frame bitmap. In one embodiment, the frame bitmap time-1.png and the target frame bitmap may both be converted into text objects (e.g., canvas objects); using a first specified function (e.g., the getImageData() function of canvas), the object corresponding to time-1.png is converted into a third file array, and the object corresponding to the target frame bitmap into a fourth file array. The third file array contains the pixel data of each pixel point in time-1.png, and the fourth file array contains the pixel data of each pixel point in the target frame bitmap. Whether a pixel point at a given position lies in the non-difference portion can be judged by comparing the pixel data at that position in the third and fourth file arrays: if the pixel data are completely consistent, the pixel point lies in the non-difference portion. By traversing the pixel data of every pixel point in the third and fourth file arrays, the non-difference portion of time-1.png relative to the target frame bitmap is determined.
After the non-difference portion of time-1.png relative to the target frame bitmap is determined, the pixel data of the target frame bitmap can be adjusted to obtain the intermediate reference map corresponding to the first frame bitmap. In an optional implementation, the pixel data of the pixel points outside the non-difference portion in the target frame bitmap are replaced with a designated value, which may be "0". In an embodiment, each pixel point in the target frame bitmap corresponds to one pixel datum; the pixel data of the pixel points within the non-difference portion are retained, and "0" replaces the pixel data of the pixel points outside it, so the resulting intermediate reference map retains the non-difference portion while the remaining regions are transparent.
Step 1.2: for the remaining frame bitmaps in the frame-by-frame animation, based on the bitmap order, compare each frame bitmap with the intermediate reference map corresponding to the previous frame bitmap, and adjust the pixel data of that intermediate reference map based on the non-difference portion of the frame bitmap relative to it, obtaining the intermediate reference map corresponding to the frame bitmap. For example, the frame bitmap time-2.png is compared with the intermediate reference map M1 corresponding to the first frame bitmap to determine the non-difference portion of time-2.png relative to M1; the region of M1 outside that non-difference portion is then made transparent, giving the intermediate reference map M2 corresponding to time-2.png. Similarly, the intermediate reference maps M3 (for time-3.png) and M4 (for time-4.png) are determined in turn. For the specific implementation, refer to the process of generating the intermediate reference map corresponding to the first frame bitmap in step 1.1, which is not repeated here.
And step 1.3, determining an intermediate reference image corresponding to the last frame bitmap in the frame-by-frame animation as a background image. In practical applications, the intermediate reference map M4 corresponding to the frame bitmap time-4.png can be determined as a background image, and the background image includes only non-difference regions (i.e., the same regions) in all the frame bitmaps.
In an optional implementation, the frame bitmap time-1.png can be copied to produce a copy image time-0.png, which is used as the target frame bitmap and compared with the remaining frame bitmaps. Each comparison retains only the non-difference portion; once all frame bitmaps have been compared, a background image containing only the non-difference portions of all frame bitmaps is obtained.
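Steps 1.1 to 1.3 can be sketched as a fold over the frame list, under the stated convention that the designated value is 0 and frames are equal-length flat [R, G, B, A, ...] arrays. The function names below are illustrative, not from the patent.

```javascript
// Sketch of steps 1.1-1.3: compare a frame against the running intermediate
// reference map, keep the pixels that match exactly (the non-difference
// portion), and replace everything else with the designated value 0,
// which makes those pixels transparent.
function keepCommonPixels(ref, frame) {
  const out = new Array(ref.length).fill(0);
  for (let p = 0; p < ref.length; p += 4) {
    const same =
      ref[p] === frame[p] &&
      ref[p + 1] === frame[p + 1] &&
      ref[p + 2] === frame[p + 2] &&
      ref[p + 3] === frame[p + 3];
    if (same) {
      for (let k = 0; k < 4; k++) out[p + k] = ref[p + k];
    }
  }
  return out;
}

// The background image is the intermediate reference map left after folding
// in every frame, starting from the target (first) frame as the reference.
function buildBackground(frames) {
  return frames.reduce(keepCommonPixels);
}
```

For two frames that share their first pixel but differ in their second, the background keeps the shared pixel and zeroes out the rest, matching the description of step 1.3.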
And 2, comparing each frame bitmap in the frame-by-frame animation with the background image, determining the difference part of each frame bitmap relative to the background image, and generating a difference image based on the difference part corresponding to each frame bitmap. In one embodiment, see step 2.1 through step 2.5 below:
Step 2.1: for each frame bitmap in the frame-by-frame animation, convert the frame bitmap into a first file array and convert the background image into a second file array. The first file array contains the pixel data of every pixel in the frame bitmap, and the second file array contains the pixel data of every pixel in the background image. The pixel data may comprise color channel values and a transparency channel value (Alpha), where the color channel values may comprise a red channel value (Red), a green channel value (Green), and a blue channel value (Blue).
In one embodiment, the frame bitmap and the background image may each be converted into text objects, after which the text object corresponding to the frame bitmap is converted into the first file array by a first specified function, and the text object corresponding to the background image into the second file array by the same function. For example, the frame bitmap and the background image are converted into canvas objects, labeled A and B, respectively. A and B are converted into file arrays by the canvas getImageData() function, labeled the first file array Fa and the second file array Fb, respectively. Both file arrays are regular flat arrays of the form [R1, G1, B1, A1, R2, G2, B2, A2, ...]. Traversing rightward from index 0, every four consecutive values form the pixel data of one pixel, such as [R1, G1, B1, A1] and [R2, G2, B2, A2]; each group holds a red channel value (Red), a green channel value (Green), a blue channel value (Blue), and a transparency channel value (Alpha).
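The grouping of the flat file array into per-pixel quadruples can be sketched as follows. In a browser the arrays would come from `canvas.getContext('2d').getImageData(...).data`; here a hypothetical hand-written array stands in for that, and the helper name `toPixels` is an assumption:

```javascript
// Group a flat [R1, G1, B1, A1, R2, G2, B2, A2, ...] file array into an
// array of [R, G, B, A] pixels, four consecutive values per pixel.
function toPixels(fileArray) {
  const pixels = [];
  for (let i = 0; i < fileArray.length; i += 4) {
    pixels.push([
      fileArray[i],     // Red
      fileArray[i + 1], // Green
      fileArray[i + 2], // Blue
      fileArray[i + 3], // Alpha
    ]);
  }
  return pixels;
}

// In a browser, Fa and Fb would be obtained roughly like this:
//   const Fa = ctxA.getImageData(0, 0, width, height).data;
//   const Fb = ctxB.getImageData(0, 0, width, height).data;
```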
Step 2.2: judge whether the pixel data of the pixels at the same position in the frame bitmap and the background image are consistent. In practical applications, transparent pixel data has the form [R, G, B, 0] and may be [0, 0, 0, 0] or [255, 255, 255, 0], while non-transparent pixel data has the form [R, G, B, 1]. On this basis, the frame bitmap and the background image can be traversed simultaneously; because their sizes are exactly the same, whenever the traversal pointer points to a position in the frame bitmap, the same position exists in the background image, and the two pixel values are denoted Pa and Pb, respectively. When Pa is completely equal to Pb (R1 = R2, G1 = G2, B1 = B2, and A1 = A2), the pixel data of the pixels at that position are determined to be consistent; when Pa is not completely equal to Pb, they are determined to be inconsistent.
And 2.3, if not, determining that the pixel point is positioned at the difference part of the frame bitmap relative to the background image.
Step 2.4: for each frame bitmap in the frame-by-frame animation, generate a difference sub-image corresponding to the frame bitmap based on the frame bitmap's difference part. In one embodiment, the pixel data of the pixels outside the difference part of the frame bitmap are replaced with a specified value, while the pixel data of the pixels inside the difference part are retained, yielding the difference sub-image corresponding to the frame bitmap. When Pa is completely equal to Pb, the pixel belongs to the non-difference part and its pixel data is filled with a transparent value, i.e., [R2, G2, B2, A2] becomes [0, 0, 0, 0]. When Pa is not completely equal to Pb, the pixel belongs to the difference part and [R2, G2, B2, A2] is retained. After the frame bitmap and the background image have been fully traversed, only the difference part remains and every non-difference part has been filled as a transparent region, giving the difference sub-image. Performing this processing on every frame bitmap in the frame-by-frame animation yields the difference sub-image corresponding to each frame bitmap, as in the further schematic diagram of the frame-by-frame animation shown in fig. 3. The image size of each difference sub-image is consistent with the image size of the background image and with the image size of the original frame bitmap.
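Steps 2.2 to 2.4 can be sketched together as one pass over the two file arrays. This is a minimal illustration, not the patent's implementation: the helper name `diffSubImage` is an assumption, and the sketch retains the frame bitmap's pixel data at difference positions, per step 2.4:

```javascript
// Compare a frame's file array with the background's file array
// position by position. Identical pixels become transparent [0, 0, 0, 0]
// (the specified value); differing pixels keep the frame's data.
function diffSubImage(frameArr, bgArr) {
  // Uint8ClampedArray is zero-initialized, so untouched pixels
  // are already the transparent specified value [0, 0, 0, 0].
  const out = new Uint8ClampedArray(frameArr.length);
  for (let i = 0; i < frameArr.length; i += 4) {
    const same =
      frameArr[i] === bgArr[i] &&
      frameArr[i + 1] === bgArr[i + 1] &&
      frameArr[i + 2] === bgArr[i + 2] &&
      frameArr[i + 3] === bgArr[i + 3];
    if (!same) {
      out[i] = frameArr[i];
      out[i + 1] = frameArr[i + 1];
      out[i + 2] = frameArr[i + 2];
      out[i + 3] = frameArr[i + 3];
    }
  }
  return out;
}
```

Because the output array has the same length as the inputs, the difference sub-image keeps the image size of the original frame bitmap, as the text notes.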
Step 2.5: splice the difference sub-images corresponding to the frame bitmaps, in the bitmap order of the frame-by-frame animation, to obtain the difference image. In one embodiment, the difference sub-images, beginning with the one corresponding to the first frame bitmap, are combined from left to right into a long image; this long image is the difference image, containing the difference part of every frame bitmap, from the first to the last, relative to the background image.
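The left-to-right splicing can be sketched over flat RGBA arrays as well. The helper name `stitchHorizontally` and the fixed per-sub-image width/height parameters are assumptions for the example:

```javascript
// Stitch n difference sub-images (each a flat RGBA array of size w x h)
// left to right into one long image of width n * w and height h.
function stitchHorizontally(subImages, w, h) {
  const n = subImages.length;
  const out = new Uint8ClampedArray(n * w * h * 4);
  for (let f = 0; f < n; f++) {
    for (let y = 0; y < h; y++) {
      for (let x = 0; x < w; x++) {
        const src = (y * w + x) * 4;            // pixel in sub-image f
        const dst = (y * n * w + f * w + x) * 4; // same pixel in the long image
        for (let c = 0; c < 4; c++) out[dst + c] = subImages[f][src + c];
      }
    }
  }
  return out;
}
```

In a browser the same effect is usually achieved by calling `drawImage` once per sub-image onto a wide canvas at x offsets of 0, w, 2w, and so on.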
According to the embodiment of the present invention, through steps 1 to 2, the differences are computed at pixel precision, so the finest difference details of the Web frame-by-frame animation are preserved, providing stable output for the later animation transition effect.
To facilitate understanding of step S108, embodiments of the present invention further provide the following implementations for controlling the difference image to move relative to the background image, described as modes one and two below:
Mode one: a second specified function is called, and the difference image is controlled to move relative to the background image through the second specified function, where the second specified function comprises a step function. In one embodiment, the difference image is fixed on the background image by a CSS animation, and the animation is generated by playing frame by frame with the steps() function. The embodiment of the present invention thereby leverages CSS (Cascading Style Sheets) animation, allowing the browser to optimize the layer automatically using the GPU (graphics processing unit).
Mode two: the image size of the difference sub-image is determined as the movement unit, and the difference image is controlled to move relative to the background image by that unit. In one embodiment, during playback of the frame-by-frame animation the difference image is shifted by the image size of one difference sub-image at a time: first the difference sub-image corresponding to the frame bitmap time-1.png is fixed on the background image; the difference image then moves by one movement unit so that the difference sub-image corresponding to time-2.png is fixed on the background image; and so on, until the difference sub-image corresponding to time-4.png is fixed on the background image, completing playback of the entire frame-by-frame animation.
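Modes one and two can be combined in a short sketch: the long difference image serves as a CSS background layer above the background image, and a steps() animation jumps its position by one sub-image width per frame. The class name, frame count, and sub-image width below are illustrative assumptions, and for testability the CSS is built as a string rather than injected into a page:

```javascript
// Build the keyframe CSS for a sprite-style frame-by-frame animation:
// steps(frameCount) makes the transition jump in frameCount discrete
// hops, each one sub-image width (the movement unit of mode two).
function buildKeyframes(frameCount, frameWidthPx) {
  return [
    ".diff-layer {",
    `  animation: play 1s steps(${frameCount}) infinite;`,
    "}",
    "@keyframes play {",
    `  to { background-position-x: -${frameCount * frameWidthPx}px; }`,
    "}",
  ].join("\n");
}
```

In a browser this string could be appended to a `<style>` element, with `.diff-layer` absolutely positioned over the background-image layer.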
The animation playing method provided by the embodiment of the present invention, in effect, uses the background image as a backdrop and switches the displayed difference parts from left to right in sequence to produce the animation. Throughout playback the parts common to every frame bitmap never change: they are erased once and filled as transparent regions, which indirectly reduces both the rendering time and the pixel volume spent on the repeated parts of the frame bitmaps, achieving dynamic deduplication (of repeated parts) and compression (of image volume). The core idea is to extract the differing parts of the images: the identical parts remain fixed on screen, the whole animation plays only the difference parts, and the switching of those difference parts forms the animation effect.
In practical applications, from the perspective of front-end performance optimization, optimizing the volume of a single frame bitmap alone is not sufficient. On this basis, the embodiment of the present invention provides the above animation playing method: before a Web frame-by-frame animation is played, deduplication-compression processing can be performed dynamically on all of its frame bitmaps, effectively reducing the total volume of the frame bitmaps and improving animation execution performance.
In summary, the animation playing method provided in the embodiment of the present invention at least has the following characteristics:
(1) For the same Web frame-by-frame animation, the final image volume after the optimization of the embodiment of the present invention is smaller and lighter, which meets front-end performance optimization norms and frees up resources for page execution.
(2) The overall implementation scheme of the embodiment of the present invention displays the animation in a manner similar to stacking Photoshop layers: instead of switching frame bitmaps within a single layer to produce the animation effect, the unchanged part is fixed as one layer and the changing part is promoted to a new layer, so that all changes happen in the new layer. This effectively reduces the impact of layer reflow and repaint and indirectly speeds up browser rendering.
(3) According to the animation playing method provided by the embodiment of the present invention, only the layer holding the changing part is updated while the rest of the rendering stays unchanged, achieving the effect of locally modifying the view without modifying the whole layer. This improves the smoothness of the animation, and even if the FPS drops, the impact on the animation is relatively small.
(4) The embodiment of the present invention combines the changing parts of the frame-by-frame animation into one long image in the animation's original order, thereby merging multiple image requests. Taking the above as an example, 4 images would need to be requested without merging, but only 2 images (one background image and one difference image) after merging, effectively reducing HTTP requests and saving server bandwidth.
As for the animation playing method provided in the foregoing embodiment, an embodiment of the present invention provides an animation playing device, referring to a schematic structural diagram of an animation playing device shown in fig. 4, where the device mainly includes the following components:
a target determining module 402, configured to obtain a frame-by-frame animation to be played, and determine a target frame bitmap from multiple frame bitmaps included in the frame-by-frame animation;
an image generating module 404, configured to generate a background image and a difference image based on a difference portion of each frame bitmap in the frame-by-frame animation relative to the target frame bitmap; the background image comprises non-difference parts between each frame bitmap, and the difference image comprises difference parts of each frame bitmap relative to the target frame bitmap;
and an animation playing module 406, configured to superimpose the background image and the difference image, and control the difference image to move relative to the background image, so as to combine the non-difference portion with each difference portion in the moving process of the difference image, so as to implement frame-by-frame animation playing.
In the animation playback device provided in the embodiment of the present invention, the generated difference image contains the difference part of each frame bitmap relative to the target frame bitmap, and the generated background image contains the non-difference part shared by the frame bitmaps. Combining the difference part of a given frame bitmap in the difference image with the non-difference part in the background image reproduces that frame bitmap, and by controlling the difference image to move relative to the background image, the difference parts of different frame bitmaps are combined with the non-difference part in turn, realizing playback of the frame-by-frame animation. The device thus reduces the loss of page rendering performance, alleviates the JS thread blocking problem, and reduces the packaged volume of the frame-by-frame animation, among other effects.
In one embodiment, the image generation module 404 is further configured to: comparing each frame bitmap in the frame-by-frame animation with a target frame bitmap, determining non-difference parts between each frame bitmap, and generating a background image based on the non-difference parts; comparing each frame bitmap in the frame-by-frame animation with the background image, determining the difference part of each frame bitmap relative to the background image, and generating the difference image based on the difference part corresponding to each frame bitmap.
In one embodiment, the image generation module 404 is further configured to: comparing the first frame bitmap with the target frame bitmap in the frame-by-frame animation, and adjusting pixel data of the target frame bitmap based on a non-difference part of the first frame bitmap relative to the target frame bitmap to obtain an intermediate reference picture corresponding to the first frame bitmap; comparing the frame bitmap with an intermediate reference map corresponding to a previous frame bitmap based on the bitmap sequence of the frame-by-frame animation, and adjusting the pixel data of the intermediate reference map corresponding to the previous frame bitmap based on the non-difference part of the frame bitmap relative to the intermediate reference map corresponding to the previous frame bitmap to obtain the intermediate reference map corresponding to the frame bitmap; and determining the intermediate reference image corresponding to the last frame bitmap in the frame-by-frame animation as a background image.
In one embodiment, the image generation module 404 is further configured to: replace the pixel data of the pixels outside the non-difference part in the target frame bitmap with the specified value to obtain an intermediate reference map corresponding to the first frame bitmap.
In one embodiment, the image generation module 404 is further configured to: for each frame bitmap in the frame-by-frame animation, converting the frame bitmap into a first file array, and converting the background image into a second file array; the first file array comprises pixel data of each pixel point in the frame bitmap, and the second file array comprises pixel data of each pixel point in the background image; judging whether the pixel data corresponding to the pixel points at the same position in the frame bitmap and the background image are consistent or not; if not, determining that the pixel point is positioned at the difference part of the frame bitmap relative to the background image.
In one embodiment, the image generation module 404 is further configured to: convert the frame bitmap and the background image into text objects; convert the text object corresponding to the frame bitmap into a first file array by using a first specified function, and convert the text object corresponding to the background image into a second file array by using the same function.
In one embodiment, the image generation module 404 is further configured to: for each frame bitmap in the frame-by-frame animation, generating a difference sub-image corresponding to the frame bitmap based on the difference part corresponding to the frame bitmap; and splicing the difference sub-images corresponding to each frame bitmap according to the bitmap sequence of the frame-by-frame animation to obtain a difference image.
In one embodiment, the image generation module 404 is further configured to: and replacing the pixel data of the pixel points except the difference part in the frame bitmap by using the designated numerical value to obtain a difference sub-image corresponding to the frame bitmap.
In one embodiment, the animation playback module 406 is further configured to: calling a second specified function; wherein the second specified function comprises a step function; the difference image is controlled to move relative to the background image by a second specified function.
In one embodiment, the animation playback module 406 is further configured to: determining the image size of the difference sub-image as a moving unit; the difference image is controlled to move relative to the background image in units of movement.
In one embodiment, the image size of the difference sub-image coincides with the image size of the background image.
In one embodiment, the targeting module 402 is further configured to: and determining a first frame bitmap in the frame-by-frame animation as a target frame bitmap according to the bitmap sequence of the frame-by-frame animation.
The device provided by the embodiment of the present invention has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, for anything not mentioned in the device embodiments, reference may be made to the corresponding content in the method embodiments.
The embodiment of the invention provides a server, which particularly comprises a processor and a storage device; the storage means has stored thereon a computer program which, when executed by the processor, performs the method of any of the above described embodiments.
Fig. 5 is a schematic structural diagram of a server according to an embodiment of the present invention, where the server 100 includes: the device comprises a processor 50, a memory 51, a bus 52 and a communication interface 53, wherein the processor 50, the communication interface 53 and the memory 51 are connected through the bus 52; the processor 50 is arranged to execute executable modules, such as computer programs, stored in the memory 51.
The memory 51 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 53 (which may be wired or wireless), using the Internet, a wide area network, a local area network, a metropolitan area network, or the like.
The bus 52 may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 5, but this does not indicate only one bus or one type of bus.
The memory 51 is used for storing a program, the processor 50 executes the program after receiving an execution instruction, and the method executed by the apparatus defined by the flow process disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 50, or implemented by the processor 50.
The processor 50 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 50. The processor 50 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), or another programmable logic device, discrete gate or transistor logic device, or discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed accordingly. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly carried out by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory 51, and the processor 50 reads the information in the memory 51 and completes the steps of the method in combination with its hardware.
The computer program product of the readable storage medium provided in the embodiment of the present invention includes a computer readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the foregoing method embodiment, which is not described herein again.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, used to illustrate the technical solutions of the present invention and not to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the art may still modify, or readily conceive of changes to, the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present invention and shall be covered thereby. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. An animation playing method, comprising:
acquiring a frame-by-frame animation to be played, and determining a target frame bitmap from a plurality of frame bitmaps contained in the frame-by-frame animation;
respectively generating a background image and a difference image based on the difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation; wherein the background image comprises non-difference portions between each of the frame bitmaps, and the difference image comprises difference portions of each of the frame bitmaps relative to the target frame bitmap;
and superposing the background image and the difference image, and controlling the difference image to move relative to the background image so as to combine the non-difference part with each difference part respectively in the moving process of the difference image to realize the playing of the frame-by-frame animation.
2. The method of claim 1, wherein the step of generating a background image and a difference image based on a difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation comprises:
comparing each frame bitmap in the frame-by-frame animation with the target frame bitmap, determining a non-difference part between each frame bitmap, and generating a background image based on the non-difference part;
comparing each frame bitmap in the frame-by-frame animation with the background image, determining the difference part of each frame bitmap relative to the background image, and generating a difference image based on the difference part corresponding to each frame bitmap.
3. The method of claim 2, wherein the step of comparing each of the frame bitmaps in the frame-by-frame animation with the target frame bitmap, determining non-difference portions between each of the frame bitmaps, and generating a background image based on the non-difference portions comprises:
comparing the first frame bitmap with the target frame bitmap in the frame-by-frame animation, and adjusting pixel data of the target frame bitmap based on a non-difference part of the first frame bitmap relative to the target frame bitmap to obtain an intermediate reference map corresponding to the first frame bitmap;
comparing the frame bitmap with an intermediate reference map corresponding to a previous frame bitmap based on the bitmap sequence of the frame-by-frame animation, and adjusting the pixel data of the intermediate reference map corresponding to the previous frame bitmap based on the non-difference part of the frame bitmap relative to the intermediate reference map corresponding to the previous frame bitmap to obtain the intermediate reference map corresponding to the frame bitmap;
and determining the intermediate reference image corresponding to the last frame bitmap in the frame-by-frame animation as a background image.
4. The method according to claim 3, wherein the step of adjusting the pixel data of the target frame bitmap based on the non-difference portion of the first frame bitmap relative to the target frame bitmap to obtain the intermediate reference map corresponding to the first frame bitmap comprises:
and replacing the pixel data of the pixel points except the non-difference part in the target frame bitmap by using the designated numerical value to obtain an intermediate reference map corresponding to the first frame bitmap.
6. The method of claim 2, wherein the step of comparing each frame bitmap of the frame-by-frame animation with the background image and determining the difference portion of each frame bitmap relative to the background image comprises:
for each frame bitmap in the frame-by-frame animation, converting the frame bitmap into a first file array, and converting the background image into a second file array; the first file array comprises pixel data of each pixel point in the frame bitmap, and the second file array comprises pixel data of each pixel point in the background image;
judging whether the pixel data corresponding to the pixel points at the same position in the frame bitmap and the background image are consistent or not;
if not, determining that the pixel point is positioned at the difference part of the frame bitmap relative to the background image.
7. The method of claim 5, wherein the step of converting the frame bitmap to a first file array and the background image to a second file array comprises:
converting the frame bitmap and the background image into text objects;
and converting the text object corresponding to the frame bitmap into a first file array by using a first specified function, and converting the text object corresponding to the background image into a second file array by using the first specified function.
7. The method of claim 5, wherein the step of generating a difference image based on the difference portion corresponding to each frame bitmap comprises:
for each frame bitmap in the frame-by-frame animation, generating a difference sub-image corresponding to the frame bitmap based on a difference part corresponding to the frame bitmap;
and splicing the difference sub-images corresponding to each frame bitmap according to the bitmap sequence of the frame-by-frame animation to obtain a difference image.
8. The method of claim 7, wherein the step of generating the difference sub-image corresponding to the frame bitmap based on the difference portion corresponding to the frame bitmap comprises:
and replacing the pixel data of the pixel points except the difference part in the frame bitmap by using the designated numerical value to obtain a difference sub-image corresponding to the frame bitmap.
9. The method of claim 1, wherein the step of controlling the difference image to move relative to the background image comprises:
calling a second specified function; wherein the second specified function comprises a step function;
controlling the difference image to move relative to the background image by the second specified function.
10. The method of claim 7, wherein the step of controlling the difference image to move relative to the background image further comprises:
determining the image size of the difference sub-image as a moving unit;
and controlling the difference image to move relative to the background image according to the movement unit.
11. The method of claim 10, wherein the image size of the difference sub-image is consistent with the image size of the background image.
12. The method according to claim 1, wherein the step of determining a target frame bitmap from a plurality of frame bitmaps included in the frame-by-frame animation comprises:
and determining a first frame bitmap in the frame-by-frame animation as a target frame bitmap according to the bitmap sequence of the frame-by-frame animation.
13. An animation playback apparatus, comprising:
the target determination module is used for acquiring the frame-by-frame animation to be played and determining a target frame bitmap from a plurality of frame bitmaps contained in the frame-by-frame animation;
the image generation module is used for respectively generating a background image and a difference image based on the difference portion of each frame bitmap relative to the target frame bitmap in the frame-by-frame animation; wherein the background image comprises non-difference portions between each of the frame bitmaps, and the difference image comprises difference portions of each of the frame bitmaps relative to the target frame bitmap;
and the animation playing module is used for superposing the background image and the difference image and controlling the difference image to move relative to the background image so as to combine the non-difference part with each difference part respectively in the moving process of the difference image and realize the playing of the frame-by-frame animation.
14. A server, comprising a processor and a memory;
the memory has stored thereon a computer program which, when executed by the processor, performs the method of any of claims 1 to 12.
15. A computer storage medium storing computer software instructions for use in the method of any one of claims 1 to 12.
CN202110768889.8A 2021-07-07 2021-07-07 Animation playing method, device and server Active CN113496537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110768889.8A CN113496537B (en) 2021-07-07 2021-07-07 Animation playing method, device and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110768889.8A CN113496537B (en) 2021-07-07 2021-07-07 Animation playing method, device and server

Publications (2)

Publication Number Publication Date
CN113496537A true CN113496537A (en) 2021-10-12
CN113496537B CN113496537B (en) 2023-06-30

Family

ID=77996195

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110768889.8A Active CN113496537B (en) 2021-07-07 2021-07-07 Animation playing method, device and server

Country Status (1)

Country Link
CN (1) CN113496537B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999065224A2 (en) * 1998-06-11 1999-12-16 Presenter.Com Creating animation from a video
US20080091365A1 (en) * 2006-10-11 2008-04-17 Microsoft Corporation Image verification with tiered tolerance
US20100225765A1 (en) * 2009-03-03 2010-09-09 Fujitsu Limited Monitoring support apparatus, monitoring support method, and recording medium
US20140368669A1 (en) * 2012-10-04 2014-12-18 Google Inc. Gpu-accelerated background replacement
CN104254001A (en) * 2013-06-28 2014-12-31 广州华多网络科技有限公司 Remote sharing method, device and terminal
CN105844683A (en) * 2016-03-23 2016-08-10 深圳市富途网络科技有限公司 Pixel difference frame-by-frame animation realization method based on Canvas and WebWorker
CN107886560A (en) * 2017-11-09 2018-04-06 网易(杭州)网络有限公司 The processing method and processing device of animation resource
CN107943837A (en) * 2017-10-27 2018-04-20 江苏理工学院 A kind of video abstraction generating method of foreground target key frame
CN109165013A (en) * 2018-07-04 2019-01-08 惠州市德赛西威汽车电子股份有限公司 Multi-drawing layer stacking display methods, device, storage medium and terminal in application program

Similar Documents

Publication Publication Date Title
CN106611435B (en) Animation processing method and device
US11783522B2 (en) Animation rendering method and apparatus, computer-readable storage medium, and computer device
US7439982B2 (en) Optimized scene graph change-based mixed media rendering
JP5037574B2 (en) Image file generation device, image processing device, image file generation method, and image processing method
JP2001229391A (en) Method for encoding animation included in image file
KR20150081638A (en) Electronic apparatus and operating method of web-platform
CN104641412A (en) Method and device for selective display refresh
US10140268B2 (en) Efficient browser composition for tiled-rendering graphics processing units
CN107707965A (en) The generation method and device of a kind of barrage
CN108881873B (en) Method, device and system for fusing high-resolution images
CN111696034B (en) Image processing method and device and electronic equipment
CN111402369B (en) Interactive advertisement processing method and device, terminal equipment and storage medium
JP2005135415A (en) Graphic decoder including command based graphic output accelerating function, graphic output accelerating method therefor, and image reproducing apparatus
CN113496537B (en) Animation playing method, device and server
US20100171750A1 (en) Method and device for generating a complexity vector for at leat one part of an svg scene, and method and testing device for testing a playback suitability of at least part of an svg scene on device
KR20050040712A (en) 2-dimensional graphic decoder including graphic display accelerating function based on commands, graphic display accelerating method therefor and reproduction apparatus
CN113157275B (en) Frame animation rendering method and device, electronic equipment and storage medium
KR101347178B1 (en) Apparatus and method for drawing vector image
CN115391692A (en) Video processing method and device
JP4260747B2 (en) Moving picture composition method and scene composition method
CN107273072B (en) Picture display method and device and electronic equipment
US20240046410A1 (en) Foveated scaling for rendering and bandwidth workloads
CN117596353A (en) Intelligent projector horse race lamp processing method, device, equipment and medium
CN116049589A (en) Visual intelligent large screen performance optimization method and system
CN114627228A (en) Method and device for displaying sequence frame image and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant