WO2020248951A1 - Animation rendering method and apparatus, computer-readable storage medium, and computer device - Google Patents


Info

Publication number: WO2020248951A1
Authority: WIPO (PCT)
Prior art keywords: animation, drawing data, rendering, file, interval
Application number: PCT/CN2020/095013
Other languages: English (en), French (fr)
Inventors: 陈海中, 陈仁健
Original Assignee: 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority to EP20823436.9A (published as EP3985612A4)
Priority to JP2021563216A (published as JP7325535B2)
Publication of WO2020248951A1
Priority to US17/379,998 (published as US11783522B2)

Classifications

    • G06T13/20: 3D [Three Dimensional] animation
    • G11B27/105: Programmed access in sequence to addressed parts of tracks of operating discs
    • G06T1/60: Memory management
    • G06T13/00: Animation
    • G06T13/80: 2D [Two Dimensional] animation, e.g. using sprites
    • G06T15/205: Image-based rendering
    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T9/40: Tree coding, e.g. quadtree, octree
    • G06T9/00: Image coding

Definitions

  • This application relates to the field of image processing technology, and in particular to an animation rendering method, apparatus, computer-readable storage medium, and computer device.
  • According to various embodiments of the present application, an animation rendering method, apparatus, computer-readable storage medium, and computer device are provided.
  • An animation rendering method executed by a computer device, the method including:
  • An animation rendering device comprising:
  • a file acquisition module, configured to acquire an animation file in a target format;
  • a determining module, configured to determine, from the animation drawing data obtained by decoding when the animation file is decoded, an animation drawing data interval that meets a static condition;
  • a data caching module, configured to cache the initial animation drawing data in the animation drawing data interval;
  • a data reading module, configured to read the cached initial animation drawing data corresponding to a frame to be played when the animation drawing data corresponding to the frame to be played meets the static condition during playback of the animation file;
  • an animation rendering module, configured to perform animation rendering according to the read initial animation drawing data.
  • A computer-readable storage medium stores a computer program that, when executed by a processor, causes the processor to perform the steps of the animation rendering method.
  • A computer device includes a memory and a processor, and the memory stores a computer program.
  • When the computer program is executed by the processor, the processor performs the steps of the animation rendering method.
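The module pipeline enumerated above can be sketched in Python; the class and method names below are illustrative assumptions, not the patent's implementation:

```python
class AnimationRenderer:
    """Minimal sketch of the patent's module pipeline (names are hypothetical)."""

    def __init__(self, decoder, static_intervals):
        self.decoder = decoder                    # stand-in for decoding: frame -> drawing data
        self.static_intervals = static_intervals  # (start, end) frame ranges meeting the static condition
        self.cache = {}                           # interval -> cached initial drawing data

    def prepare(self):
        # Cache only the initial drawing data of each static interval.
        for start, end in self.static_intervals:
            self.cache[(start, end)] = self.decoder(start)

    def drawing_data_for(self, frame):
        # Reuse the cached initial data when the frame falls inside a
        # static interval; otherwise decode the frame on demand.
        for (start, end), data in self.cache.items():
            if start <= frame <= end:
                return data
        return self.decoder(frame)
```

Only one decode per static interval is needed; every later frame inside the interval reuses the cached data.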
  • Fig. 1 is an application environment diagram of an animation rendering method in an embodiment
  • FIG. 2 is a schematic flowchart of an animation rendering method in an embodiment
  • FIG. 3 is a schematic flowchart of steps of decoding an animation file to obtain animation drawing data and determining an animation drawing data interval in an embodiment
  • FIG. 4 is a schematic flow chart of the step of calculating the animation drawing data interval of the animation layer in an embodiment
  • FIG. 5 is a schematic diagram of calculating the static interval of an animation layer in an embodiment
  • FIG. 6 is a schematic flow diagram of the steps of rendering animation according to the initial animation attribute value and the acquired animation attribute value in an embodiment
  • FIG. 7 is a schematic flowchart of the steps of rendering animation according to the initial group drawing data and the acquired group drawing data in an embodiment
  • FIG. 8 is a schematic flowchart of a rendering step using shared animation drawing data when multiple applications play the same animation in an embodiment
  • FIG. 9 is a schematic flow diagram of animation rendering data caching and rendering steps in an embodiment
  • FIG. 10 is a schematic flowchart of an off-screen rendering step in an embodiment
  • FIG. 11 is a flow diagram of the steps of saving the animation rendering data of the largest vector diagram when multiple animation layers contain the same vector diagram in an embodiment
  • FIG. 12 is a schematic flowchart of the step of calculating the animation area of the pre-composite attribute group in an embodiment
  • FIG. 13 is a schematic diagram of an animation area of a pre-composited attribute group in an embodiment
  • Figure 14 is a structural diagram of Transform2D in an embodiment
  • FIG. 15 is a schematic flowchart of a method for dividing a static interval in an embodiment
  • FIG. 16 is a schematic diagram of rendering time consumption when the caching strategy is fully enabled in an embodiment;
  • FIG. 17 is a schematic diagram of rendering time consumption when the rendering data cache is disabled in an embodiment;
  • FIG. 18 is a schematic diagram of rendering time consumption when the drawing data cache is disabled in an embodiment;
  • FIG. 19 is a schematic diagram of rendering time consumption when the caching strategy is fully disabled in an embodiment;
  • FIG. 20 is a structural block diagram of an animation rendering device in an embodiment
  • FIG. 21 is a structural block diagram of an animation rendering device in another embodiment;
  • Figure 22 is a structural block diagram of a computer device in an embodiment.
  • Fig. 1 is an application environment diagram of an animation rendering method in an embodiment.
  • the animation rendering method is applied to an animation rendering system.
  • the animation rendering system includes a terminal 110 and a server 120.
  • the terminal 110 and the server 120 are connected through a network.
  • the terminal 110 may specifically be a desktop terminal or a mobile terminal, and the mobile terminal may specifically be at least one of a mobile phone, a tablet computer, and a notebook computer.
  • the server 120 may be implemented as an independent server or a server cluster composed of multiple servers.
  • an animation rendering method is provided.
  • In this embodiment, the method is described mainly as applied to the terminal 110 in FIG. 1.
  • the animation rendering method specifically includes the following steps:
  • the target format can be a PAG format, with a suffix of PAG (or pag).
  • Animation files in PAG format can be obtained by integrating resources such as text or images into a single file.
  • Animation files in PAG format use dynamic bit storage, achieving a very high compression ratio.
  • Animation files in PAG format can run across platforms; the text content, font, and size style in the animation can be dynamically modified at runtime while retaining the animation effect, and the image content can also be replaced, enabling a rich variety of animation content customization effects.
  • the terminal when a trigger event is detected, the terminal obtains an animation file in the target format.
  • the trigger event may be an externally input operation instruction, such as an animation playback instruction.
  • The method may further include: after the animation file is created through the AE (Adobe After Effects) client, exporting the animation file in the target format through the PAG export plug-in installed on the AE client.
  • the animation file can be encoded in the target encoding method.
  • The target coding method can be any of Huffman coding, Shannon coding, RLC (Run-Length Coding), LZW (Lempel-Ziv-Welch) coding, arithmetic coding, predictive coding, transform coding, and quantization coding.
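Of the listed schemes, run-length coding is the simplest to illustrate. A toy sketch (not the PAG codec itself, whose exact layout the text does not specify):

```python
def rle_encode(data: bytes):
    """Collapse runs of equal bytes into (value, count) pairs."""
    pairs = []
    for b in data:
        if pairs and pairs[-1][0] == b:
            pairs[-1][1] += 1
        else:
            pairs.append([b, 1])
    return [(v, c) for v, c in pairs]

def rle_decode(pairs):
    """Expand (value, count) pairs back into the original bytes."""
    return bytes(v for v, c in pairs for _ in range(c))
```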
  • The installed PAG export plug-in is a plug-in developed for the AE client to export files in the target format.
  • Exporting an animation file in the target format requires installing this specific PAG export plug-in on the AE client.
  • Animation rendering can be applied to animation playback in specific application scenarios. Therefore, the method of obtaining animation files in S202 can be divided according to the application scenarios of animation playback:
  • Scenario 1: playing an animation on the client's display page.
  • In the process of opening the client, the terminal obtains an animation file in the target format from a locally stored animation library; alternatively, the terminal sends an animation acquisition instruction to the server and receives an animation file in the target format in response to the animation acquisition instruction, so that when the client is opened, the opening animation is played on the opening page according to the animation file.
  • the open page of the client belongs to a type of display page.
  • The animation file in PAG format used for playing on the opening page can be obtained, so that the animation is played on the opening page according to the PAG format animation file, thereby adding an animation effect to the opening page and improving the user experience.
  • After the client is started, if the display page scrolls to a target position, the terminal obtains an animation file in the target format from a locally stored animation library; alternatively, the terminal sends an animation acquisition instruction to the server and receives an animation file in the target format in response to the animation acquisition instruction, so as to play the animation on the display page scrolled to the target position according to the animation file.
  • The animation file in PAG format can be obtained for playing at the bottom of the display page, so that the user can watch the animation when scrolling the display page to the bottom; setting an animation effect on the client's display page can improve the user experience.
  • Scenario 2: when the client plays a video or displays an image, an animation is displayed at the corresponding position of the video or image.
  • The terminal selects the specified video identifier according to the input selection instruction, obtains the corresponding IP (Internet Protocol) address according to the video identifier, and obtains the video file and the corresponding animation file according to the IP address, so that when the video is played, the animation is played at the set position in the video.
  • the animation drawing data can be called PAG animation drawing data, which can be data used to describe or draw PAG animation content.
  • Animation rendering can be performed according to the animation drawing data, and the corresponding PAG animation content for display can then be obtained by uploading the result to the screen.
  • Rendering the instantaneous animation drawing data at a given moment in the animation drawing data interval and displaying it on the screen yields a particular animation frame; therefore, the instantaneous animation drawing data is the data that composes an animation frame and has a correspondence with that animation frame.
  • the animation drawing data interval that meets the static condition may be a data interval where the animation drawing data does not change within a period of time.
  • the animation drawing data interval that meets the static condition can be understood as the static interval of the animation drawing data, that is, the static interval of the animation layer.
  • When the animation file has a single animation layer, the animation drawing data interval is the static interval of the animation file; when the animation file has multiple animation layers, the intersection of the animation drawing data intervals is the static interval of the animation file.
  • the animation drawing data interval is the drawing data interval meeting the static condition.
  • the static condition can mean that it does not change over a period of time.
  • the static condition may mean that the animation does not change within a period of time during the playback process (that is, the animation remains static within the period of time).
  • the static condition can mean that the animation drawing data constituting the animation frame does not change within a period of time.
  • the animation drawing data is composed of group drawing data, so the static condition may mean that the group drawing data does not change within a period of time.
  • the group drawing data is composed of animation attribute values, so the static condition can also mean that the animation attribute values do not change within a period of time.
  • The animation attribute value may be, for example, the color, size, font, or movement parameter of text in the animation.
  • Animation attribute values are related to changes in time.
  • An animation attribute value may be linearly related to time, that is, it increases or decreases proportionally with time.
  • An animation attribute value may also be Bezier related, that is, it changes with time along a Bezier curve.
  • An animation attribute value and time may also be of the hold type, that is, the animation attribute value does not change within a period of time, which meets the static condition.
  • The animation attribute value is a parameter value of a time axis attribute.
  • The time axis attribute may be an attribute describing how the animation attribute value changes with time.
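The hold type is what produces static intervals. A sketch of finding hold spans from a keyframe list (the keyframe representation here is an assumption for illustration, not the patent's data format):

```python
def static_value_intervals(keyframes, end_time):
    """Return (start, end) spans where an attribute value stays constant.

    `keyframes` is a sorted list of (time, value, kind) tuples, with kind
    in {"linear", "bezier", "hold"}; a hold keyframe keeps its value until
    the next keyframe (or until end_time).
    """
    intervals = []
    for i, (t, _value, kind) in enumerate(keyframes):
        nxt = keyframes[i + 1][0] if i + 1 < len(keyframes) else end_time
        if kind == "hold":
            intervals.append((t, nxt))
    return intervals
```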
  • There are six types of animation attribute groups: transform, the layer mask (mask), the track matte between layers (trackMatte), layer style (layerStyle), effect, and content.
  • the animation attribute group of the content category contains drawable elements, such as images, text, and shapes. Therefore, the animation attribute group of the content category is the drawable element attribute group.
  • the content contained in the animation attribute group can be described by the corresponding animation attribute value, or it can be said that the animation attribute value is an element in the animation attribute group.
  • the shape of the animation can be described by the two animation attribute values of path information and paint information.
  • the terminal may load and decode the animation file through the client to obtain the animation file object. Then, the terminal traverses the animation attribute value list on the animation file object, and combines the animation attribute values obtained from the traversal according to the corresponding animation attribute group to obtain the group drawing data. The terminal combines the group drawing data according to the corresponding animation layers to obtain the animation drawing data.
  • The terminal may first find the animation attribute value intervals that meet the static condition. Since the animation drawing data contains group drawing data, and the group drawing data contains animation attribute values, the terminal first calculates the intersection of the animation attribute value intervals within each animation attribute group, takes that intersection as the group drawing data interval that meets the static condition, and then uses the bubbling algorithm to calculate the animation drawing data interval that meets the static condition.
  • each instantaneous animation drawing data in the animation drawing data interval is the same.
  • a certain instantaneous animation drawing data in the animation drawing data interval has a corresponding relationship with the animation frame.
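The intersection of interval lists used throughout this computation can be implemented with a standard two-pointer sweep; this helper is a sketch under the assumption that intervals are closed frame ranges sorted by start:

```python
def intersect_intervals(a, b):
    """Pairwise intersection of two sorted lists of closed frame intervals."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo <= hi:
            out.append((lo, hi))
        # Advance whichever interval ends first.
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out
```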
  • the terminal after determining the animation drawing data interval that meets the static condition, obtains the initial animation drawing data in the animation drawing data interval, and then buffers the acquired initial animation drawing data.
  • the initial animation drawing data is instantaneous animation drawing data.
  • the terminal can obtain the animation drawing data of the first frame in the first ten frames, and then cache it.
  • During playback, the terminal detects the playback progress of the animation file in real time. If the current frame to be played is static, that is, the animation drawing data corresponding to the frame to be played meets the static condition, the terminal reads the initial animation drawing data corresponding to the frame to be played from the cache.
  • For example, suppose animation file A has 100 frames in total, and the animation drawing data of the first ten frames meets the static condition. In the process of playing animation file A, if the first frame has been played and the second frame needs to be played, then, since the animation drawing data of the first ten frames meets the static condition, the terminal only needs to read the animation drawing data of the first frame from the cache.
  • A corresponding flag may be set for each animation drawing data interval.
  • When the animation drawing data corresponding to the frame to be played meets the static condition during playback of the animation file, the initial animation drawing data is read from the cache according to the set flag, and the read initial animation drawing data corresponds to the frame to be played.
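One way to realize the per-interval flag is to keep the interval boundaries sorted and locate a frame's interval by binary search; `interval_starts` and `interval_ends` play the role of the flags here (an illustrative scheme, not the patent's exact data structure):

```python
import bisect

def cached_key_for(frame, interval_starts, interval_ends):
    """Map a frame to the first frame of its static interval, or None.

    interval_starts/interval_ends are parallel sorted lists marking each
    cached static interval.
    """
    i = bisect.bisect_right(interval_starts, frame) - 1
    if i >= 0 and frame <= interval_ends[i]:
        return interval_starts[i]  # read the cached initial data of this interval
    return None
```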
  • the terminal invokes a rendering tool to render the read initial animation rendering data to obtain visual animation rendering data, and then displays the animation rendering data on the screen.
  • The terminal calls WebGL (Web Graphics Library), or OpenGL ES (Open Graphics Library for Embedded Systems), for example OpenGL ES 2.0, to render the read initial animation drawing data, and then uploads the rendered animation rendering data to the screen.
  • For animation file A, if the first ten frames of the 100-frame animation are static, only the animation drawing data corresponding to the first frame needs to be cached when caching the animation drawing data.
  • When playing the second to tenth frames, the terminal only needs to obtain the animation drawing data corresponding to the first frame from the cache and render it; the animation rendering data for the second to tenth frames is thereby obtained and displayed on the screen to produce the animation of frames 2-10.
  • In the above embodiment, the initial animation drawing data of the animation drawing data interval that meets the static condition is cached. When the attribute values of the corresponding animation frame meet the static condition, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache, and there is no need to parse the animation file again to obtain the animation drawing data corresponding to the frame to be played, thereby avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother.
  • S204 may specifically include:
  • The animation file object includes various data related to the animation frames, such as the animation attribute values used to describe the animation, which are the smallest granularity of data.
  • the terminal parses the animation file in the target format through the client to obtain binary animation data.
  • the terminal reads the binary animation data bit by bit to obtain the animation file object.
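Bit-by-bit reading can be sketched as follows; the actual PAG "dynamic bit storage" layout is not specified in the text, so this only illustrates the mechanism of consuming a byte stream at bit granularity:

```python
class BitReader:
    """Minimal big-endian bit reader, illustrating bit-by-bit decoding."""

    def __init__(self, data: bytes):
        self.data = data
        self.pos = 0  # current bit position in the stream

    def read_bits(self, n: int) -> int:
        """Read the next n bits and return them as an unsigned integer."""
        value = 0
        for _ in range(n):
            byte = self.data[self.pos // 8]
            bit = (byte >> (7 - self.pos % 8)) & 1
            value = (value << 1) | bit
            self.pos += 1
        return value
```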
  • When more than one application in the client plays the same animation, the terminal decodes the animation file only once and caches the animation file object obtained by decoding; each application playing the same animation can then read the animation file object from the cache.
  • When the client needs to play the same animation, it only needs to load the animation file into memory once and decode it into an animation file object; the cached animation file object is then reused.
  • When more than one application in the client plays the same animation, the terminal decodes the animation file only once to obtain the animation file object, and caches the initial animation drawing data of the animation drawing data interval, read from the animation file object, that meets the static condition.
  • the initial animation drawing data corresponding to the frame to be played of each application can be read from the cache.
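Sharing one decoded file object across applications amounts to a keyed cache in front of the decoder; a minimal sketch, assuming the file path identifies the animation:

```python
_file_objects = {}  # animation file path -> decoded file object

def get_file_object(path, decode):
    """Decode each animation file once and share the object across players.

    `decode` stands in for the real decoder; the cache key is the path.
    """
    if path not in _file_objects:
        _file_objects[path] = decode(path)
    return _file_objects[path]
```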
  • the terminal traverses the animation attribute value list on the animation file object, and combines the animation attribute values obtained from the traversal according to the corresponding animation attribute group to obtain the group drawing data.
  • the terminal combines the group drawing data according to the corresponding animation layers to obtain the animation drawing data.
  • S306 Determine an animation drawing data interval that meets the static condition from the read animation drawing data.
  • The terminal may first find the animation attribute value intervals that meet the static condition. Since the animation drawing data contains group drawing data, and the group drawing data contains animation attribute values, the terminal first calculates the intersection of the animation attribute value intervals within each animation attribute group, takes that intersection as the group drawing data interval that meets the static condition, and then uses the bubbling algorithm to calculate the animation drawing data interval that meets the static condition.
  • The terminal determines the playback progress corresponding to each application; when the animation drawing data of the frame to be played at that playback progress meets the static condition, the terminal reads the cached initial animation drawing data that corresponds to the playback progress and is shared by more than one application.
  • In the above embodiment, the animation file object is obtained by decoding the animation file, the animation drawing data in the animation file object is read, and the animation drawing data interval in the animation file object that meets the static condition is then found, so that the initial animation drawing data of that interval is cached. When the attribute values of the corresponding animation frame meet the static condition, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache, avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother.
  • The animation attribute group has six types: transform, mask, trackMatte, layerStyle, effect, and content.
  • the animation attribute value is an element in the animation attribute group.
  • The mask can be drawn using path information and the mask mode; therefore, the path information and the mask mode are the animation attribute values of the mask.
  • Animation attribute values are values related to changes in time: for example, an animation attribute value increases or decreases proportionally with time, or changes with time along a Bezier curve, or does not change with time within a certain period.
  • animation attribute value interval that meets the static condition can be understood as the static interval of the animation attribute value.
  • the terminal searches the animation attribute group for animation attribute values that do not change with time within a period of time, and this period of time can be used as an animation attribute value interval that meets the static condition.
  • the above-mentioned time range may be a period of time with a specific time as a measurement unit, or a period of time with a frame as a measurement unit.
  • S404 Use the intersection of the animation attribute value intervals as the group drawing data interval of the animation attribute group.
  • the group drawing data interval can be understood as the static interval of the group drawing data, that is, the static interval of the animation attribute group.
  • The terminal calculates the intersection between the animation attribute value intervals, uses the intersection of the animation attribute value intervals as the group drawing data interval of the animation attribute group, and then executes S406, that is, calculates the animation drawing data interval of the animation layer using the bubbling algorithm.
  • the initial animation attribute value of each animation attribute value interval is cached.
  • If there is no intersection between the animation attribute value intervals, at least one animation attribute value in the animation attribute group changes with time, so the animation attribute group also changes with time; that is, the animation attribute group does not meet the static condition (it has no static interval). However, at least one animation attribute value in the group may still meet the static condition; in that case, the terminal finds the animation attribute value interval that meets the static condition, determines the initial animation attribute value of that interval, and caches it.
  • When the animation attribute group is a drawable element attribute group, the animation attribute group includes at least two drawable elements, and each drawable element includes at least two animation attribute values. The method may further include: the terminal determines the intersection of the animation attribute value intervals as the element interval of the drawable element, and calculates the intersection between the element intervals of the drawable elements.
  • S404 may specifically include: determining the intersection between the element intervals as the group drawing data interval of the animation attribute group.
  • The animation attribute group includes six types of animation attribute groups: transform, mask, trackMatte, layerStyle, effect, and content.
  • The content type of animation attribute group is an indispensable part of the animation layer. It represents the drawable elements of the animation layer, such as shape, text, solid, pre-composition (PreCompose), and image; the other groups are processed on the basis of content, for example pan and zoom, masking, and filter effects. Due to the diversity of content, specific drawable elements need to be cached.
  • S406 Determine the intersection of the group drawing data intervals as the animation drawing data interval of the animation layer.
  • the animation drawing data interval can be understood as the static interval of the animation drawing data, that is, the static interval of the animation layer.
  • the terminal calculates the intersection of each group of drawing data intervals, and determines the intersection of the group of drawing data intervals as the animation drawing data interval of the animation layer.
  • the animation file is composed of at least one animation layer, and one animation layer corresponds to one animation drawing data, and the animation drawing data is composed of at least one group of drawing data.
  • the corresponding animation can be obtained by rendering each animation drawing data and then on the screen.
  • the initial group of drawing data of each group of drawing data intervals is buffered.
  • the terminal uses the animation drawing data interval as the animation static interval.
  • The terminal calculates the intersection between the animation drawing data intervals of each animation layer, and uses the intersection between the animation drawing data intervals as the static interval of the animation.
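The bubbling computation intersects intervals level by level: attribute values within a group, groups within a layer, and layers within the file. A sketch under the assumption that each level is a list of sorted closed frame intervals:

```python
from functools import reduce

def intersect(a, b):
    """Intersection of two sorted lists of closed frame intervals."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if lo <= hi:
            out.append((lo, hi))
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

def bubble_up(layers):
    """layers: per-layer lists of attribute groups, each a list of
    per-attribute interval lists. Intersect within each group, then within
    each layer, then across layers, mirroring the bubbling described above."""
    layer_intervals = []
    for groups in layers:
        group_results = [reduce(intersect, attrs) for attrs in groups]
        layer_intervals.append(reduce(intersect, group_results))
    return reduce(intersect, layer_intervals)  # static interval of the file
```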
  • The pre-composition attribute group can be regarded as a sub-animation file, and the sub-animation file can include at least one animation layer and/or nested sub-pre-composition attribute groups.
  • For the static intervals of the animation layers, animation attribute groups, and animation attribute values in the pre-composition attribute group, reference may be made to the above-mentioned embodiments.
  • [t1, t2] represents the static interval of transform, that is, the animation layer has not undergone any transformation (such as no translation, scaling, rotation, etc.);
  • [m1, m2] represents the static interval of the mask, that is, the mask of the animation layer does not change during [m1, m2];
  • [c1,c2] and [c3,c4] represent the static intervals of content, that is, the animation layer applies the same text and the same image: the text does not change during [c1,c2], and the image does not change during [c3,c4].
  • t1, t2, m1, m2, c1, c2, c3, and c4 can be represented by frames.
  • t1 can represent the first frame of the animation
  • m1 can represent the third frame of the animation
  • t2 can represent the first frame of the animation.
  • [c1,c2] and [c3,c4] are the static intervals of content.
  • When playing an animation file, the terminal only needs to parse the text at time c1 and the image at time c3 once, and caches and marks the parsed text and image.
  • If the frame that needs to be drawn during animation playback is frame c and c belongs to the interval [c1, c2], the terminal does not need to re-parse the text of frame c; it can directly obtain the group drawing data of frame c1 from the cache and render the text according to that group drawing data, and the rendered text is exactly the same as the text of frame c.
  • In this way, the animation attribute value intervals that meet the static condition are determined first, and the animation drawing data intervals are then calculated using the bubbling algorithm, so that the initial animation drawing data of each animation drawing data interval can be cached. When the corresponding animation is played, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache, avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother.
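The cache hit described above can be illustrated with the following sketch; the class and the stored data are hypothetical, standing in for whatever structure the terminal actually uses:

```python
class StaticIntervalCache:
    """Maps inclusive static frame intervals to their initial drawing data."""

    def __init__(self):
        self._entries = []  # list of ((start, end), initial_data)

    def put(self, interval, initial_data):
        self._entries.append((interval, initial_data))

    def lookup(self, frame):
        for (start, end), data in self._entries:
            if start <= frame <= end:
                return data  # cache hit: data is identical across the interval
        return None  # cache miss: caller must parse from the file object

# Parse once at the start of the static interval, then reuse for every
# frame that falls inside it.
cache = StaticIntervalCache()
cache.put((24, 36), "text parsed at frame 24")
```

A lookup for frame 30 returns the data cached at frame 24; a lookup for frame 40 misses and falls back to parsing the animation file object.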
  • the method further includes:
  • the animation attribute value interval can be understood as the static interval of an animation attribute value. If there is no intersection between the animation attribute value intervals, at least one animation attribute value in the animation attribute group changes with time, so the animation attribute group also changes with time; that is, the animation attribute group has no static interval. However, at least one animation attribute value in the animation attribute group meets the static condition, so the animation attribute value interval that meets the static condition is found, and the initial animation attribute value of that interval is cached.
  • the animation drawing data interval can be understood as the static interval of the animation drawing data, that is, the static interval of the animation layer.
  • the animation drawing data corresponding to the frame to be played does not meet the static condition, and there is no intersection between the animation attribute value intervals, which means that the animation drawing data corresponding to the frame to be played does not hit the corresponding static interval, and the group drawing data corresponding to the frame to be played also misses its static interval.
  • When the animation file is played, if the animation drawing data corresponding to the frame to be played hits the static interval, S208 is executed. If the animation drawing data corresponding to the frame to be played does not hit the static interval, the terminal continues to search at a finer granularity within the animation drawing data (i.e., the group drawing data) to find whether the group drawing data corresponding to the frame to be played hits its static interval. If it hits, the initial group drawing data corresponding to the frame to be played is obtained from the cache, and the group drawing data that corresponds to the frame to be played and lies in a non-static interval is parsed from the animation file object, so that the terminal can perform animation rendering according to the obtained initial group drawing data and the parsed group drawing data.
  • If the group drawing data corresponding to the frame to be played does not hit its static interval, the terminal continues to search at a finer granularity within the group drawing data (that is, the animation attribute values) to find whether the animation attribute value corresponding to the frame to be played hits its static interval. If it hits, the initial animation attribute value corresponding to the frame to be played is obtained from the cache, and then S606 is executed.
  • Due to the existence of static intervals, all animation layers need to be traversed when a certain frame is played. If the frame to be played hits the static interval of an animation layer, then for that layer the data of every frame in the entire static interval is the same, so the corresponding initial drawing data can be obtained directly from the cache. If it does not hit the static interval of the animation layer, the terminal traverses all the groups of the animation layer; if it hits the static interval of a group, the terminal can directly use the initial group drawing data of that static interval. In addition, the group drawing data that corresponds to the frame to be played and lies in a non-static interval is parsed from the animation file object.
  • If it does not hit the static interval of a group, the terminal traverses all the animation attribute values of the group. If it hits the static interval of an animation attribute value, the terminal directly uses the initial animation attribute value of that static interval; in addition, S606 is also executed.
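The layer → group → attribute-value fallback described above can be sketched as follows. This is a minimal illustration under the assumption that static intervals with their cached values are stored as `((start, end), value)` pairs; `parse` stands in for parsing the animation file object on a miss:

```python
def lookup(intervals, frame):
    """Return the cached value if frame falls in a static interval, else None.

    intervals: list of ((start, end), value) pairs (inclusive frames)."""
    for (start, end), value in intervals:
        if start <= frame <= end:
            return value
    return None

def resolve_layer(frame, layer, parse):
    """Resolve drawing data for one layer at `frame`, coarse to fine."""
    # 1) layer-level static interval: whole layer unchanged
    hit = lookup(layer["static"], frame)
    if hit is not None:
        return hit
    # 2) group-level static intervals
    parts = []
    for group in layer["groups"]:
        hit = lookup(group["static"], frame)
        if hit is not None:
            parts.append(hit)
            continue
        # 3) attribute-value-level static intervals; parse the rest
        values = []
        for attr in group["attrs"]:
            hit = lookup(attr["static"], frame)
            values.append(hit if hit is not None else parse(frame, attr["name"]))
        parts.append(values)
    return parts
```

Only the attribute values that miss at every level are parsed from the animation file object; everything else is served from the cache.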
  • S606 Obtain an animation attribute value corresponding to the frame to be played and that does not meet the static condition from the animation file object obtained by decoding the animation file.
  • When the initial animation attribute value corresponding to the frame to be played is read, the terminal also parses, from the animation file object, the animation attribute value that corresponds to the frame to be played and lies in a non-static interval. The terminal can then perform animation rendering according to the read initial animation attribute value and the parsed animation attribute value.
  • S608 Perform animation rendering according to the read initial animation attribute value and the acquired animation attribute value.
  • its animation drawing data may be composed of animation attribute values in the static interval and animation attribute values in the non-static interval.
  • the initial animation attribute value corresponding to the frame to be played can be directly read from the cache, without parsing from the animation file object.
  • the initial animation attribute value read by the terminal and the acquired animation attribute value are converted into animation drawing data, and then the animation rendering is performed according to the animation drawing data.
  • the initial animation attribute value of the animation attribute value in the static interval is cached.
  • the initial animation attribute value corresponding to the frame to be played is read from the cache, and there is no need to parse the animation attribute values belonging to the static interval from the animation file object, thereby reducing the amount of calculation and saving time in the rendering process, which makes animation playback smoother.
  • the method further includes:
  • the animation attribute value interval can be understood as the static interval of an animation attribute value. If there is an intersection between the animation attribute value intervals, none of the animation attribute values in the animation attribute group changes with time, so the animation attribute group has a static interval. If the animation layer itself does not have a static interval, the terminal caches the initial group drawing data of each group drawing data interval.
  • the animation drawing data interval can be understood as the static interval of the animation drawing data, that is, the static interval of the animation layer.
  • the animation drawing data corresponding to the frame to be played does not meet the static condition, and there is an intersection between the animation attribute value intervals, which means that the animation drawing data corresponding to the frame to be played does not hit the corresponding static interval, but at least a part of the group drawing data corresponding to the frame to be played hits its static interval.
  • When the animation file is played, if the animation drawing data corresponding to the frame to be played hits the static interval, S208 is executed. If the animation drawing data corresponding to the frame to be played does not hit the static interval, the terminal continues to search at a finer granularity within the animation drawing data (i.e., the group drawing data) to find whether the group drawing data corresponding to the frame to be played hits its static interval. If it hits, the initial group drawing data corresponding to the frame to be played is obtained from the cache, and then S706 is executed.
  • S706 Acquire group drawing data corresponding to the frame to be played and that does not meet the static condition from the animation file object obtained by decoding the animation file.
  • When the initial group drawing data corresponding to the frame to be played is read, the terminal also parses, from the animation file object, the group drawing data that corresponds to the frame to be played and lies in a non-static interval. The terminal can then perform animation rendering according to the read initial group drawing data and the parsed group drawing data.
  • S708 Perform animation rendering according to the read initial group drawing data and the acquired group drawing data.
  • the animation drawing data may consist of group drawing data in the static interval and group drawing data in the non-static interval.
  • For the group drawing data in the static interval, the initial group drawing data corresponding to the frame to be played can be read directly from the cache without parsing from the animation file object.
  • the initial group drawing data read by the terminal and the acquired group drawing data are converted into animation drawing data, and then animation rendering is performed according to the animation drawing data.
  • the initial group drawing data of the group drawing data with the static interval is cached.
  • the initial group drawing data corresponding to the frame to be played is read from the cache, and there is no need to parse the group drawing data belonging to the static interval from the animation file object, thereby reducing the amount of calculation and saving time in the rendering process, which makes animation playback smoother.
  • S208 may specifically include:
  • an animation file played by more than one application may mean that multiple places in the client need to play the same animation.
  • For example, a video played in the client's video player contains an animation, and a certain position of the display page outside the video player also needs to play the same animation (for example, when the video player is not in full-screen mode and is fixed at the top of the display, and the display page is scrolled to the bottom).
  • When more than one application in the client plays the same animation, the terminal decodes the same animation file only once and caches the animation file object obtained by decoding. When more than one application plays the same animation, the animation file object can be read from the cache.
  • When the client needs to play the same animation in multiple places, it only needs to load the same animation file into memory once and decode it into an animation file object.
  • The cached animation file object can then be reused.
  • When more than one application in the client plays the same animation, the terminal decodes the same animation file only once to obtain the animation file object, and for each animation drawing data interval that meets the static condition read from the animation file object, the initial animation drawing data of the interval is cached; the cached initial animation drawing data can be shared when the animation is played by multiple applications.
  • the terminal can read the initial animation drawing data corresponding to the frame to be played of each application from the cache.
  • When more than one application in the client plays the same animation, the terminal decodes the same animation file only once to obtain the animation file object. If the animation drawing data read from the animation file object does not meet the static condition, it is judged whether the group drawing data of the animation attribute group meets the static condition. If so, the initial group drawing data that meets the static condition is cached, and the cached initial group drawing data can be shared when the animation is played by multiple applications. When more than one application plays the same animation, the terminal can read the initial group drawing data corresponding to each application's frame to be played from the cache.
  • When more than one application in the client plays the same animation, the terminal decodes the same animation file only once to obtain the animation file object. If neither the animation drawing data read from the animation file object nor the group drawing data of the animation attribute group meets the static condition, it is judged whether the animation attribute value meets the static condition. If so, the initial animation attribute value that meets the static condition is cached, and the cached initial animation attribute value can be shared when the animation is played by multiple applications. When more than one application plays the same animation, the terminal can read the initial animation attribute value corresponding to each application's frame to be played from the cache.
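The decode-once behavior can be sketched as follows. This is a minimal illustration assuming a simple dictionary keyed by file path; `decode` is a stand-in for the real decoding step, whose implementation is not specified at this level of detail:

```python
# Decode-once cache: the same animation file is decoded a single time and
# the resulting animation file object is shared by every player.
_file_objects = {}

def get_file_object(path, decode):
    """Return the cached animation file object for `path`, decoding once."""
    if path not in _file_objects:
        _file_objects[path] = decode(path)
    return _file_objects[path]
```

Two players asking for the same file receive the identical cached object, so the file is loaded and decoded only once regardless of how many places play it.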
  • the animation drawing data corresponding to the frame to be played at the current playback progress meets the static condition; that is, the animation drawing data corresponding to that frame has a static interval.
  • When multiple applications play the same animation, the terminal separately records the playback progress of the animation in each application, and then determines, according to the playback progress, whether the animation drawing data corresponding to the frame to be played hits the static interval. If it hits the static interval of the animation drawing data, the terminal reads the cached initial animation drawing data that corresponds to the playback progress and is shared by more than one application. If it misses the static interval of the animation drawing data, the terminal traverses the group drawing data in the animation layers that constitute the animation drawing data; if it hits the static interval of the group drawing data, the terminal directly uses the initial group drawing data of that static interval. If it misses the static interval of the group drawing data, the terminal traverses the animation attribute values in the animation attribute groups that constitute the group drawing data; if it hits the static interval of an animation attribute value, the terminal directly uses the initial animation attribute value.
  • S806 Render the read initial animation drawing data shared by more than one application in sequence to obtain animation rendering data corresponding to each application.
  • The terminal, on the one hand, reads the initial animation attribute value shared by more than one application and, on the other hand, parses the animation attribute values of the non-static intervals from the shared animation file object; the read initial animation attribute value and the parsed animation attribute values are converted into animation drawing data, which is then rendered.
  • The terminal, on the one hand, reads the initial group drawing data shared by more than one application and, on the other hand, parses the group drawing data of the non-static intervals from the shared animation file object; the read initial group drawing data and the parsed group drawing data are combined into the animation drawing data, which is then rendered.
  • When multiple applications of the client play the same animation, they share the animation drawing data parsed from the same animation file as well as the cached initial animation drawing data, which reduces the cache space on the one hand and reduces the amount of parsing computation on the other.
  • When the animation drawing data corresponding to a frame to be played meets the static condition, the initial animation drawing data corresponding to that frame is obtained from the cache, and there is no need to parse the animation drawing data belonging to the static interval from the animation file object, thereby reducing the amount of calculation and saving time in the rendering process, which makes animation playback smoother.
  • animation rendering data may also be cached.
  • the animation file includes a vector diagram; as shown in Figure 9, the method can also include:
  • Vector graphics, also called object-oriented images, represent an image with geometric primitives based on mathematical equations, such as points, straight lines, or polygons, and are not distorted when zoomed in, zoomed out, or rotated.
  • the text in the animation (such as graphic text) also belongs to the vector diagram.
  • Generating animation rendering data requires computation, especially for vector diagrams described by complex path information.
  • The point coordinate information in the path information and the path description information between points must be parsed one by one before the path information can be converted into a vector diagram.
  • the same is true for the description information of the text.
  • the animation drawing data cache stores only the path information and paint information of the text, which is computationally expensive and time-consuming to render each time. Therefore, when playing an animation, the animation drawing data is first rendered into animation rendering data and then cached, so that the corresponding animation rendering data can be read directly from the cache when needed.
  • When it is determined that the animation file includes a vector diagram, the terminal obtains the animation drawing data of the vector diagram obtained by decoding the animation file; or, when the animation drawing data of the vector diagram meets the static condition, the terminal determines the animation drawing data interval that meets the static condition and obtains the initial animation drawing data from it.
  • S904 Perform off-screen rendering on the animation drawing data to obtain animation rendering data.
  • the animation rendering data may be image textures.
  • S904 may specifically include: the terminal performs off-screen rendering on the decoded animation drawing data to obtain the animation rendering data; or, the terminal performs off-screen rendering on the initial animation drawing data of the animation drawing data interval to obtain the animation rendering data.
  • the terminal determines the size of the animation rendering data before caching the animation rendering data.
  • the animation rendering data is compressed under the premise of ensuring animation quality, reducing its size before caching, which reduces the cache footprint.
  • When the frame to be played during playback is a vector-graphics animation frame, the animation rendering data corresponding to the frame to be played is read from the cache; or, when the frame to be played is a vector-graphics animation frame and the animation drawing data corresponding to the frame to be played meets the static condition, the animation rendering data that corresponds to the frame to be played and was rendered from the initial animation drawing data of the animation drawing data interval is read from the cache.
  • S904 may specifically include:
  • S1002 Determine the size of the outer container used to display the vector diagram.
  • the external container may be an external container view (View) used to display a vector diagram.
  • the image container corresponding to the area where the vector diagram is displayed on the mobile phone can be called the external container.
  • the actual display size of the vector diagram depends on the size of the external container. Because the animation rendering data is cached, its size has already been determined. To ensure the clarity of the animation, the cached animation rendering data can be sized at the maximum zoom ratio, so that when the cached animation rendering data is applied in a smaller scene, it can be compressed rather than stretched, which effectively ensures the clarity of the animation.
  • At design time, the size of the external container for displaying the vector diagram can be preset according to the animation size on each terminal.
  • the terminal can determine the size of the corresponding external container for displaying the vector diagram from the preset size according to the size of the terminal itself.
  • S1004 Determine a scaling ratio of the size of the animation drawing data to the size of the outer container.
  • S1004 may specifically include: determining the first size ratio between the internal container carrying the animation frame and the external container; determining the second size ratio between the animation layer and the internal container; determining the third size ratio between the animation drawing data in the drawable element attribute group and the animation layer; and determining the zoom ratio according to the first size ratio, the second size ratio, and the third size ratio.
  • The terminal first obtains the first size ratio S1 between the internal container carrying the animation frame and the external container; it then traverses each layer, taking out the second size ratio S2 of the layer relative to the internal container; it then traverses each child node of the tree structure in turn until the content is found, taking out the original width and height of the content node and the third size ratio S3 relative to the layer node; finally, the zoom ratio of the content relative to the external container is calculated as the cumulative product of S1, S2, and S3.
  • the node tree is composed of external container, internal container, animation layer, drawable element attribute group and animation drawing data in the drawable element attribute group.
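The cumulative multiplication of the size ratios along the node tree can be illustrated as follows; this is a sketch of the arithmetic only, not the actual tree traversal:

```python
def zoom_ratio(ratios):
    """Multiply the per-level size ratios along the node tree.

    ratios: size ratios from the external container down to the content,
    e.g. [S1, S2, S3] as described above."""
    result = 1.0
    for r in ratios:
        result *= r
    return result
```

For example, with S1 = 0.5, S2 = 2.0, and S3 = 0.75, the content's zoom ratio relative to the external container is 0.5 × 2.0 × 0.75 = 0.75.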
  • When the size of a node in the node tree changes, the changed zoom ratio is obtained; the node tree consists of the external container, the internal container, the animation layer, the drawable element attribute group, and the animation drawing data in the drawable element attribute group. The size of the cached animation rendering data is adjusted according to the changed zoom ratio; or, an input zoom ratio is obtained, and the size of the cached animation rendering data is adjusted according to the input zoom ratio.
  • When playing an animation, the terminal hangs the PAG view (PAGView) that displays the animation on other nodes, such as the last leaf node.
  • When the size of these parent nodes changes, the PAGView is notified, and the changed zoom ratio value is applied to the cache of the animation rendering data, which ensures that the cached animation rendering data is optimal while maintaining clarity.
  • S1008 In the off-screen buffer, perform animation rendering on the animation drawing data according to the size and zoom ratio of the external container to obtain the animation rendering data.
  • In order to synchronize the display with the video controller, the display sends a horizontal synchronization signal when the electron gun is ready to scan a new line.
  • the refresh frequency of the display is the frequency at which the synchronization signal is generated.
  • the processor calculates parameter values such as animation frame, animation width and height, and passes the calculated parameter values to the graphics card for rendering to obtain animation rendering data, and the animation rendering data rendered by the graphics card is put into the off-screen buffer.
  • the video controller reads the animation rendering data in the off-screen buffer line by line according to the synchronization signal, and transmits it to the display for display after digital-to-analog conversion.
  • The whole process of off-screen rendering requires context switching: the context is first switched from the current screen (On-Screen) to off-screen (Off-Screen); after the off-screen rendering ends, to display the animation rendering data in the off-screen buffer on the screen, the terminal needs to switch the context from off-screen back to the current screen.
  • the animation drawing data of the vector diagram is pre-rendered, which avoids having to convert the animation drawing data into animation rendering data during playback, thereby effectively reducing the time spent in the rendering process and improving the smoothness of animation playback.
  • the method further includes:
  • When the animation file includes a pre-composition attribute group in which multiple animation layers are composed, and the vector diagrams contained in the animation layers are the same but differ in size, the terminal obtains the animation rendering data corresponding to the largest vector diagram.
  • S906 may specifically include: S1104, buffering the animation rendering data corresponding to the vector diagram with the largest size.
  • an animation has three animation layers, and the same image appears in each animation layer. Since the three animation layers all contain the same image, only one copy is needed.
  • the size of the animation rendering data cache needs to account for the zoom of each animation layer and select the largest zoom ratio value, so that the cached animation rendering data is stored at the maximum zoom ratio; when the data is drawn on a smaller container or layer, the cached animation rendering data can be compressed, ensuring clarity.
  • the animation rendering data corresponding to the vector with the largest size is cached.
  • Otherwise, three copies of the animation rendering data would have to be cached at the same time; caching only one copy reduces the cache space.
  • the maximum size of the animation rendering data is cached, so as to avoid the problem of image definition reduction caused by stretching the vector diagram during display.
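Selecting the largest zoom ratio for the single cached copy can be illustrated as follows (the function and parameter names are invented for the example):

```python
def cache_size_for(base_width, base_height, layer_zooms):
    """Size of the single cached texture for a vector diagram that appears
    in several layers at different zoom ratios.

    Caching at the largest ratio means every smaller use downscales
    (compresses) the texture rather than stretching it."""
    s = max(layer_zooms)
    return (round(base_width * s), round(base_height * s))
```

For a 100×50 diagram used at zooms 0.5, 1.0, and 2.0, the cached copy is 200×100; drawing it at the smaller zooms only ever shrinks it, so clarity is preserved.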
  • S906 may specifically include:
  • the first target animation rendering data contains part of the animation rendering data.
  • the first target animation rendering data is part of the data in the five-pointed star and clouds.
  • Since animation rendering data can be understood as an image texture, it has a size. The animation area may therefore contain only part of the animation rendering data. If the non-animation-rendering-data area within the animation area is large, the animation area needs to be optimized to avoid the additional computation that may be consumed during rendering.
  • the second target animation rendering data includes all data in the animation rendering data, that is, the first target animation rendering data belongs to a part of the second target animation rendering data.
  • the non-animated rendering data area may be a blank area or an invalid area.
  • A rectangular frame can be used to enclose the second target animation rendering data; when that rectangle is the smallest rectangle enclosing the data, it is used as the minimum animation area, as shown in the dotted area in Figure 13.
  • S1206 Determine the intersection area between the acquired animation area and the minimum animation area.
  • S1208 Determine the intersection area as the animation area of the pre-composite attribute group.
  • When the intersection area between the animation area and the minimum animation area is determined, the terminal uses the intersection area as the animation area of the pre-composition attribute group.
  • The terminal optimizes the boundary of the pre-composition attribute group by calculating the smallest rectangular area containing all the animation rendering data, and then finding the intersection between this rectangle and the bounding rectangle of the pre-composition attribute group to obtain the real boundary of the pre-composition attribute group.
  • the animation file contains only one layer A and one attribute group related to pre-compositing.
  • the boundary of the pre-compositing attribute group is a thin black rectangular box C, and the actual content area is a part of the five-pointed star and clouds. The other part of the five-pointed star and the clouds are outside the rectangular frame.
  • the optimization process can include: first calculating the smallest rectangle containing the actual content area to obtain the dotted rectangle A in the figure; then finding the intersection between the dotted rectangle A and the boundary of the pre-composition attribute group (i.e., the thin black rectangle C) to obtain the thick black rectangle B, which is the real animation area that needs to be rendered and displayed.
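The boundary optimization amounts to a minimum-bounding-rectangle computation followed by a rectangle intersection, which can be sketched as follows (rectangles are modelled as `(left, top, right, bottom)` tuples, an assumption made for the example):

```python
def min_bounding_rect(points):
    """Smallest rectangle enclosing all content points (rectangle A)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def rect_intersection(a, c):
    """Intersection of rectangle A with the pre-composition boundary C,
    giving the real animation area B; None if they do not overlap."""
    left, top = max(a[0], c[0]), max(a[1], c[1])
    right, bottom = min(a[2], c[2]), min(a[3], c[3])
    if left >= right or top >= bottom:
        return None  # nothing inside the boundary needs to be drawn
    return (left, top, right, bottom)
```

If part of the content (the star and clouds in the example) lies outside the boundary C, the intersection B excludes it, so no computation is spent rendering or scanning the invalid area.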
  • When the animation area of the pre-composition attribute group contains a large invalid area, the animation area needs to be re-determined to ensure that the animation rendering data is included in the animation area while the invalid area is minimized, which avoids the additional computation that may be consumed during rendering and avoids scanning the invalid area when drawing to the screen, reducing the time spent on screen display.
  • Lottie animation uses json (JavaScript Object Notation, JS object notation) format to store all the information of the animation.
  • To play the animation, the animation file is first loaded into memory and parsed according to the json format; during playback, the playback moment is located and the instantaneous animation information is taken out, that is, the content that needs to be displayed at that instant (such as images or text) is found from the parsed json file.
  • this instantaneous display content needs to be converted into rendering data that can be drawn on the screen.
  • the memory data of the same animation file is not reused. If multiple identical animations appear on the screen to be played simultaneously, the same animation file will be loaded multiple times.
  • the data at a certain moment of the animation is analyzed and then converted into the data to be drawn, but the static interval of the drawing data is not considered. Therefore, the initial drawing data in the static interval is not reused in the conversion from the memory data to the drawing data.
  • the embodiment of the present invention proposes a three-level cache animation rendering solution, which includes animation file cache, drawing data cache, and rendering data cache.
  • three cache methods are described, as follows:
  • decoding time can be saved.
  • the animation file is decoded only once, which makes decoding less time-consuming and avoids the additional time spent decoding the animation file into memory for each playback.
• the so-called static interval is easiest to understand by example: suppose a PAG animation lasts 2 s in total at 24 frames per second, and nothing changes between 1 s and 1.5 s, that is, the animation is static from 1 s to 1.5 s; the static interval of this animation is then [24, 36], in units of frames
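The seconds-to-frames conversion in the example above can be sketched as follows (the function name is illustrative):

```python
def to_frame_interval(start_seconds, end_seconds, frame_rate):
    """Convert a static time span in seconds into an inclusive frame interval."""
    return (round(start_seconds * frame_rate), round(end_seconds * frame_rate))

# the example from the text: static from 1 s to 1.5 s at 24 fps
interval = to_frame_interval(1.0, 1.5, 24)
```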
  • Each frame of animation in an animation file can be superimposed by multiple layers, and the layers are composed of smaller groups.
  • the final group is described by a series of attributes, including timeline attributes and fixed attributes.
• for a timeline attribute, the intervals of the Hold type can be found, which are the static intervals of that attribute; for a fixed attribute, the entire time interval is a static interval; after finding the static intervals of the timeline attributes, bubble upward again: if a group contains several timeline attributes and their static intervals have an intersection, no attribute value of the group changes over that intersection, so the intersection is a static interval of the group
• the groups that compose a layer fall into 6 categories, namely transform, mask, trackMatter, layerStyle, effect and content; take transform as an example: it is composed of multiple keyframes (Keyframe&lt;Transform2D&gt;), each Keyframe being a record of the layer's translation, scaling, rotation and/or transparency change information saved in the PAG animation file; when parsing the PAG animation file, these Keyframes are taken out to restore the animation track of the layer; the data structure finally stored in these keyframes is Transform2D, which is itself described by timeline attributes
• the Transform2D attributes are the anchor point (anchorPoint), position, x-axis coordinate position (xPosition), y-axis coordinate position (yPosition), scale, rotation and transparency (opacity), as shown in Figure 14
  • position, xPosition, and yPosition describe translation information; anchorPoint and scale describe zoom information; anchorPoint and rotation describe rotation information; and opacity describes transparency information.
• the static intervals can be calculated directly, or obtained by searching for the non-static intervals: once the non-static intervals are found, they are filtered out to leave the static intervals of the entire animation
• obtaining the static intervals by searching for the non-static intervals can specifically include: starting from the root composition, with the entire animation spanning n frames from the first frame to the last (i.e. [1, n]); then traversing all layers or sub-compositions in turn, traversing each group of each layer, and directly removing each group's non-static intervals from [1, n], so that [1, n] is cut into many small interval segments
• the static interval of a layer is the intersection of the static intervals of all its groups; similarly, the intersection of the static intervals of all layers is the static interval of the composition, and this bubbles up layer by layer until the static interval of the root composition (that is, of the entire animation) is obtained
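The "cut the non-static ranges out of [1, n]" step above can be sketched in Python; the frame numbers in the example are illustrative:

```python
def static_intervals(total_frames, non_static):
    """Cut the non-static [a, b] frame ranges (inclusive) out of [1, n]."""
    segments = [(1, total_frames)]
    for a, b in sorted(non_static):
        remaining = []
        for start, end in segments:
            if b < start or a > end:      # no overlap, keep segment as-is
                remaining.append((start, end))
                continue
            if start < a:                 # piece left of the cut survives
                remaining.append((start, a - 1))
            if b < end:                   # piece right of the cut survives
                remaining.append((b + 1, end))
        segments = remaining
    return segments

# a 48-frame animation with two non-static ranges
result = static_intervals(48, [(10, 20), (30, 35)])
```

Each removal splits the surviving segments further, producing exactly the "many small interval segments" the text describes.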
  • Fig. 5 is a schematic diagram of finding the intersection of static intervals, where t1, t2, m1, m2, c1, c2, c3, and c4 are in units of frames.
• the three groups of the layer over the whole animation cycle are shown in Figure 5
• in the interval [t1, t2], the layer does not undergo any transformation such as translation, scaling or rotation; in the interval [m1, m2], the layer uses the same mask, and the mask does not change
• in the intervals [c1, c2] and [c3, c4], the layer applies a text and a picture respectively, and neither the text nor the picture changes in any way
  • [c1,c2] and [c3,c4] are the static intervals of the content.
• if the frame to be drawn during playback is c, and c lies in the interval [c1, c2], there is no need to re-parse the text of frame c
• the layer is composed of multiple groups, and its static interval is the intersection of the static intervals of all groups under the layer; as shown in Figure 5, if a layer consists of only these 3 groups, then the layer has two static intervals: [m1, c2] and [c3, t2]
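Intersecting the static intervals of several groups can be sketched as a sorted-interval sweep; the concrete frame numbers below are illustrative stand-ins for the t1/t2, m1/m2, c1..c4 values of Figure 5:

```python
def intersect_lists(a, b):
    """Intersect two sorted lists of inclusive [start, end] frame intervals."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo, hi = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if lo <= hi:
            out.append((lo, hi))
        # advance whichever current interval ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

transform = [(1, 100)]            # static over the whole illustrative range
mask = [(10, 90)]
content = [(20, 40), (60, 80)]    # two static intervals, as in Figure 5
layer = intersect_lists(intersect_lists(transform, mask), content)
```

The same function, applied across layers instead of groups, gives the composition's static intervals as the text's bubbling step describes.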
  • the initial static interval is the first frame to the last frame.
• S1514: select the Hold-type intervals of the timeline attributes describing the group, and then judge whether there is a layer in the sub-composition
  • the drawing data cache refers to caching the relevant part of the PAG file (PAGFile) object.
  • all the static sections can be obtained by the above method of dividing the static section.
  • parsing the PAGFile object can get the initial drawing data of the static interval, and then buffer the initial drawing data.
  • different caching strategies are made for different layers, different groups, and smaller granularities.
  • a layer can be composed of any one or a combination of 6 types of groups. The caching strategy of each type of group is described in turn:
  • Content is an indispensable part of the layer, which represents the drawable elements of the layer, such as pictures, text, and shapes. Almost all other groups do some processing on the basis of Content, such as pan and zoom, masking, and filter effects.
  • the elements that Content can represent are roughly divided into: shape, text, solid, PreCompose and image. Due to the diversity of Content, it is necessary to cache specific elements. For these 5 elements, different data contents are cached separately:
  • shape represents the shape, which can be a regular circle, rectangle, five-pointed star, etc., or it can be an irregular shape.
  • the shape cached here is a custom data structure.
  • the content of the custom data structure is ShapePaint.
  • the ShapePaint contains the two attribute values of the drawn path information and paint information.
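A hypothetical shape of the cached structure described above: a ShapePaint holding the drawn path information and the paint information. The field contents (command tuples, paint dict keys) are illustrative assumptions, not the patent's actual data layout.

```python
from dataclasses import dataclass

@dataclass
class ShapePaint:
    path: list    # drawing commands, e.g. [("moveTo", x, y), ("lineTo", x, y)]
    paint: dict   # fill/stroke description, e.g. {"fill": "#FF0000"}

cached = ShapePaint(
    path=[("moveTo", 0.0, 0.0), ("lineTo", 10.0, 0.0), ("close",)],
    paint={"fill": "#FF0000", "stroke_width": 2.0},
)
```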
• text represents the text content, for example a scrolling line of prompts in an animation; such prompts use text
  • the information required for text rendering is similar to shape, and both path information and paint information are also required.
• a font type can be set for text; for example, the fonts in one animation may come from different typefaces, so the font type information also needs to be recorded as an attribute value
• Solid is a special case of shape: it represents a solid filled rectangle, described by the rectangle's width, height and fill color, so these three attribute values can be cached; solid also has many application scenarios, such as graphics in simple vector animations, masks for layers, or pre-composed masks
  • the image represents the picture.
• the specific pixel information of the picture needs to be obtained, so the picture pixel information is cached as an attribute value; in addition, the displayed size of the picture changes when zooming, with its width and height compressed or stretched accordingly, so the original width and height of the image must also be cached as attribute values
  • PreCompose means precomposition, which is composed of one or more layers, and other compositions can also be nested.
  • PreCompose is used for sub-composition applications.
  • the layers included in PreCompose are traversed, and then the layer's caching strategy is used for caching.
• an animation in fact changes along the time axis, and most of the changes consist of at least one of translation, zooming, rotation or fading
• taking a rectangle's transform as an example: if it is a translation animation, the initial position of the rectangle can be determined (position, recording coordinates x and y), and the x-axis and y-axis translation distances xPosition and yPosition (floating-point distances) recorded; from the coordinates x, y and the distances, a horizontal motion picture can be restored; if it is a zoom animation, an anchor point (anchorPoint) is determined first and the scale information added to restore it completely; if it is a rotation animation, an anchor point is determined first and the rotation information added to restore the rotation animation; if it is a fading animation, the opacity information can be recorded
  • the changes in any 2D space can be recorded with a small amount of information.
  • the information recorded in translation, zoom and rotation can be recorded using a matrix, and changes in transparency can be recorded using opacity.
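A sketch of packing translation, zoom and rotation into one matrix, as the text describes. The composition order (scale and rotate about the anchor, then move the anchor to the position) is one plausible convention, not necessarily the patent's exact definition:

```python
import math

def transform_matrix(anchor, position, scale, rotation_deg):
    """3x3 row-major affine matrix: scale and rotate about the anchor point,
    then translate so the anchor lands on `position` (assumed ordering)."""
    ax, ay = anchor
    px, py = position
    sx, sy = scale
    r = math.radians(rotation_deg)
    c, s = math.cos(r), math.sin(r)
    return [
        [sx * c, -sy * s, px - (ax * sx * c - ay * sy * s)],
        [sx * s,  sy * c, py - (ax * sx * s + ay * sy * c)],
        [0.0, 0.0, 1.0],
    ]

def apply(m, x, y):
    """Apply the matrix to a point."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

m = transform_matrix((0, 0), (3, 4), (1, 1), 0)       # pure translation
m2 = transform_matrix((5, 5), (10, 20), (2, 2), 90)   # rotate/scale about anchor
x, y = apply(m2, 5, 5)                                # the anchor point itself
```

Opacity is the one change a 2D matrix cannot carry, which is why the text records it separately.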
  • Mask represents the mask in a layer.
  • the path information and mask mode are used to achieve the rendering.
  • a rectangular mask essentially needs to record the path information of the rectangle, that is, the vertex information of the rectangle, so as to restore the mask rectangle, so the path information and mask mode need to be cached.
• the mask mode determines the final display, for example: Add mode means the mask is directly added to the display; Subtract mode means the part covered by the mask is subtracted; Intersect mode means the intersection of the original graphics and the mask is displayed; Difference mode means the non-intersecting part of the original image and the mask is displayed
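Modeling pixel coverage as sets of coordinates, the four modes map naturally onto set operations. This is purely illustrative of the mode semantics, not how a real rasterizer composites masks:

```python
def apply_mask(original, mask, mode):
    """Pixel-coverage sets; the four mask modes from the description above."""
    ops = {
        "Add": original | mask,          # mask added to the display
        "Subtract": original - mask,     # masked part removed
        "Intersect": original & mask,    # only the overlap shown
        "Difference": original ^ mask,   # only the non-overlap shown
    }
    return ops[mode]

orig = {(0, 0), (0, 1), (1, 0)}
mask = {(1, 0), (1, 1)}
```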
  • TrackMatter is similar to Mask in that it is used as a mask. The difference is that Mask is a mask within a layer, while TrackMatter uses one layer as a mask for another layer based on transparency and brightness. For TrackMatter, what needs to be cached is the same as Mask, path information and mode information.
• the TrackMatter modes fall into the following categories: Alpha mode, which controls the display area according to the opaque area of the layer; AlphaInverted mode, which controls the display area according to the transparent area of the layer; and Luma and LumaInverted modes, which are similar to the Alpha modes except that the condition controlling the display area is brightness instead
  • LayerStyle and effect are a kind of filter, which can be understood as processing the pixels of the picture to produce a personalized effect.
  • LayerStyle and effect can be used to support projection effects in PAG animation, and they depend on filter information when implemented. For example, draw a projection image on the original image, and the parameters of the image are calculated by the color, direction and distance of the projection. Therefore, the filter information can be cached when caching.
  • the drawing data cache can ensure that the drawing data for a certain moment of the animation can be directly obtained, but the conversion of these drawing data into rendering data that can be directly drawn requires calculation, especially for vector graphics with complex path information.
• for rendering, the point coordinates of the path information and the path description between the points must be parsed one by one, and the path information then drawn into a vector diagram
• for text, the drawing data cache holds only the path information and paint information, which is time-consuming to calculate and convert every time it is drawn; this cost can be eliminated by the rendering data cache
• the method is: after obtaining the drawing data, create an off-screen buffer, perform off-screen rendering to obtain the rendering data, and cache the rendering data; when drawing is needed, the rendering data is read directly from the cache, which saves the time-consuming conversion from drawing data to rendering data
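A sketch of the rendering data cache: rasterize once, off-screen, at the maximum zoom ratio, then reuse the texture. The dict standing in for an off-screen buffer and the function names are illustrative assumptions:

```python
_render_cache = {}

def cache_render_data(key, content_size, max_scale):
    """Rasterize once at the maximum zoom ratio; later draws reuse the texture
    (the dict is a stand-in for a real off-screen buffer/texture)."""
    if key not in _render_cache:
        w, h = content_size
        _render_cache[key] = {"w": round(w * max_scale), "h": round(h * max_scale)}
    return _render_cache[key]

tex = cache_render_data("layerA/text", (100, 40), max_scale=2.5)
again = cache_render_data("layerA/text", (100, 40), max_scale=2.5)
```

Drawing this texture into any smaller View only compresses it, which is the clarity argument of the next paragraph.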
• the display size of the vector diagram depends on the size of the external display container (View); since the texture size is fixed when the rendering data is cached, in order to guarantee the definition of the final animation, the texture should be allocated at the maximum zoom ratio when caching, so that when the cached rendering data is applied to a smaller scene it is compressed rather than stretched, preserving clarity
• to calculate the zoom ratio, first obtain the zoom ratio S1 of the external View used to carry and display the animation; then traverse each composition and layer level by level, extracting their zoom ratios S2, following the scaling of each node in the tree structure until the content is reached; take out the original width and height of the content and its zoom ratio S3 relative to its parent node; finally, the zoom ratio of the content relative to the display View is the product of these tree-level zoom ratios S1, S2 and S3
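The tree walk above, including taking the maximum over every place a piece of content is used, can be sketched as follows (the dict-based node tree and scale values are illustrative):

```python
def max_content_scale(node, acc=1.0):
    """Multiply scales down the tree (S1 * S2 * ... * S3) and return the
    largest accumulated scale at which any content leaf is displayed."""
    acc *= node["scale"]
    children = node.get("children", [])
    if not children:
        return acc                      # a content leaf
    return max(max_content_scale(child, acc) for child in children)

tree = {"scale": 2.0, "children": [     # S1: the external View
    {"scale": 0.5, "children": [        # S2: a composition/layer
        {"scale": 3.0},                 # S3: content leaf -> 2 * 0.5 * 3
        {"scale": 1.0},                 # the same content used elsewhere
    ]},
]}
best = max_content_scale(tree)
```

Taking the maximum over all leaves implements the next paragraph's rule that a shared picture is cached once at the largest zoom ratio among its usages.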
  • the content of PreCompose type and image type will be different.
• for an animation containing the PreCompose type, suppose the animation has 3 layers and the same picture may appear in each layer; because they are all the same picture, only one copy needs to be cached
  • the zoom ratio of all places where the image is used should be considered, and the largest zoom ratio value should be taken.
• the cached texture size corresponds to the scene with the maximum zoom ratio; when this texture is drawn on a smaller View, the cached image can be directly compressed, which preserves clarity
  • a setCacheScale(float scale) interface for setting the size is provided for external settings.
• the PAGView displaying the animation is often placed inside other containers, and when the size of these parent containers changes, PAGView must be notified so that the new zoom ratio is applied to the rendering data cache; this keeps the cached rendering data optimal while guaranteeing definition
• even though the maximum zoom ratio is used, the cache still occupies memory; pictures cached and then recompressed can still guarantee definition without affecting the animation effect
• Figure 13 shows an animation composed of only one layer A and one PreCompose, where the border of the PreCompose is the thin black rectangular frame, the actual content area is the part with the five-pointed star and clouds, and part of the content lies outside the rectangular frame
• the optimization method can include: first calculating the smallest rectangular box containing the actual content, giving the black dashed rectangular box A in Figure 13; then finding the intersection of rectangular box A and the PreCompose boundary, giving the thick black rectangular box B, which is the real display area
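The two steps (smallest bounding box, then intersection with the PreCompose border) can be sketched with axis-aligned rectangles; the coordinates below are illustrative, not Figure 13's actual geometry:

```python
def min_bounding_rect(points):
    """Smallest axis-aligned rectangle (l, t, r, b) containing the points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def intersect_rect(a, b):
    """Intersection of two (l, t, r, b) rectangles, or None if disjoint."""
    l, t = max(a[0], b[0]), max(a[1], b[1])
    r, btm = min(a[2], b[2]), min(a[3], b[3])
    return (l, t, r, btm) if l < r and t < btm else None

content = min_bounding_rect([(-20, -10), (60, 40), (120, 90)])  # box A
precompose = (0, 0, 100, 80)                                    # PreCompose border
display = intersect_rect(content, precompose)                   # box B
```

The resulting box B excludes both the empty margin inside the border and the content outside it, which is exactly the invalid area the text wants to avoid scanning.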
  • FIG. 16 is a schematic diagram of rendering time consumption when the cache strategy is fully turned on
  • Figure 17 is a schematic diagram of rendering time consumption when the rendering data cache is turned off
  • Figure 18 is a schematic diagram of rendering time consumption when the drawing data cache is turned off
• Figure 19 is a schematic diagram of rendering time consumption when the cache strategy is fully turned off
• Figures 16-19 show that when the cache strategy is fully enabled, the rendering time is smallest, taking only 25 ms, as shown by the dashed box 1602 in Figure 16
• when the cache strategy is fully turned off, the rendering time is largest, 1052 ms, as shown by the dashed box 1902 in Figure 19; comparing the rendering times in Figures 16 and 19, the fully-on and fully-off cache strategies differ by 2 orders of magnitude
• when the rendering data cache is turned off (that is, only the drawing data cache is used), the corresponding rendering time is 863 ms, as shown by the dashed box 1702 in Figure 17; when the drawing data cache is turned off (that is, only the rendering data cache is used), the corresponding rendering time is 184 ms, as shown by the dashed box 1802 in Figure 18; the two therefore differ, but the drawing data cache and the rendering data cache are two different caching strategies suited to different application scenarios, so they play different roles
  • the "crowded" animation is just a manifestation of the excellent performance of the application of rendering data cache in complex graphics.
  • the rendering time of animation can be greatly reduced.
• the animation rendering method of the embodiment of the present invention can reduce rendering time by 1-2 orders of magnitude
• the reduction in rendering time makes animation playback very smooth and achieves better playback even on low-end mobile phones; in particular, when multiple animations play on the same client page at the same time, if they are the same animation, decoding time can be greatly saved
• Figures 2-4, 6-12, and 15 are schematic flowcharts of an animation rendering method in an embodiment. It should be understood that although the various steps in these flowcharts are displayed in sequence as indicated by the arrows, the steps are not necessarily executed in the order indicated. Unless specifically stated herein, the execution of these steps is not strictly limited in order, and they can be executed in other orders. Moreover, at least some of the steps in Figures 2-4, 6-12, and 15 can include multiple sub-steps or stages, which are not necessarily executed at the same time but can be executed at different times; their order of execution is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
  • an animation rendering device includes: a file acquisition module 2002, a determination module 2004, a data caching module 2006, a data reading module 2008, and an animation rendering module 2010; wherein :
  • the file acquisition module 2002 is used to acquire the animation file in the target format
  • the determining module 2004 is configured to determine an animation drawing data interval that meets the static condition from the animation drawing data obtained by decoding when the animation file is decoded;
  • the data cache module 2006 is used to cache the initial animation drawing data in the animation drawing data interval
  • the data reading module 2008 is used to read the buffered initial animation drawing data corresponding to the frame to be played when the animation drawing data corresponding to the frame to be played meets the static condition during the playback of the animation file;
  • the animation rendering module 2010 is used to perform animation rendering according to the read initial animation drawing data.
• the initial animation drawing data of the animation drawing data interval that meets the static condition is cached, and when the animation frame to be played meets the static condition, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache
• there is no need to parse the animation file again to obtain the animation drawing data corresponding to the frame to be played, thereby avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the determining module 2004 is further used to: decode the animation file to obtain the animation file object; read the animation drawing data in the animation file object; determine the animation that meets the static condition from the read animation drawing data Plot the data interval.
• the animation file object is obtained by decoding the animation file, the animation drawing data in the animation file object is read, and the animation drawing data interval in the animation file object that meets the static condition is found, so that the initial animation drawing data of that interval can be cached
• in this way, when the animation frame to be played meets the static condition, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache, avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the animation file includes at least one animation layer; each animation layer includes at least two animation attribute groups; each animation attribute group includes at least two animation attribute values; the determining module 2004 is also used for: In the animation file, determine each animation attribute value interval in the animation attribute group that meets the static condition; take the intersection of the animation attribute value interval as the group drawing data interval of the animation attribute group; determine the intersection of the group drawing data interval as the animation layer The animation draws the data interval.
• first determine the animation attribute value intervals that meet the static condition, then calculate the animation drawing data interval using the bubbling algorithm, so as to cache the initial animation drawing data of the animation drawing data interval
• when the animation frame to be played meets the static condition, the initial animation drawing data corresponding to the frame to be played is obtained directly from the cache, avoiding a large amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the device further includes: an attribute value obtaining module 2012; wherein:
  • the data cache module 2006 is also used to cache the initial animation attribute value of the animation attribute value interval when there is no intersection between the animation attribute value intervals;
  • the data reading module 2008 is also used for reading the cached and the frame to be played when the animation drawing data corresponding to the frame to be played does not meet the static condition and there is no intersection between the animation attribute value intervals during the playback of the animation file The corresponding initial animation attribute value;
  • the attribute value obtaining module 2012 is used to obtain the animation attribute value corresponding to the frame to be played and that does not meet the static condition from the animation file object obtained by decoding the animation file;
  • the animation rendering module 2010 is also used to perform animation rendering according to the read initial animation attribute value and the acquired animation attribute value.
• the initial animation attribute value of each animation attribute value interval that has a static interval is cached
• when the frame to be played falls in such an interval, the initial animation attribute value corresponding to the frame to be played is read from the cache, and there is no need to parse the animation attribute values belonging to the static interval from the animation file object, thereby reducing the amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the device further includes: a data acquisition module 2014; wherein:
  • the data caching module 2006 is also used to cache the initial group drawing data of the group drawing data interval when there is an intersection between the animation attribute value intervals but there is no intersection between the group drawing data intervals;
  • the data reading module 2008 is also used to read the buffered initial set of drawing data corresponding to the frame to be played when the animation drawing data corresponding to the frame to be played does not meet the static condition during the playback of the animation file;
  • the data acquisition module 2014 is used to acquire group drawing data corresponding to the frame to be played and that does not meet the static condition from the animation file object obtained by decoding the animation file;
  • the animation rendering module 2010 is also used to perform animation rendering according to the read initial group drawing data and the acquired group drawing data.
• the initial group drawing data of each group drawing data interval that has a static interval is cached
• when the frame to be played falls in such an interval, the initial group drawing data corresponding to the frame to be played is read from the cache, and there is no need to parse the group drawing data belonging to the static interval from the animation file object, thereby reducing the amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the animation attribute group when the animation attribute group is a drawable element attribute group, the animation attribute group includes at least two drawable elements; the drawable element includes at least two animation attribute values; as shown in FIG. 21, the device further includes: Intersection calculation module 2016; among them:
  • the determining module 2004 is also used to determine the intersection of the animation attribute value intervals as the element interval of the drawable element;
  • the intersection calculation module 2016 is used to calculate the intersection between the element intervals of the drawable elements
  • the determining module 2004 is also used to determine the intersection between the element intervals as the group drawing data interval of the animation attribute group.
• the data reading module 2008 is further configured to: when the animation file is played in more than one application, determine the playback progress corresponding to each application; and when the animation drawing data corresponding to the frame to be played meets the static condition, read the cached initial animation drawing data that corresponds to the playback progress and is shared by the applications
• when multiple applications of the client play the same animation, they share the animation drawing data parsed from the same animation file and the cached initial animation drawing data, which reduces cache space on the one hand and reduces the amount of parsing calculation on the other
• when the animation drawing data corresponding to a frame to be played meets the static condition, the initial animation drawing data corresponding to that frame is obtained from the cache; there is no need to parse the animation drawing data belonging to the static interval from the animation file object, thereby reducing the amount of calculation, saving time in the rendering process, and making animation playback smoother
  • the animation file includes a vector diagram
  • the data acquisition module 2014 is also used to obtain animation drawing data about the vector diagram obtained by decoding the animation file
  • the animation rendering module 2010 is also used to perform off-screen rendering of the animation drawing data to obtain the animation rendering data;
  • the data cache module 2006 is also used to cache the animation rendering data
  • the data reading module 2008 is also used to read the buffered animation rendering data corresponding to the frame to be played when the frame to be played is a vector graphics animation frame during the playback of the animation file.
  • the animation rendering module 2010 is further configured to: determine the first size ratio between the inner container and the outer container that carries the animation frame; determine the second size ratio between the animation layer and the inner container; determine the drawable element attribute The third size ratio between the animation drawing data and the animation layer in the group; the zoom ratio is determined according to the first size ratio, the second size ratio and the third size ratio.
  • the animation rendering module 2010 is also used to: determine the size of the external container used to display the vector diagram; determine the scaling ratio of the size of the animation drawing data to the size of the external container; create an off-screen buffer; In the buffer, perform animation rendering on the animation drawing data according to the size and zoom ratio of the external container to obtain the animation rendering data.
  • the device further includes: a zoom ratio acquisition module 2018 and a size adjustment module 2020; wherein:
  • the zoom ratio obtaining module 2018 is used to obtain the zoom ratio after the change when the ratio corresponding to the node in the node tree changes;
  • the node tree is composed of an outer container, an inner container, an animation layer, a drawable element attribute group, and a drawable element
  • the size adjustment module 2020 is used to adjust the size of the cached animation rendering data according to the zoom ratio; or,
  • the zoom ratio obtaining module 2018 is also used to obtain the input zoom ratio
  • the size adjustment module 2020 is also used to adjust the size of the cached animation rendering data according to the input zoom ratio value.
• the animation drawing data of the vector diagram is pre-rendered first, so as to avoid the time-consuming conversion of animation drawing data into animation rendering data during playback, thereby effectively reducing time consumption in the rendering process and improving the smoothness of animation playback
• the data acquisition module 2014 is also used to acquire the animation rendering data corresponding to the vector diagram with the largest size when the animation file contains multiple animation layers and the vector diagrams contained in the animation layers are the same but differ in size
  • the data caching module 2006 is also used for caching the animation rendering data corresponding to the vector graph with the largest size.
• the animation rendering data corresponding to the vector diagram with the largest size is cached
• this avoids caching three copies of the animation rendering data at the same time, which reduces the cache space
• because the largest-size animation rendering data is cached, the problem of reduced image definition caused by stretching the vector diagram during display is avoided
• the data caching module 2006 is further configured to: when the animation file includes a pre-composed attribute group, determine the animation area in the pre-composed attribute group that contains the first target animation rendering data; when the size of the area within the animation area that contains no animation rendering data reaches a preset condition, determine the minimum animation area containing the second target animation rendering data, the first target animation rendering data being part of the second target animation rendering data; determine the intersection area between the obtained animation area and the minimum animation area; and determine the intersection area as the animation area of the pre-composed attribute group
• the animation area of the pre-composed animation attribute group may contain a large invalid area
• the animation area therefore needs to be re-determined so that the animation rendering data is still fully contained while the invalid area is minimized, which avoids extra calculation at rendering time and avoids scanning the invalid area when loading the screen, reducing on-screen time consumption
  • Fig. 22 shows an internal structure diagram of a computer device in an embodiment.
  • the computer device may specifically be the terminal 110 in FIG. 1.
  • the computer equipment includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus.
  • the memory includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the computer device stores an operating system, and may also store a computer program.
  • when the computer program is executed by the processor, the processor can implement the animation rendering method.
  • a computer program may also be stored in the internal memory, and when the computer program is executed by the processor, the processor can execute the animation rendering method.
  • the display screen of the computer device may be a liquid crystal display or an electronic ink display.
  • the input device of the computer device may be a touch layer covering the display screen, a button, trackball, or touchpad on the housing of the computer device, or an external keyboard, touchpad, or mouse.
  • FIG. 22 is only a block diagram of part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied.
  • a specific computer device may include more or fewer components than shown in the figure, combine certain components, or use a different arrangement of components.
  • the animation rendering apparatus provided in the present application may be implemented in the form of a computer program, and the computer program may run on the computer device as shown in FIG. 22.
  • the memory of the computer device can store various program modules that make up the animation rendering device, such as the file acquisition module 2002, the determination module 2004, the data caching module 2006, the data reading module 2008, and the animation rendering module 2010 shown in FIG. 20.
  • the computer program composed of each program module causes the processor to execute the steps in the animation rendering method of each embodiment of the present application described in this specification.
  • the computer device shown in FIG. 22 may execute S202 through the file acquisition module 2002 in the animation rendering apparatus shown in FIG. 20.
  • the computer device can execute S204 through the determining module 2004.
  • the computer device can execute S206 through the data caching module 2006.
  • the computer device can execute S208 through the data reading module 2008.
  • the computer device may execute S210 through the animation rendering module 2010.
  • a computer device is provided, including a memory and a processor; the memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of the above animation rendering method.
  • the steps of the animation rendering method may be the steps in the animation rendering method of the foregoing embodiments.
  • a computer-readable storage medium is provided, storing a computer program.
  • when the computer program is executed by a processor, the processor performs the steps of the animation rendering method.
  • the steps of the animation rendering method may be the steps in the animation rendering method of the foregoing embodiments.
  • Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

This application relates to an animation rendering method and apparatus, a computer-readable storage medium, and a computer device. The method includes: obtaining an animation file in a target format; when decoding the animation file, determining, from the decoded animation drawing data, an animation drawing data interval that satisfies a static condition; caching the initial animation drawing data of the animation drawing data interval; when the animation drawing data corresponding to a frame to be played during playback of the animation file satisfies the static condition, reading the cached initial animation drawing data corresponding to the frame to be played; and performing animation rendering based on the read initial animation drawing data.

Description

Animation rendering method and apparatus, computer-readable storage medium, and computer device
This application claims priority to Chinese patent application No. 2019105019948, entitled "Animation rendering method and apparatus, computer-readable storage medium, and computer device", filed with the China National Intellectual Property Administration on June 11, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of image processing, and in particular to an animation rendering method and apparatus, a computer-readable storage medium, and a computer device.
Background
With the continuous development of image processing and network technologies, animation is used ever more widely. For example, placing an animation on a client page improves the page's visual appeal; or, while shooting a video through a client, a user can select an animation to be composited with the captured video, so that the final video carries the desired animation effect.
To play an animation, the corresponding animation file must be decoded to obtain keyframe animation data; the animation data of the frame to be played is then computed from the keyframe data, rendered, and presented on screen. In traditional rendering schemes, however, the animation data of the current frame to be played must be computed from the keyframe animation data and then rendered into rendering data, which increases the amount of computation during rendering and may affect the smoothness of playback.
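The per-frame computation the background describes — deriving a frame's property values from the surrounding keyframes — can be sketched as a simple interpolation. This is an illustrative sketch, not the actual decoder: the function name and the `(time, value)` keyframe representation are assumptions for the example.

```python
def lerp_keyframes(keyframes, t):
    """Linearly interpolate a property value at time t from (time, value) keyframes.

    A hypothetical sketch of the per-frame work a traditional renderer repeats
    for every property; the caching scheme below avoids it for static frames.
    """
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            # proportional change between the two enclosing keyframes
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
```

Repeating this for every property of every layer on every frame is the cost the method below removes for frames inside static intervals.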
Summary
According to various embodiments of this application, an animation rendering method and apparatus, a computer-readable storage medium, and a computer device are provided.
An animation rendering method, executed by a computer device, the method including:
obtaining an animation file in a target format;
when decoding the animation file, determining, from the decoded animation drawing data, an animation drawing data interval that satisfies a static condition;
caching the initial animation drawing data of the animation drawing data interval;
when the animation drawing data corresponding to a frame to be played during playback of the animation file satisfies the static condition, reading the cached initial animation drawing data corresponding to the frame to be played; and
performing animation rendering based on the read initial animation drawing data.
An animation rendering apparatus, the apparatus including:
a file acquisition module, configured to obtain an animation file in a target format;
a determination module, configured to determine, when the animation file is decoded, an animation drawing data interval satisfying a static condition from the decoded animation drawing data;
a data caching module, configured to cache the initial animation drawing data of the animation drawing data interval;
a data reading module, configured to read, when the animation drawing data corresponding to a frame to be played during playback of the animation file satisfies the static condition, the cached initial animation drawing data corresponding to the frame to be played; and
an animation rendering module, configured to perform animation rendering based on the read initial animation drawing data.
A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the above animation rendering method.
A computer device including a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the above animation rendering method.
Details of one or more embodiments of this application are set forth in the accompanying drawings and the description below. Other features and advantages of this application will become apparent from the description, the drawings, and the claims.
Brief Description of the Drawings
Fig. 1 is a diagram of the application environment of an animation rendering method in an embodiment;
Fig. 2 is a schematic flowchart of an animation rendering method in an embodiment;
Fig. 3 is a schematic flowchart of the steps of decoding the animation file to obtain animation drawing data and determining the animation drawing data interval in an embodiment;
Fig. 4 is a schematic flowchart of the steps of computing the animation drawing data interval of an animation layer in an embodiment;
Fig. 5 is a schematic diagram of computing the static intervals of an animation layer in an embodiment;
Fig. 6 is a schematic flowchart of the steps of rendering the animation from the initial animation property values and the obtained animation property values in an embodiment;
Fig. 7 is a schematic flowchart of the steps of rendering the animation from the initial group drawing data and the obtained group drawing data in an embodiment;
Fig. 8 is a schematic flowchart of the steps of rendering with shared animation drawing data when multiple applications play the same animation in an embodiment;
Fig. 9 is a schematic flowchart of the steps of caching and rendering animation rendering data in an embodiment;
Fig. 10 is a schematic flowchart of the off-screen rendering steps in an embodiment;
Fig. 11 is a schematic flowchart of the steps of saving the animation rendering data of the largest-size vector graphic when multiple animation layers contain the same vector graphic in an embodiment;
Fig. 12 is a schematic flowchart of the steps of computing the animation area of a pre-composition attribute group in an embodiment;
Fig. 13 is a schematic diagram of the animation area of a pre-composition attribute group in an embodiment;
Fig. 14 is a structural diagram of Transform2D in an embodiment;
Fig. 15 is a schematic flowchart of the static interval division method in an embodiment;
Fig. 16 is a diagram of the rendering time with all caching strategies enabled in an embodiment;
Fig. 17 is a diagram of the rendering time with the rendering data cache disabled in an embodiment;
Fig. 18 is a diagram of the rendering time with the drawing data cache disabled in an embodiment;
Fig. 19 is a diagram of the rendering time with all caching strategies disabled in an embodiment;
Fig. 20 is a structural block diagram of an animation rendering apparatus in an embodiment;
Fig. 21 is a structural block diagram of an animation rendering apparatus in another embodiment;
Fig. 22 is a structural block diagram of a computer device in an embodiment.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, this application is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely explain this application and do not limit it.
Fig. 1 shows the application environment of an animation rendering method in an embodiment. Referring to Fig. 1, the animation rendering method is applied to an animation rendering system that includes a terminal 110 and a server 120 connected over a network. The terminal 110 may specifically be a desktop terminal or a mobile terminal, and the mobile terminal may specifically be at least one of a mobile phone, a tablet computer, a laptop, and the like. The server 120 may be implemented as an independent server or as a cluster composed of multiple servers.
As shown in Fig. 2, an animation rendering method is provided in an embodiment. This embodiment is mainly illustrated with the method applied to the terminal 110 of Fig. 1. Referring to Fig. 2, the animation rendering method specifically includes the following steps:
S202: obtain an animation file in a target format.
The target format may be the PAG format, with the file extension PAG (or pag). A PAG-format animation file can be produced by integrating resources such as text or images into a single file. In addition, the PAG format uses a highly compressed dynamic bit storage technique. A PAG animation file also runs cross-platform; at runtime, the text content and font size and style in the animation can be modified arbitrarily, and image content can be replaced, all while preserving the animation effect, enabling rich and varied customization of animation content.
In an embodiment, upon detecting a trigger event, the terminal obtains the animation file in the target format. The trigger event may be an externally input operation instruction, such as an animation play instruction.
In an embodiment, before S202, the method may further include: after an animation file is produced through the AE (Adobe After Effects) client, exporting the animation file in the target format through a PAG export plug-in installed in the AE client. During export, the animation file may be encoded with a target encoding scheme, which may be any of Huffman coding, Shannon coding, RLC (run-length coding), LZW (Lempel-Ziv-Welch) coding, arithmetic coding, predictive coding, transform coding, quantization coding, and the like.
The installed PAG export plug-in is a plug-in developed for the AE client to export the target format. When exporting an animation file from the AE client, there are at least three selectable export modes, such as vector composition export, frame-sequence export, and video composition export. Note that exporting an animation file in the target format requires the specific PAG export plug-in to be installed in the AE client.
Animation rendering applies to animation playback in specific application scenarios, so the ways of obtaining the animation file in S202 can be divided by playback scenario:
Scenario 1: playing an animation on a display page of the client.
In an embodiment, while the client is starting up, the terminal obtains the target-format animation file from a locally stored animation library; or the terminal sends an animation acquisition instruction to the server and receives the target-format animation file returned in response, so that the startup animation is played on the startup page according to the animation file when the client starts. The startup page of the client is one kind of display page.
For example, to add an animation effect to the client's startup page, a PAG-format animation file for playback on the startup page can be obtained when the client starts, so that the animation is played on the startup page according to the PAG file, adding an animation effect to the startup page and improving user experience.
In an embodiment, after the client has started, when the display page scrolls to a target position, the terminal obtains the target-format animation file from the locally stored animation library; or the terminal sends an animation acquisition instruction to the server and receives the target-format animation file in response, so as to play the animation on the display page scrolled to the target position.
For example, when the client's display page scrolls to the bottom, a PAG-format animation file for playback at the bottom of the display page can be obtained, so that the user sees the animation upon scrolling the page to the bottom; adding animation effects to the client's display page improves user experience.
Scenario 2: while the client plays a video or displays an image, showing an animation at the corresponding position of the video or image.
When shooting a video or image, if one of the provided animations was selected on the shooting page, the captured video or image contains the corresponding animation. In an embodiment, when a video needs to be played, the terminal selects the specified video identifier according to an input selection instruction, obtains the corresponding IP (Internet Protocol) address from the video identifier, and fetches the video file and the corresponding animation file from that IP address, so that the animation is played at the set position in the video during playback.
S204: when decoding the animation file, determine, from the decoded animation drawing data, an animation drawing data interval that satisfies a static condition.
Animation drawing data, which may be called the drawing data of a PAG animation, is data that describes or draws the PAG animation content; animation rendering can be performed from it, and presenting the result on screen yields the corresponding displayed PAG animation content. Rendering one instant of drawing data within an animation drawing data interval and displaying it on screen produces one frame of the animation; instantaneous drawing data therefore composes an animation frame and corresponds to it. An animation drawing data interval satisfying the static condition may be a data interval in which the animation drawing data does not change over a period of time.
Note that an animation drawing data interval satisfying the static condition can be understood as a static interval of the animation drawing data, i.e., a static interval of the animation layer. When the animation file has a single animation layer, the animation drawing data interval is the static interval of the animation file; when the animation file has multiple animation layers, the intersection of the animation drawing data intervals is the static interval of the animation file. In the following embodiments, unless otherwise stated, "animation drawing data interval" means a drawing data interval satisfying the static condition.
The static condition means no change over a period of time. Macroscopically, it means the animation does not change during some time span of playback (the animation stays still in that span). Microscopically, it means the animation drawing data composing the frames does not change during that span. Further, since animation drawing data is composed of group drawing data, the static condition can refer to group drawing data not changing over a time span; and since group drawing data is composed of animation property values, it can also refer to animation property values not changing over a time span. Animation property values include, for example, the color, size, and font of text in the animation, and the text's movement parameters. An animation property value varies with time: it may be linearly related to time, increasing or decreasing proportionally; Bezier-related, following a Bezier curve over time; or of the Hold type, meaning the value does not change during a time span, thereby satisfying the static condition.
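The three value-over-time relationships just described (linear, Bezier, Hold) determine which keyframe spans count as static. A minimal sketch, with the enum and the `(start_frame, end_frame, interpolation)` span representation assumed for illustration:

```python
from enum import Enum

class Interpolation(Enum):
    LINEAR = 1  # value changes proportionally with time
    BEZIER = 2  # value follows a Bezier curve over time
    HOLD = 3    # value stays constant over the span -> satisfies the static condition

def hold_intervals(keyframes):
    """Static intervals of one time-axis property, in frames.

    keyframes: list of (start_frame, end_frame, Interpolation) spans.
    Only Hold-type spans satisfy the static condition.
    """
    return [(s, e) for s, e, interp in keyframes if interp is Interpolation.HOLD]
```

These per-property static intervals are the leaves from which group- and layer-level static intervals are bubbled up below.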
Note that animation property values are parameter values of time-axis properties, where a time-axis property is a property whose value varies with time.
Each frame of an animation file can be formed by superimposing multiple animation layers, and a layer in turn is composed of finer-grained animation attribute groups (groups). There are six types of attribute groups: transform, mask (a mask within a layer), trackMatter (a mask between layers), layerStyle, effect, and content. A content-type attribute group contains drawable elements such as image, text, and shape, so the content attribute group is the drawable-element attribute group. The contents of an attribute group are described by the corresponding animation property values; in other words, animation property values are the elements of an attribute group. For example, when rendering graphics, a shape can be described by two kinds of property values: path information and paint information.
In an embodiment, after obtaining the animation file in the target format, the terminal can load and decode the file through the client to obtain an animation file object. The terminal then traverses the list of animation property values on the file object, combines the traversed values by their attribute groups to obtain group drawing data, and combines the group drawing data by animation layer to obtain the animation drawing data.
In an embodiment, to determine the animation drawing data intervals satisfying the static condition, the terminal can first find the animation property value intervals satisfying it. Since animation drawing data contains group drawing data, and group drawing data contains animation property values, the terminal first computes the intersection of the property value intervals within each attribute group, takes that intersection as the group drawing data interval satisfying the static condition, and then uses a bubbling algorithm to compute the animation drawing data interval satisfying the static condition.
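The bubbling step — intersecting the children's static intervals to get the parent's — can be sketched as follows. Function names are illustrative; intervals are closed `(start, end)` frame ranges:

```python
from functools import reduce

def intersect_two(a, b):
    """Intersect two sorted lists of closed frame intervals [(start, end), ...]."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo <= hi:
            out.append((lo, hi))
        # advance whichever interval ends first
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

def bubble_up(interval_lists):
    """Static intervals of a parent node = intersection of all children's intervals
    (property values -> group, groups -> layer, layers -> composition)."""
    return reduce(intersect_two, interval_lists)
```

For a layer with a transform static in frames 1-10, a mask static in 3-12, and content static in 2-6 and 8-11, the layer's static intervals come out as [3, 6] and [8, 10] — the same shape of result as the Fig. 5 example later in the description.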
S206: cache the initial animation drawing data of the animation drawing data interval.
Because the drawing data within an animation drawing data interval does not change over time, every instant of drawing data within the interval is identical. Moreover, each instant of drawing data in the interval corresponds to an animation frame, so when caching, the terminal only needs to cache the interval's initial drawing data, from which the picture of any corresponding frame can be rendered.
In an embodiment, after determining the animation drawing data interval satisfying the static condition, the terminal obtains the interval's initial animation drawing data and caches it. The initial drawing data is instantaneous drawing data.
For example, suppose animation file A has 100 frames in total and the animation of its first ten frames is static, i.e., the drawing data of the first ten frames satisfies the static condition. The terminal can then take the first frame's drawing data from those ten frames and cache it.
S208: when the animation drawing data corresponding to a frame to be played during playback satisfies the static condition, read the cached initial animation drawing data corresponding to the frame to be played.
In an embodiment, during playback, the terminal monitors the playback progress of the animation file in real time; if the current frame to be played is static, i.e., its drawing data satisfies the static condition, the terminal reads the initial drawing data corresponding to the frame from the cache.
For example, for animation file A with 100 frames whose first ten frames' drawing data satisfies the static condition: when the first frame finishes playing and the second frame is to be played, since the first ten frames' drawing data satisfies the static condition, the terminal only needs to read the first frame's drawing data from the cache.
In an embodiment, each animation drawing data interval can be assigned an identifier; when a frame's drawing data satisfies the static condition during playback, the initial drawing data is read from the cache by the assigned identifier, the read initial drawing data corresponding to the frame to be played.
S210: perform animation rendering based on the read initial animation drawing data.
In an embodiment, the terminal calls a rendering tool to render the read initial drawing data into visual animation rendering data and then presents it on screen.
For example, the terminal renders the read initial drawing data by calling WebGL (Web Graphics Library), OpenGL ES (Open Graphics Library for Embedded Systems), or OpenGL ES 2.0, and then presents the resulting rendering data on screen.
As an example, for animation file A, if the first ten of its 100 frames are static, only the first frame's drawing data needs caching. When playing frames 2 through 10, the terminal only needs to fetch the cached drawing data of the first frame and render it to obtain the rendering data of frames 2 to 10; presenting it on screen yields frames 2 to 10.
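The per-interval cache in this example can be sketched as a small lookup structure. The class name and methods are assumptions for illustration, not the PAG SDK's API:

```python
class StaticIntervalCache:
    """Cache the initial drawing data of each static interval.

    Intervals are closed (start, end) frame ranges satisfying the static
    condition; only the interval's first frame's data is stored.
    """
    def __init__(self):
        self._entries = {}  # (start, end) -> drawing data of frame `start`

    def put(self, interval, start_drawing_data):
        self._entries[interval] = start_drawing_data

    def get(self, frame):
        # Return the cached initial drawing data if `frame` falls in a static interval.
        for (start, end), data in self._entries.items():
            if start <= frame <= end:
                return data
        return None
```

For animation file A above, `put((1, 10), …)` stores frame 1's drawing data once, and `get(2)` through `get(10)` all hit that single entry; `get(11)` misses and would fall back to parsing the file object.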
In the above embodiment, the initial animation drawing data of drawing data satisfying the static condition is cached. When playback reaches a frame whose property values satisfy the static condition, the initial drawing data corresponding to the frame to be played is fetched directly from the cache, with no need to parse the animation file again to obtain it, thereby avoiding a large amount of computation, saving time during rendering, and making animation playback smoother.
In an embodiment, as shown in Fig. 3, S204 may specifically include:
S302: decode the animation file to obtain an animation file object.
The animation file object includes various data related to the animation frames, such as the animation property values that describe the animation at the smallest granularity.
In an embodiment, when the animation file needs to be played, the terminal parses the target-format animation file through the client to obtain binary animation data, and reads the binary animation data bit by bit to obtain the animation file object.
In an embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once and caches the resulting file object. When more than one application plays the same animation, the file object can simply be read from the cache.
For example, when the client needs to play the same animation in several places, the same animation file only needs to be loaded into memory and decoded into a file object once; when multiple identical animations play simultaneously, the cached file object is reused.
In an embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once to obtain the file object, and caches the initial drawing data of the static-condition drawing data intervals read from the file object. When more than one application plays the same animation, the initial drawing data corresponding to each application's frame to be played can be read from the cache.
S304: read the animation drawing data from the animation file object.
In an embodiment, the terminal traverses the list of animation property values on the file object, combines the traversed values by their attribute groups to obtain group drawing data, and combines the group drawing data by animation layer to obtain the animation drawing data.
S306: determine, from the read animation drawing data, the animation drawing data interval satisfying the static condition.
In an embodiment, the terminal first finds the animation property value intervals satisfying the static condition. Since animation drawing data contains group drawing data, and group drawing data contains animation property values, the terminal first computes the intersection of the property value intervals within each attribute group, takes that intersection as the static group drawing data interval, and then bubbles up to compute the animation drawing data interval satisfying the static condition.
In an embodiment, when the animation file is played by more than one application, the terminal determines each application's playback progress; when the drawing data of the frame at that progress satisfies the static condition, the terminal reads the cached initial drawing data that corresponds to the progress and is shared by the applications.
In the above embodiment, the file object is obtained by decoding the animation file, the drawing data is read from it, and the static-condition drawing data intervals are found in the file object so that their initial drawing data can be cached. When playback reaches a frame whose property values satisfy the static condition, the initial drawing data corresponding to the frame is fetched directly from the cache, avoiding heavy computation, saving rendering time, and making playback smoother.
In an embodiment, as shown in Fig. 4, the animation file includes at least one animation layer; each animation layer includes at least two animation attribute groups; each attribute group includes at least two animation property values. S204 may specifically include:
S402: determine the animation property value intervals in the attribute group that satisfy the static condition.
There are six types of attribute groups: transform, mask, trackMatter, layerStyle, effect, and content. Animation property values are the elements of an attribute group; a mask, for instance, can be drawn from path information and a mask mode, so the path information and mask mode are the mask's property values. A property value is a value related to the change of time; for example, it may increase or decrease proportionally with time, follow a Bezier curve over time, or not change at all within a span of time.
Note that a property value interval satisfying the static condition can be understood as a static interval of the property value.
In an embodiment, the terminal searches the attribute group for property values that do not change over a span of time; that span can serve as the property value interval satisfying the static condition. Note that the span may be measured in units of time or in units of frames.
S404: take the intersection of the property value intervals as the group drawing data interval of the attribute group.
The group drawing data interval can be understood as the static interval of the group drawing data, i.e., the static interval of the attribute group.
In an embodiment, the terminal computes the intersection between the property value intervals, takes it as the attribute group's group drawing data interval, and then performs S406, i.e., uses the bubbling algorithm to compute the layer's animation drawing data interval.
In an embodiment, when the property value intervals have no intersection, the initial property value of each property value interval is cached. No intersection between the intervals means at least one property value in the attribute group changes over time, so the attribute group changes over time too; i.e., the attribute group does not satisfy the static condition (has no static interval), but at least one property value in it does. The intervals of those property values are found, and their initial property values are cached.
In an embodiment, when the attribute group is the drawable-element attribute group, the attribute group includes at least two drawable elements, and each drawable element includes at least two property values. The method further includes: the terminal determines the intersection of the property value intervals as the element interval of a drawable element, and computes the intersection between the element intervals of the drawable elements. S404 may then specifically include: determining the intersection between the element intervals as the attribute group's group drawing data interval.
The attribute groups include the six types transform, mask, trackMatter, layerStyle, effect, and content. The content-type attribute group is the indispensable component of an animation layer: it represents the layer's drawable elements, such as shape, text, solid, PreCompose (pre-composition), and image, while the other groups apply processing on top of the content, such as translation and scaling, masking, or filter effects. Given the diversity of content, caching must be tailored to the specific drawable element.
S406: determine the intersection of the group drawing data intervals as the animation drawing data interval of the layer.
The animation drawing data interval can be understood as the static interval of the animation drawing data, i.e., the static interval of the animation layer.
In an embodiment, the terminal computes the intersection of the group drawing data intervals and determines it as the layer's animation drawing data interval.
An animation file consists of at least one animation layer; one layer corresponds to one set of animation drawing data, which in turn consists of at least one set of group drawing data. Rendering each set of drawing data and presenting it on screen yields the corresponding animation.
In an embodiment, when the group drawing data intervals have no intersection, the initial group drawing data of each group drawing data interval is cached. No intersection means at least one group's drawing data changes over time, so the layer changes over time too and does not satisfy the static condition; but at least one group's drawing data does satisfy it, so its intervals are found and their initial group drawing data is cached.
In an embodiment, when the animation file has a single animation layer, the terminal takes the animation drawing data interval as the animation's static interval. When the animation file has multiple layers, the terminal computes the intersection between the layers' animation drawing data intervals and takes that intersection as the animation's static interval.
Note that with multiple layers, if a drawing data interval satisfying the static condition exists, the images or text in the corresponding layer do not change (remain still) within that interval; if an animation static interval exists, all images and text in the animation remain still within it.
In an embodiment, if the animation file includes a pre-composition attribute group in addition to animation layers, the pre-composition attribute group can be regarded as a sub-animation file, which may in turn include at least one animation layer and/or nested sub-pre-composition attribute groups. The static intervals of the layers, attribute groups, and property values within a pre-composition group can be computed as in the above embodiments.
As an example, as shown in Fig. 5, suppose an animation layer of the animation file consists of three groups: transform, mask, and content. [t1, t2] is the transform's static interval, i.e., the layer undergoes no transformation (no translation, scaling, rotation, etc.); [m1, m2] is the mask's static interval, i.e., the layer uses the same mask throughout and the mask does not change; [c1, c2] and [c3, c4] are the content's static intervals, i.e., the layer applies the same text and the same image respectively, the text unchanged during [c1, c2] and the image unchanged during [c3, c4]. Note that t1, t2, m1, m2, c1, c2, c3, and c4 can be expressed in frames; for example, t1 may denote the animation's first frame, m1 its third frame, and t2 its tenth frame.
For the content, [c1, c2] and [c3, c4] are its static intervals. When playing the animation file, the terminal only needs to parse the text at time c1 and the image at time c3 once, caching and marking the parsed text and image. When the frame c to be drawn falls in the [c1, c2] interval, the terminal need not re-parse frame c's text; it can fetch frame c1's group drawing data directly from the cache and render the text from it, and the rendered text is exactly the same as frame c's text.
The above considers only the content group. A layer is usually composed of multiple groups; for a layer containing only the three groups transform, mask, and content, its static intervals are the intersection of the static intervals of transform, mask, and content. As shown in Fig. 5, there should be two such intervals, [m1, c2] and [c3, t2]; see the dashed cut lines in Fig. 5.
In the above embodiment, the property value intervals satisfying the static condition are determined first, and the animation drawing data intervals are then computed by the bubbling algorithm, so that their initial drawing data can be cached. When playback reaches a frame whose property values satisfy the static condition, the initial drawing data corresponding to the frame is fetched directly from the cache, avoiding heavy computation, saving rendering time, and making playback smoother.
In an embodiment, as shown in Fig. 6, the method further includes:
S602: when the property value intervals have no intersection, cache the initial property value of each property value interval.
A property value interval can be understood as a static interval of the property value. No intersection between the intervals means at least one property value in the attribute group changes over time, so the attribute group changes over time too, i.e., the attribute group has no static interval; but at least one property value in it satisfies the static condition. The intervals of those property values are found, and their initial property values are cached.
S604: when, during playback of the animation file, the drawing data corresponding to the frame to be played does not satisfy the static condition and the property value intervals have no intersection, read the cached initial property value corresponding to the frame to be played.
The animation drawing data interval is the static interval of the drawing data, i.e., of the animation layer. The frame's drawing data not satisfying the static condition while the property value intervals have no intersection means the frame's drawing data misses its static interval and the frame's group drawing data also misses its static interval.
In an embodiment, when playing the animation file, if the frame's drawing data hits a static interval, S208 is performed. If the frame's drawing data misses the static interval, the terminal continues down to the finer granules of the drawing data (the group drawing data) and checks whether the frame's group drawing data hits a static interval; if it hits, the terminal fetches the initial group drawing data corresponding to the frame from the cache, and also parses, from the animation file object, the group drawing data that corresponds to the frame and lies in non-static intervals, so that the terminal can render the animation from the fetched initial group drawing data together with the parsed group drawing data.
In an embodiment, if the frame's drawing data misses its static interval and its group drawing data also misses its static intervals, the terminal continues down to the finer granules of the group drawing data (the property values) and checks whether the frame's property values hit a static interval; if so, the terminal fetches the initial property value corresponding to the frame from the cache and then performs S606.
Because of the static interval segments, when playback reaches a frame, all animation layers must be traversed. If the current frame hits a layer's static interval, then for that layer the data of every frame within the whole static interval is the same, so the corresponding initial drawing data can be fetched directly from the cache. If it misses the layer's static interval, all groups of the layer are traversed; on hitting a group's static interval, the terminal directly uses the initial group drawing data of that interval, and additionally parses, from the file object, the group drawing data of the frame that lies in non-static intervals. If it misses the group's static interval too, all property values of the group are traversed; on hitting a property value's static interval, the terminal directly uses that interval's initial property value, and additionally performs S606.
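The layer-then-group fallback just described can be sketched as a small resolver. Everything here is illustrative, not the PAG SDK's API: caches are modeled as callables returning cached data or `None`, and `decode_group` stands in for parsing the animation file object.

```python
def resolve_layer(frame, layer_cache, group_caches, decode_group):
    """Resolve one layer's drawing data for `frame` (illustrative sketch).

    Order of preference: the layer-level static-interval cache, then each
    group's static-interval cache, then decoding the group from the file object.
    """
    data = layer_cache(frame)
    if data is not None:
        return data  # the whole layer is static over this frame: reuse as-is
    # layer miss: resolve group by group, decoding only the non-static ones
    return {name: cache(frame) if cache(frame) is not None else decode_group(name, frame)
            for name, cache in group_caches.items()}
```

A frame inside the layer's static interval returns the cached layer data untouched; a frame outside it mixes cached static groups with freshly decoded non-static ones, which is exactly the split S604-S608 describe one level further down for property values.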
S606: obtain, from the animation file object decoded from the animation file, the animation property values that correspond to the frame to be played and do not satisfy the static condition.
In an embodiment, upon reading the initial property values corresponding to the frame, the terminal also parses from the file object the property values that correspond to the frame and lie in non-static intervals, so that the terminal can render the animation from the read initial property values together with the parsed property values.
S608: perform animation rendering based on the read initial property values and the obtained property values.
For the frame to be played, its drawing data may consist of property values within static intervals and property values within non-static intervals. For the former, the initial property values corresponding to the frame can simply be read from the cache, with no need to parse them from the file object.
In an embodiment, the terminal converts the read initial property values and the obtained property values into animation drawing data and then renders the animation from the drawing data.
In the above embodiment, when neither the animation layer nor the attribute group has a static interval, the initial property values of the property values that do have static intervals are cached. When the frame hits a property value's static interval, the initial property value corresponding to the frame is read from the cache, with no need to parse the static-interval property values from the file object, thereby reducing computation, saving rendering time, and making playback smoother.
In an embodiment, as shown in Fig. 7, the method further includes:
S702: when the property value intervals intersect but the group drawing data intervals have no intersection, cache the initial group drawing data of each group drawing data interval.
A property value interval can be understood as a static interval of the property value. The property value intervals intersecting means none of the property values in the attribute group changes over time, so the attribute group has a static interval; if the animation layer has no static interval, the terminal caches the initial group drawing data of the group drawing data intervals.
S704: when the drawing data corresponding to the frame to be played does not satisfy the static condition during playback, read the cached initial group drawing data corresponding to the frame.
The animation drawing data interval is the static interval of the drawing data, i.e., of the animation layer. The frame's drawing data not satisfying the static condition while the property value intervals intersect means the frame's drawing data misses its static interval, but at least part of the frame's group drawing data hits its static interval.
In an embodiment, when playing the animation file, if the frame's drawing data hits a static interval, S208 is performed. If it misses, the terminal continues down to the finer granules of the drawing data (the group drawing data) and checks whether the frame's group drawing data hits a static interval; if it hits, the terminal fetches the initial group drawing data corresponding to the frame from the cache and then performs S706.
S706: obtain, from the animation file object decoded from the animation file, the group drawing data that corresponds to the frame to be played and does not satisfy the static condition.
In an embodiment, upon reading the initial group drawing data corresponding to the frame, the terminal also parses from the file object the group drawing data that corresponds to the frame and lies in non-static intervals, so that the terminal can render the animation from the fetched initial group drawing data together with the parsed group drawing data.
S708: perform animation rendering based on the read initial group drawing data and the obtained group drawing data.
For the frame to be played, its drawing data may consist of group drawing data within static intervals and within non-static intervals. For the former, the initial group drawing data corresponding to the frame can simply be read from the cache, with no need to parse it from the file object.
In an embodiment, the terminal converts the read initial group drawing data and the obtained group drawing data into animation drawing data and then renders the animation from the drawing data.
In the above embodiment, when the animation layer has no static interval but the attribute group does, the initial group drawing data of the group drawing data with static intervals is cached. When the frame hits a group drawing data's static interval, the initial group drawing data corresponding to the frame is read from the cache, with no need to parse the static-interval group drawing data from the file object, thereby reducing computation, saving rendering time, and making playback smoother.
In an embodiment, as shown in Fig. 8, S208 may specifically include:
S802: when the animation file is played by more than one application, determine the playback progress corresponding to each application.
More than one application playing the animation file may mean that multiple places in the client need to play the same animation. For example, the client's video player plays a video carrying the animation, while a position on the display page outside the player (e.g., the player not in full-screen mode and pinned to the top of the screen, with the display page scrolled to the bottom) also needs to play the animation.
In an embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once and caches the resulting file object. When more than one application plays the same animation, the file object can simply be read from the cache.
For example, when the client needs to play the same animation, the same animation file only needs to be loaded into memory and decoded into a file object once; when multiple identical animations play simultaneously, the cached file object is reused.
In another embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once to obtain the file object, and caches the initial drawing data of the static-condition drawing data intervals read from it; the cached initial drawing data can be shared by the multiple playing applications. When more than one application plays the same animation, the terminal can read from the cache the initial drawing data corresponding to each application's frame to be played.
In another embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once to obtain the file object; if the drawing data read from the file object does not satisfy the static condition, the terminal determines whether the attribute groups' group drawing data satisfies it, and if so caches the static-condition initial group drawing data, which the multiple playing applications share. The terminal can then read from the cache the initial group drawing data corresponding to each application's frame to be played.
In another embodiment, when more than one application in the client plays the same animation, the terminal decodes the animation file only once to obtain the file object; if neither the drawing data read from the file object nor the attribute groups' group drawing data satisfies the static condition, the terminal determines whether the property values satisfy it, and if so caches the static-condition initial property values, which the multiple playing applications share. The terminal can then read from the cache the initial property values corresponding to each application's frame to be played.
S804: when the drawing data of the frame at the playback progress satisfies the static condition, read the cached initial drawing data that corresponds to the progress and is shared by the more than one application.
The drawing data of the frame at the playback progress satisfying the static condition means that drawing data has a static interval.
In an embodiment, when multiple applications play the same animation, the terminal records the playback progress of each application's animation, and determines, per progress, whether the frame's drawing data hits a static interval. On hitting the drawing data's static interval, it reads the cached initial drawing data that corresponds to the progress and is shared by the applications. On a miss, the terminal traverses the group drawing data in the layer that composes the drawing data; on hitting a group drawing data's static interval, it directly uses that interval's initial group drawing data. On a further miss, it traverses the property values in the attribute group that compose the group drawing data; on hitting a property value's static interval, it directly uses that interval's initial property value.
S806: render the read initial drawing data shared by the more than one application in turn, obtaining the animation rendering data corresponding to each application.
In an embodiment, if neither the drawing data nor the group drawing data has a static interval but the property values do, the terminal on one hand reads the initial property values shared by the applications, and on the other parses the non-static property values from the shared file object, converts the read initial property values and the parsed property values into drawing data, and then renders.
In an embodiment, if the drawing data has no static interval but the group drawing data does, the terminal on one hand reads the initial group drawing data shared by the applications, and on the other parses the non-static group drawing data from the shared file object, composes the read initial group drawing data and the parsed group drawing data into drawing data, and then renders.
In the above embodiment, when multiple applications of the client play the same animation, they share the drawing data parsed from the same animation file, and the initial drawing data is cached, which reduces cache space on one hand and parsing computation on the other. When each frame's drawing data satisfies the static condition, the initial drawing data corresponding to the frame is fetched from the cache, with no need to parse static-interval drawing data from the file object, thereby reducing computation, saving rendering time, and making playback smoother.
In an embodiment, besides caching animation drawing data, animation rendering data can also be cached. The animation file includes a vector graphic; as shown in Fig. 9, the method may further include:
S902: obtain the animation drawing data about the vector graphic decoded from the animation file.
A vector graphic, also called an object-oriented image, represents an image with geometric primitives based on mathematical equations, such as points, lines, or polygons, and does not distort when enlarged, shrunk, or rotated. Text in an animation (such as graphical text) is also a vector graphic.
For vector graphics, when the terminal caches drawing data, group drawing data, or property values, it guarantees that the description data of any instant of the animation can be fetched directly; but converting that description data into animation rendering data suitable for on-screen display still requires computation. This is especially true for vector graphics described by complex path information: rendering must parse, one by one, the point coordinates in the path information and the point-to-point path descriptions before the path information can be converted into the vector graphic. Text description information is the same: the drawing data cache holds only the text's path and paint information, and converting it on every render is computation-heavy and time-consuming. Therefore, during playback, the drawing data is first rendered into rendering data and cached, and the corresponding rendering data can be read directly from the cache when needed.
In an embodiment, upon determining that the animation file contains a vector graphic, the terminal obtains the drawing data about the vector graphic decoded from the file; or, when the vector graphic's drawing data satisfies the static condition, the terminal determines the static-condition drawing data interval and obtains the initial drawing data from that interval.
S904: perform off-screen rendering on the animation drawing data to obtain animation rendering data.
The animation rendering data may be an image texture.
In an embodiment, S904 may specifically include: the terminal performs off-screen rendering on the decoded drawing data to obtain the rendering data; or the terminal performs off-screen rendering on the initial drawing data of the drawing data interval to obtain the rendering data.
S906: cache the animation rendering data.
In an embodiment, before caching the rendering data, the terminal determines its size; when the rendering data's size exceeds a preset size threshold, the rendering data is compressed, on the premise of preserving picture quality, to reduce its size before caching, thereby reducing the cache size.
S908: when the frame to be played during playback of the animation file is a vector graphic animation frame, read the cached rendering data corresponding to the frame.
In an embodiment, when the frame to be played during playback is a vector graphic frame, the rendering data corresponding to it is read from the cache; or, when the frame is a vector graphic frame and its drawing data satisfies the static condition, the rendering data that corresponds to the frame and was rendered from the initial drawing data of the drawing data interval is read from the cache.
In the above embodiment, the vector graphic's drawing data is pre-rendered first and the resulting rendering data is cached; when the frame to be played during playback is a vector graphic frame, the rendering data corresponding to it is read from the cache, avoiding the cost of converting drawing data into rendering data, effectively reducing rendering time, and helping improve playback smoothness.
In an embodiment, as shown in Fig. 10, S904 may specifically include:
S1002: determine the size of the outer container used to display the vector graphic.
The outer container may be an outer container view (View) used to display the vector graphic. For example, when the animation plays on a phone, the image container corresponding to the area where the vector graphic is displayed on the phone can be called the outer container.
The actual displayed size of the vector graphic depends on the outer container's size, and the rendering data's size is already fixed when it is cached. To guarantee animation clarity, when caching the rendering data its size can be applied at the maximum scale ratio, so that when the cached rendering data is applied in a smaller scene it can be compressed rather than stretched, effectively guaranteeing the animation's clarity.
In an embodiment, during the design process, the design terminal can preset the outer container sizes for displaying the vector graphic according to the animation's size on various terminals. When playing the animation, the terminal can determine, from the preset sizes according to its own dimensions, the corresponding outer container size for displaying the vector graphic.
S1004: determine the scale ratio of the animation drawing data's size relative to the outer container's size.
In an embodiment, S1004 may specifically include: determining a first size ratio between the inner container carrying the animation frames and the outer container; determining a second size ratio between the animation layer and the inner container; determining a third size ratio between the drawing data in the drawable-element attribute group and the animation layer; and determining the scale ratio from the first, second, and third size ratios.
For example, to compute the scale ratio, the terminal first obtains the first size ratio S1 between the inner container carrying the frames and the outer container; then traverses each layer level by level, taking the layer's second size ratio S2 relative to the inner container; then traverses each child node of the tree structure in turn until content is found, taking the content node's original width and height and its third size ratio S3 relative to the layer node; finally, the scale ratio of the content relative to the outer container is the cumulative product of S1, S2, and S3. The node tree consists of the outer container, the inner container, the animation layers, the drawable-element attribute groups, and the drawing data within the drawable-element attribute groups.
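The cumulative product S1 x S2 x S3, and the choice of the maximum ratio across all places a texture is used, can be sketched as follows. Function names are illustrative assumptions, not the PAG SDK's API:

```python
from math import prod

def content_scale(node_scales):
    """Cumulative scale of a content node relative to the outer container.

    node_scales: the size ratios along one node-tree path, e.g.
    [S1, S2, S3] = outer->inner container, inner container->layer, layer->content.
    """
    return prod(node_scales)

def cache_scale(all_paths):
    # One texture may be reached via several layers; cache at the maximum
    # cumulative scale so it is only ever downscaled, never stretched.
    return max(content_scale(p) for p in all_paths)
```

Caching at `cache_scale(...)` is what guarantees the clarity property described above: any smaller display compresses the cached texture instead of stretching it.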
In an embodiment, when a ratio corresponding to a node in the node tree changes, the changed scale ratio is obtained (the node tree consisting of the outer container, the inner container, the animation layers, the drawable-element attribute groups, and the drawing data within them) and the cached rendering data's size is adjusted according to the scale ratio; or an input scale ratio is obtained, and the cached rendering data's size is adjusted according to the input scale ratio.
For example, when playing the animation, the terminal mounts the PAG view (PAGView) displaying the animation on other nodes, such as the last leaf node; when the sizes of these parent nodes change, PAGView is notified and the scale ratio is applied to the rendering data cache, ensuring the cached rendering data is optimal while guaranteeing clarity.
In addition, if using the maximum scale ratio still occupies too much memory, and further compressing the cached rendering data can still keep the animation clear without affecting the animation effect, then another scale ratio can be set on top of the maximum one to reduce the cache footprint.
S1006: create an off-screen buffer.
S1008: in the off-screen buffer, render the animation drawing data according to the outer container's size and the scale ratio to obtain the animation rendering data.
In an embodiment, to keep the display synchronized with the animation controller, a horizontal synchronization signal is sent when the electron gun starts scanning a new line, and the display's refresh rate is the frequency at which the synchronization signal is generated. The processor then computes parameter values such as the animation frame and the animation's width and height, and hands the computed values to the graphics card to render the animation rendering data; the rendering data produced by the graphics card is placed into the off-screen buffer. Finally, the video controller reads the rendering data from the off-screen buffer line by line according to the synchronization signal and passes it, via digital-to-analog conversion, to the display.
The whole off-screen rendering process requires a context switch: first from the current screen (on-screen) to off-screen; after off-screen rendering completes, when the rendering data in the off-screen buffer is displayed on screen, the terminal switches the context from off-screen back to the current screen.
In the above embodiment, pre-rendering the vector graphic's drawing data avoids the time needed to convert drawing data into rendering data during playback, effectively reducing rendering time and helping improve playback smoothness.
In an embodiment, as shown in Fig. 11, the method further includes:
S1102: when the animation file includes multiple animation layers and the layers contain the same vector graphic at different sizes, obtain the rendering data corresponding to the largest-size vector graphic.
In an embodiment, when the animation file includes a pre-composition attribute group that composites multiple animation layers, and the layers contain the same vector graphic at different sizes, the terminal obtains the rendering data corresponding to the largest-size vector graphic.
S906 may specifically include: S1104: cache the rendering data corresponding to the largest-size vector graphic.
For example, an animation has three animation layers and the same image appears in each; since all three layers contain the same image, only one copy needs caching. The cached size of the rendering data, however, must account for each layer's scaling, choosing the maximum scale ratio; when caching, the cached rendering data's size is thus at the maximum scale ratio, and when this rendering data is drawn on a smaller container or layer, the cached rendering data can be compressed, guaranteeing clarity.
In the above embodiment, when the same vector graphic appears in multiple layers, the rendering data of the largest-size one is cached. On one hand, this avoids caching three copies of rendering data simultaneously, reducing the cache footprint; on the other, caching the maximum-size rendering data avoids the loss of image clarity caused by stretching the vector graphic at display time.
In an embodiment, as shown in Fig. 12, S906 may specifically include:
S1202: when the animation file includes a pre-composition attribute group, determine the animation area in the pre-composition attribute group that contains the first target animation rendering data.
The first target rendering data is a part of the rendering data. For example, as shown in Fig. 13, the first target rendering data is the part of the five-pointed star and the cloud inside the area.
Since rendering data can be understood as an image texture with a definite size, the content contained in the animation area may cover only part of the rendering data; if the non-rendering-data area within the animation area is large, the animation area must be optimized to avoid the extra computation it might incur during rendering.
S1204: when the size of the non-rendering-data area contained in the animation area reaches a preset condition, determine the minimum animation area containing the second target rendering data; the first target rendering data is a part of the second target rendering data.
The second target rendering data contains all of the rendering data; that is, the first target rendering data is a part of the second. The non-rendering-data area may be a blank or invalid area.
Since the second target rendering data contains all of the animation data, a rectangular frame can be used to frame the second target animation data; the smallest such framing rectangle is taken as the minimum animation area, such as the dashed area in Fig. 13.
S1206: determine the intersection area between the obtained animation area and the minimum animation area.
S1208: determine the intersection area as the animation area of the pre-composition attribute group.
In an embodiment, upon determining the intersection area between the animation area and the minimum animation area, the terminal displays the intersection area as the pre-composition attribute group's animation area.
As an example, when the animation area of a pre-composition attribute group contains too many invalid or blank areas, the terminal optimizes the group's bounds: it computes the minimum rectangular area containing all the rendering data, then takes the intersection of this rectangle with the pre-composition group's bounding rectangle, thereby obtaining the group's true bounds.
As shown in Fig. 13, the animation file contains only one layer A and one pre-composition attribute group, whose bounds are the thin black rectangle C; the actual content area is part of the five-pointed star and the cloud, with the other part of the star and cloud outside this rectangular border. During rendering, since most of this rectangle's content is blank, it can be optimized. The optimization may include: first computing the smallest rectangle containing the actual content area, yielding the dashed rectangle A in the figure; then taking the intersection of dashed rectangle A with the pre-composition group's bounds (the thin black rectangle C), yielding the thick black rectangle B, which is the animation area that actually needs to be rendered and displayed.
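The two-step bounds optimization — minimal bounding rectangle, then intersection with the declared bounds — can be sketched with rectangles as `(left, top, right, bottom)` tuples. Function names are assumptions for illustration:

```python
def bounding_rect(rects):
    """Smallest rectangle containing every content rectangle (l, t, r, b)."""
    ls, ts, rs, bs = zip(*rects)
    return (min(ls), min(ts), max(rs), max(bs))

def clipped_bounds(precompose_bounds, content_rects):
    """Real render area of a pre-composition: its declared bounds intersected
    with the minimal rectangle around its actual content (rect B in Fig. 13)."""
    l1, t1, r1, b1 = precompose_bounds
    l2, t2, r2, b2 = bounding_rect(content_rects)
    l, t, r, b = max(l1, l2), max(t1, t2), min(r1, r2), min(b1, b2)
    return (l, t, r, b) if l < r and t < b else None
```

With declared bounds `(0, 0, 100, 100)` and content partly outside them, the result is the clipped rectangle that actually needs rendering, removing the blank margin on the other sides.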
In the above embodiment, when the animation area of a pre-composition attribute group contains a large invalid area, the animation area is re-determined to ensure the rendering data is contained within it while the invalid area is minimized, which avoids the extra computation rendering might otherwise incur, avoids scanning the invalid area when presenting on screen, and reduces on-screen time.
As an example, consider a traditional animation rendering scheme such as Lottie. Lottie stores all of an animation's information in the json (JavaScript Object Notation) format. During playback: the animation file is first loaded into memory and parsed according to the json format; at play time, the playback instant is located and the instantaneous animation information is extracted, i.e., the content this instant needs to display, which may be images, text, etc., is found in the parsed json file; finally, this instantaneous display content must be converted into rendering data that can be drawn on screen. During rendering, vector graphics require computing each point of the path information and drawing the path, while text requires reading and converting the paint and path information.
In traditional rendering schemes, identical animations do not reuse the in-memory data of the same animation file; when multiple identical animations play on screen simultaneously, the same file is loaded multiple times. Moreover, the data of an instant is parsed and converted into data to be drawn without considering the drawing data's static intervals, so the conversion from memory data to drawing data does not reuse the static intervals' initial drawing data. Finally, the cost of converting a vector graphic's drawing data into rendering data is not considered; for complex vector graphics, each conversion from drawing data to rendering data takes considerable rendering time.
To address these problems, an embodiment of this invention proposes a three-level caching animation rendering scheme comprising an animation file cache, a drawing data cache, and a rendering data cache. The three caches are described in turn below:
1. Animation file cache
The animation file is decoded and kept in memory; for the same animation file, only one decoded file object is kept. When multiple scenes need to play this animation, they all work from this file object rather than re-reading the animation file, avoiding repeated decoding; this saves much unnecessary overhead, especially memory overhead. For example, after the client starts, there is a common loading animation that every loading scenario uses; if the animation file were loaded once per loading occurrence, then even though the file's description data occupies little memory, the memory overhead becomes very large when dozens of animations all need this loading animation.
It also saves decoding time: for multiple identical loading animations, the animation file is decoded only once, keeping the cost low and avoiding the extra time of each animation decoding the file into memory separately.
2. Drawing data cache
Using the characteristics of PAG animation, the animation's static intervals are divided, and drawing data is cached for the static intervals. Before introducing the drawing data cache, the division of static intervals is explained first, as follows:
(1) Dividing static intervals
The most intuitive understanding of a static interval: for example, a PAG animation lasts 2 s in total at, say, 24 frames per second; if nothing changes from 1 s to 1.5 s, i.e., the animation is still during that span, then the animation's static interval is [24, 36], in frames.
Each frame of an animation file can be formed by superimposing multiple layers; a layer consists of smaller groups, and a group in turn is described by a series of properties, including time-axis properties and fixed properties. For a time-axis property, Hold-type interval segments can be found, which are the time-axis property's static intervals; for a fixed property, the entire time span is a static interval. After finding the time-axis properties' static intervals, bubble upward: if a group contains several time-axis properties whose static intervals intersect, the group's property values do not change at all over the intersection, so the intersection is the group's static interval. Likewise, continuing to bubble up, the intersection of multiple groups' static intervals is the layer's static interval; the intersection of multiple layers' static intervals is the vector composition's static interval, yielding the whole animation's static intervals.
The groups composing a layer fall into 6 classes: transform, mask, trackMatter, layerStyle, effect, and content. Take transform as an example: it consists of multiple keyframes (Keyframe<Transform2D>), each recording the layer's translation, scaling, rotation, and/or opacity change. It is stored in the PAG animation file; when parsing the file, these keyframes are extracted to reconstruct the layer's animation trajectory. The data structure these keyframes ultimately store is Transform2D, which is in fact described by time-axis properties and contains the following: anchor point (anchorPoint), position, x-axis position (xPosition), y-axis position (yPosition), scale, rotation, and opacity, as shown in Fig. 14. position, xPosition, and yPosition describe translation; anchorPoint and scale describe scaling; anchorPoint and rotation describe rotation; opacity describes transparency.
When parsing the PAG file, if a layer has a group of type transform whose Transform2D time-axis property values do not change across several keyframes, those frames can be understood as the transform's static interval. Determining whether an interval is static ultimately reduces to finding the Hold-type segments of the time-axis properties; the other groups are similar. To find the static interval of a group's parent layer, compute the intersection of the static intervals of all the layer's groups.
When computing static intervals, they can be computed directly, or obtained by finding the non-static intervals: the non-static intervals found are filtered out to obtain the whole animation's static intervals. The latter approach may specifically include: start from the root composition, with the whole animation having n frames, from the first frame to the last ([1, n]); then traverse all layers or sub-compositions in turn, then traverse each group of a layer, removing the group's non-static intervals directly from [1, n], cutting [1, n] into many fine segments. A layer's static intervals are the intersection of the static intervals of all its groups; likewise the intersection of all layers' static intervals is the composition's. Bubbling up level by level eventually yields the root composition's (i.e., the whole animation's) static intervals.
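The subtraction variant — cutting the non-static (busy) segments out of the full range `[1, n]` — can be sketched as follows, with closed integer frame intervals and illustrative names:

```python
def subtract_intervals(total, busy):
    """Remove non-static (busy) intervals from the full frame range,
    leaving the static intervals; intervals are closed [start, end]."""
    start, end = total
    out, cur = [], start
    for s, e in sorted(busy):
        if cur < s:
            out.append((cur, s - 1))  # static segment before this busy span
        cur = max(cur, e + 1)         # skip past the busy span
    if cur <= end:
        out.append((cur, end))        # static tail after the last busy span
    return out
```

For a 48-frame animation with non-static spans at frames 10-20 and 30-35, the remaining static intervals are [1, 9], [21, 29], and [36, 48]; intersecting such per-group results then gives the layer's static intervals, exactly as the bubbling description says.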
As shown in Fig. 5, a diagram of intersecting static intervals, t1, t2, m1, m2, c1, c2, c3, and c4 are in frames. For one layer in the composition, the layer's three groups over the whole animation period are as shown in Fig. 5: in the interval [t1, t2], the layer undergoes no translation, scaling, rotation, or other transforms; in [m1, m2], the layer uses the same mask and the mask does not change; in [c1, c2] and [c3, c4], the layer applies one text and one image respectively, and the text and image also do not change.
Clearly, for the content group, [c1, c2] and [c3, c4] are both its static intervals. When rendering the animation, the text at instant c1 and the image at instant c3 each need parsing only once, and the text and image are kept in memory and marked. When the frame to be drawn is c, with c in [c1, c2], there is no need to re-parse frame c's text: the marked cache can be looked up directly with c1, and c1's text drawn out, which is exactly the same as frame c's text.
The above considers only the content group. A layer is composed of multiple groups, and its static intervals are the intersection of the static intervals of all the layer's groups. As shown in Fig. 5, if a layer consists of just these 3 groups, the layer has two static intervals: [m1, c2] and [c3, t2].
It can be seen that dividing static intervals is in fact a bubbling process: first find the Hold-type intervals of the time-axis properties, then intersect to find the groups' static intervals, then intersect upward level by level until the root composition's static intervals are reached, which are the whole animation's static intervals.
In summary, the flow of dividing static intervals is as follows:
S1502: start from the root composition; the initial static interval is from the first frame to the last frame.
S1504: determine whether there is a sub-composition.
S1506: if there is no sub-composition, compute the root composition's static intervals from all layers, then perform S1518.
S1508: if there is a sub-composition, start from the sub-composition; the initial static interval is from the sub-composition's start frame to its end frame.
S1510: determine whether a layer exists in the sub-composition.
S1512: if a layer exists in the sub-composition, compute the static interval of each group in the layer.
S1514: select the Hold-type intervals of the time-axis properties describing the group, then determine again whether a layer exists in the sub-composition.
S1516: if no layer exists in the sub-composition, compute the layer's static intervals from all groups.
S1518: compute the root composition's static intervals.
(2) Drawing data cache
The drawing data cache caches the parts related to the PAG file (PAGFile) object. After decoding yields the PAGFile object, all static intervals can be obtained by the interval-division method above. Meanwhile, parsing the PAGFile object yields the static intervals' initial drawing data, which is then cached. To keep the cache as small as possible, only the necessary drawing data is cached; different caching strategies are applied for different layers, different groups, and finer granularities.
When a PAG animation draws a frame, it jumps to that frame and then draws each layer or pre-composed composition in turn; a pre-composed composition is in essence also a nesting of multiple layers or compositions, ultimately still layer drawing, so the drawing granularity can be refined down to drawing specific layers. A layer can be composed of any one or more of the 6 group classes; the caching strategy for each class is described in turn below:
1) Content
Content is the indispensable component of a layer, representing the layer's drawable elements, such as images, text, and shapes. The other groups almost all apply processing on top of Content, such as translation and scaling, masking, and filter effects.
The elements Content can represent are roughly: shape, text, solid, PreCompose, and image. Given the diversity of Content, caching must target the specific element; for these 5 elements, different data contents are cached:
shape represents a shape, which can be a regular circle, rectangle, or five-pointed star, or an irregular shape. When rendering graphics, a shape's description needs only the path information and paint information to be drawn to completely reproduce the represented graphic. What is cached for shape is a custom data structure whose content is a shape paint (ShapePaint), containing the two property values: the drawn path information and paint information.
text represents text content, for example a scrolling line of prompt text in an animation uses text. The information text rendering needs is similar to shape's: path information and paint information. In addition, text can set the font type; for instance, fonts in one animation may come from different types, so the font type information also needs to be recorded as a property value.
solid is a special case of shape: it represents a solid color-filled rectangle, which can be described by the rectangle's width, height, and fill color, i.e., these three property values can be cached. solid also has many use cases, such as graphics in simple vector animations, layer masks, or pre-composition masks.
image represents a picture; when rendering, the picture's specific pixel information is needed, so the pixel information is cached as a property value. In addition, the picture's displayed size changes, especially under scaling, when the picture's width and height change and rendering must compress or stretch accordingly; hence the picture's original width and height also need caching as property values.
PreCompose represents pre-composition, composed of one or more layers, and may further nest other compositions. In this embodiment, PreCompose serves sub-composition use; for caching PreCompose, the layers it contains are traversed and then cached with the layer caching strategy.
2) Transform
An animation in fact changes along the time axis, and most changes are composed of at least one of translation, scaling, rotation, and fade-in/fade-out.
In 2D space, to transform a rectangle: if it is a translation animation, the rectangle's initial position point can be determined (position, recording coordinates x and y), then the x- and y-axis translation distances xPosition and yPosition (floating-point distances) are recorded; with the coordinates x and y and the distances, a translation animation can be reconstructed. If it is a scaling animation, an anchorPoint is first determined, and adding the scale information allows complete reconstruction. If it is a rotation animation, an anchorPoint is first determined, and adding the rotation information allows the rotation animation to be reconstructed. If it is a fade opacity animation, recording the opacity information suffices.
In summary, any change in 2D space can be recorded with a small amount of information: the translation, scaling, and rotation records can use a matrix (Matrix), and opacity changes can use opacity. With these two pieces of information the animation's changes can be reconstructed, so when caching, only Matrix and opacity need to be cached as property values.
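Collapsing the Transform2D attributes into one matrix can be sketched as below. This is an assumption-laden illustration: the composition order (translate by -anchor, then scale, then rotate, then translate to position) and the row-major 3x3 layout are choices made for the example, not the PAG SDK's documented Matrix semantics.

```python
import math

def transform_matrix(anchor, position, scale, rotation):
    """One 3x3 affine matrix from Transform2D-style attributes.

    anchor, position: (x, y); scale: (sx, sy); rotation: radians.
    Together with opacity, this matrix is all that needs caching.
    """
    ax, ay = anchor
    px, py = position
    sx, sy = scale
    c, s = math.cos(rotation), math.sin(rotation)
    # M = T(position) . R(rotation) . S(scale) . T(-anchor)
    return [
        [c * sx, -s * sy, px - (c * sx * ax - s * sy * ay)],
        [s * sx,  c * sy, py - (s * sx * ax + c * sy * ay)],
        [0.0, 0.0, 1.0],
    ]
```

A pure translation leaves the linear part as the identity and puts the position in the last column, which is why caching the matrix (plus opacity) is enough to replay any 2D change.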
3) Mask
Using a mask, two simple graphics can form a brand-new graphic through a combination relationship; Mask here denotes a mask within a layer. When rendering, a mask is drawn using path information and the mask mode. For example, a rectangular mask essentially needs the rectangle's path information recorded, i.e., the rectangle's vertex information, from which the mask rectangle can be reconstructed; so the path information and mask mode need caching.
The mask mode determines the final presentation. For example: Add mode means the mask is added and displayed directly; Subtract mode means the part covered by the mask is subtracted; Intersect mode means the intersection of the original graphic and the mask is displayed; Difference mode means the non-intersecting parts of the original graphic and the mask are displayed.
4) TrackMatter
TrackMatter is similar to Mask and is also used as a mask; the difference is that Mask is a mask within one layer, while TrackMatter uses one layer as another layer's mask based on opacity and luminance. For TrackMatter, what needs caching is the same as for Mask: path information and mode information.
TrackMatter has the following modes: Alpha mode controls the display area by the layer's opaque region; AlphaInverted mode controls the layer's display area by its transparent region; Luma and LumaInverted modes are similar to Alpha, except the condition controlling the display area is luminance.
5) layerStyle and effect
layerStyle and effect are filters; a filter can be understood as processing an image's pixels to produce a personalized effect. In PAG animations, layerStyle and effect can be used to support drop-shadow effects, whose implementation relies on filter information; for example, a shadow image is drawn over the original, with the image's parameters computed from the shadow's color, direction, distance, and so on, so the filter information can be cached.
3. Rendering data cache
For vector graphics, the drawing data cache guarantees that the drawing data of any instant of the animation can be fetched directly, but converting that drawing data into directly drawable rendering data still requires computation, especially for vector graphics with complex path information: rendering must parse, one by one, the path's point coordinates and point-to-point path descriptions, and then draw the path information completely into the vector graphic. Text drawing data is the same: the drawing data cache holds only the text's path and paint information, and every draw incurs the conversion cost. The rendering data cache removes this cost. The specific approach: after obtaining the drawing data, create an off-screen buffer, perform off-screen rendering to obtain rendering data, and cache the rendering data; when drawing is needed, read the rendering data directly from the cache, saving the time of converting drawing data into rendering data.
Since vector graphics can be scaled freely, their displayed size depends on the size of the external display container View. Because the texture size is already fixed when the rendering data is cached, to guarantee the final animation's clarity, the texture size should be applied at the maximum scale ratio when caching, so that when the cached rendering data is applied in a smaller scene it can be compressed rather than stretched, guaranteeing clarity. To compute this scale ratio: first obtain the scale ratio S1 of the external View carrying and displaying the animation; then traverse each composition and layer level by level, taking their scale ratios S2; then find each node's scale ratio down the tree structure until content is found, taking the content's original width and height and its scale ratio S3 relative to its parent node; finally, the content's scale ratio relative to the display View is the cumulative product of these tree-level scale ratios S1, S2, and S3.
Because of nesting, PreCompose-type and image-type content differ somewhat. For example, for an animation containing a PreCompose type with 3 layers, the same picture may appear in every layer; but since it is the same picture, only one copy needs caching. When choosing the cached picture's size, the scale ratios of every place using the picture must be considered, taking the maximum; when caching, the cached texture size is then that of the maximum-scale-ratio scenario, and when this texture is drawn on a smaller View, the cached image can be compressed directly, guaranteeing clarity.
In addition, a setCacheScale(float scale) interface is provided for external size setting. When displaying an animation, the PAGView showing it is often placed inside other containers; when these parent containers' sizes change, PAGView must be notified and the scale ratio applied to the rendering data cache, which guarantees that the cached rendering data is optimal while preserving clarity. Moreover, if the maximum scale ratio still occupies considerable memory, and further compressing the cached picture can still guarantee clarity without affecting the animation effect, another scale factor can be set on top of the maximum scale ratio to reduce the cache size.
For optimizing PreCompose boundary measurement, the minimum rectangular area containing all content is computed, then intersected with PreCompose's bounding rectangle to obtain PreCompose's true bounds. Fig. 13 shows an animation composed of only one layer A and one PreCompose, where PreCompose's bounds are the small thin black rectangle; the actual content area is part of the five-pointed star and the cloud, with the other part outside the rectangular border. During rendering, since most of this rectangle's content is blank, it can be optimized; the optimization may include: first computing the smallest rectangle containing the actual content, yielding the black dashed rectangle A in Fig. 13; then intersecting rectangle A with PreCompose's bounds, yielding the thick black rectangle B, which is the actually displayed area.
4. Rendering time comparison
For the same PAG animation, "人山人海" ("a sea of people"), the render time was inspected with the animation previewer (PAGViewer). Fig. 16 shows the render time with all caching strategies enabled, Fig. 17 with the rendering data cache disabled, Fig. 18 with the drawing data cache disabled, and Fig. 19 with all caching strategies disabled.
From Figs. 16-19, the render time is smallest with all caching strategies enabled, taking only 25 ms, as shown in dashed box 1602 of Fig. 16; with all caching strategies disabled, the render time is largest at 1052 ms, as shown in dashed box 1902 of Fig. 19. The render times in Figs. 16 and 19 show that fully enabling and fully disabling the caching strategies differ by nearly two orders of magnitude.
In addition, with the rendering data cache disabled (i.e., using the drawing data cache), the render time is 863 ms, as shown in dashed box 1702 of Fig. 17; with the drawing data cache disabled (i.e., using the rendering data cache), the render time is 184 ms, as shown in dashed box 1802 of Fig. 18. The two caching strategies yield different render times, but the drawing data cache and the rendering data cache are two different strategies suited to different application scenarios and so play different roles; the "人山人海" animation here simply showcases the rendering data cache's excellent performance in complex-graphics situations.
Adopting the above embodiments yields the following beneficial effects:
1) The animation's render time can be reduced substantially: compared with traditional animation rendering methods, the animation rendering method of the embodiments of this invention can reduce render time by 1-2 orders of magnitude. The reduced render time makes playback very smooth, achieving good playback even on low-end phones, especially when multiple animations play simultaneously on the same client page; if they are the same animation, decoding time is greatly saved.
2) Different caching strategies are applied to different layers, and when computing animation scale ratios, different computation rules are applied to pictures and pre-composed compositions.
Figs. 2-4, 6-12, and 15 are schematic flowcharts of the animation rendering method in embodiments. It should be understood that although the steps in these flowcharts are shown sequentially as indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, there is no strict ordering restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in Figs. 2-4, 6-12, and 15 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is also not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
As shown in Figure 20, in one embodiment an animation rendering apparatus is provided, the apparatus comprising: a file obtaining module 2002, a determining module 2004, a data caching module 2006, a data reading module 2008, and an animation rendering module 2010, wherein:

the file obtaining module 2002 is configured to obtain an animation file in a target format;

the determining module 2004 is configured to, when the animation file is decoded, determine, from decoded animation drawing data, animation drawing data intervals that satisfy a static condition;

the data caching module 2006 is configured to cache the initial animation drawing data of the animation drawing data interval;

the data reading module 2008 is configured to, when the animation drawing data corresponding to a to-be-played frame during playback of the animation file satisfies the static condition, read the cached initial animation drawing data corresponding to the to-be-played frame;

the animation rendering module 2010 is configured to perform animation rendering according to the read initial animation drawing data.

In the above embodiment, the initial animation drawing data of the animation drawing data that satisfies the static condition is cached. When playback reaches a frame whose attribute values satisfy the static condition, the initial animation drawing data corresponding to the to-be-played frame is obtained directly from the cache, without parsing the animation file again to obtain the animation drawing data for that frame, thereby avoiding a large amount of computation, saving time during rendering, and making animation playback smoother.
In one embodiment, the determining module 2004 is further configured to: decode the animation file to obtain an animation file object; read the animation drawing data in the animation file object; and determine, from the read animation drawing data, animation drawing data intervals that satisfy the static condition.

In the above embodiment, the animation file is decoded to obtain an animation file object, the animation drawing data in the object is read, and the animation drawing data intervals in the object that satisfy the static condition are found, so that the initial animation drawing data of each interval can be cached. When playback reaches a frame whose attribute values satisfy the static condition, the initial animation drawing data corresponding to the to-be-played frame is obtained directly from the cache, avoiding a large amount of computation, saving rendering time, and making playback smoother.

In one embodiment, the animation file comprises at least one animation layer; each animation layer comprises at least two animation attribute groups; each animation attribute group comprises at least two animation attribute values. The determining module 2004 is further configured to: when the animation file is decoded, determine the animation attribute value intervals in the animation attribute group that satisfy the static condition; take the intersection of the animation attribute value intervals as the group drawing data interval of the animation attribute group; and determine the intersection of the group drawing data intervals as the animation drawing data interval of the animation layer.

In the above embodiment, the animation attribute value intervals that satisfy the static condition are determined first, and the animation drawing data interval is then computed by bubbling the intersections up level by level, so that the initial animation drawing data of the interval can be cached. When playback reaches a frame whose attribute values satisfy the static condition, the initial animation drawing data corresponding to the to-be-played frame is obtained directly from the cache, avoiding a large amount of computation, saving rendering time, and making playback smoother.
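The interval intersections described above (value intervals → group interval → layer interval) can be sketched as follows (a minimal illustration; intervals are modeled as sorted frame ranges and the function names are hypothetical):

```python
from functools import reduce

def interval_intersection(intervals_a, intervals_b):
    """Intersect two sorted lists of static frame intervals (start, end)."""
    result, i, j = [], 0, 0
    while i < len(intervals_a) and j < len(intervals_b):
        start = max(intervals_a[i][0], intervals_b[j][0])
        end = min(intervals_a[i][1], intervals_b[j][1])
        if start <= end:
            result.append((start, end))
        # advance whichever interval ends first
        if intervals_a[i][1] < intervals_b[j][1]:
            i += 1
        else:
            j += 1
    return result

def group_interval(value_intervals):
    """Static interval of an attribute group = intersection of all
    of its attribute-value static intervals."""
    return reduce(interval_intersection, value_intervals)

# two attribute values, static over overlapping frame ranges
print(group_interval([[(0, 10), (20, 30)], [(5, 25)]]))  # [(5, 10), (20, 25)]
```

The same `group_interval` reduction applied to the group intervals of a layer yields the layer's animation drawing data interval.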
In one embodiment, as shown in Figure 21, the apparatus further comprises an attribute value obtaining module 2012, wherein:

the data caching module 2006 is further configured to, when there is no intersection between the animation attribute value intervals, cache the initial animation attribute values of the animation attribute value intervals;

the data reading module 2008 is further configured to, when the animation drawing data corresponding to the to-be-played frame during playback of the animation file does not satisfy the static condition and there is no intersection between the animation attribute value intervals, read the cached initial animation attribute values corresponding to the to-be-played frame;

the attribute value obtaining module 2012 is configured to obtain, from the animation file object decoded from the animation file, the animation attribute values that correspond to the to-be-played frame and do not satisfy the static condition;

the animation rendering module 2010 is further configured to perform animation rendering according to the read initial animation attribute values and the obtained animation attribute values.

In the above embodiment, when neither the animation layer nor the animation attribute group has a static interval, the initial animation attribute values of the animation attribute values that do have static intervals are cached. When the to-be-played frame hits a static interval of an animation attribute value, the initial animation attribute value corresponding to the frame is read from the cache, so there is no need to parse the attribute values belonging to the static interval from the animation file object, which reduces computation, saves rendering time, and makes playback smoother.
In one embodiment, as shown in Figure 21, the apparatus further comprises a data obtaining module 2014, wherein:

the data caching module 2006 is further configured to, when there is an intersection between the animation attribute value intervals but no intersection between the group drawing data intervals, cache the initial group drawing data of the group drawing data intervals;

the data reading module 2008 is further configured to, when the animation drawing data corresponding to the to-be-played frame during playback of the animation file does not satisfy the static condition, read the cached initial group drawing data corresponding to the to-be-played frame;

the data obtaining module 2014 is configured to obtain, from the animation file object decoded from the animation file, the group drawing data that corresponds to the to-be-played frame and does not satisfy the static condition;

the animation rendering module 2010 is further configured to perform animation rendering according to the read initial group drawing data and the obtained group drawing data.

In the above embodiment, when the animation layer has no static interval but the animation attribute group does, the initial group drawing data of the group drawing data that has static intervals is cached. When the to-be-played frame hits a static interval of the group drawing data, the initial group drawing data corresponding to the frame is read from the cache, so there is no need to parse the group drawing data belonging to the static interval from the animation file object, which reduces computation, saves rendering time, and makes playback smoother.
In one embodiment, when the animation attribute group is a drawable-element attribute group, the animation attribute group comprises at least two drawable elements, each drawable element comprising at least two animation attribute values; as shown in Figure 21, the apparatus further comprises an intersection computing module 2016, wherein:

the determining module 2004 is further configured to determine the intersection of the animation attribute value intervals as the element interval of the drawable element;

the intersection computing module 2016 is configured to compute the intersection between the element intervals of the drawable elements;

the determining module 2004 is further configured to determine the intersection between the element intervals as the group drawing data interval of the animation attribute group.

In one embodiment, when the animation drawing data corresponding to the to-be-played frame during playback of the animation file satisfies the static condition, the data reading module 2008 is further configured to: when the animation file is played in more than one application, determine the playback progress corresponding to each application; and when the animation drawing data of the to-be-played frame corresponding to the playback progress satisfies the static condition, read the cached initial animation drawing data that corresponds to the playback progress and is shared by the more than one application.

In the above embodiment, when multiple applications on the client play the same animation, the applications share the animation drawing data parsed from the same animation file, and the initial animation drawing data is cached; this both reduces cache space and reduces the parsing computation. When the animation drawing data corresponding to each to-be-played frame satisfies the static condition, the initial animation drawing data corresponding to the frame is obtained from the cache, without parsing the animation drawing data belonging to the static interval from the animation file object, which reduces computation, saves rendering time, and makes playback smoother.
In one embodiment, the animation file includes a vector graphic; the data obtaining module 2014 is further configured to obtain the animation drawing data for the vector graphic decoded from the animation file;

the animation rendering module 2010 is further configured to perform off-screen rendering on the animation drawing data to obtain animation render data;

the data caching module 2006 is further configured to cache the animation render data;

the data reading module 2008 is further configured to, when the to-be-played frame during playback of the animation file is a vector graphic animation frame, read the cached animation render data corresponding to the to-be-played frame.

In one embodiment, the animation rendering module 2010 is further configured to: determine a first size ratio between the inner container hosting the animation frame and the outer container; determine a second size ratio between the animation layer and the inner container; determine a third size ratio between the animation drawing data in the drawable-element attribute group and the animation layer; and determine the scale ratio according to the first size ratio, the second size ratio, and the third size ratio.

In one embodiment, the animation rendering module 2010 is further configured to: determine the size of the outer container used to display the vector graphic; determine the scale ratio of the size of the animation drawing data relative to the size of the outer container; create an off-screen buffer; and, in the off-screen buffer, perform animation rendering on the animation drawing data according to the size of the outer container and the scale ratio to obtain the animation render data.

In the above embodiment, the animation drawing data of the vector graphic is pre-rendered first, and the resulting animation render data is then cached. When the to-be-played frame during playback is a vector graphic animation frame, the animation render data corresponding to the frame is read from the cache, avoiding the cost of converting animation drawing data into animation render data, which effectively reduces rendering time and helps improve playback smoothness.
In one embodiment, as shown in Figure 21, the apparatus further comprises a scale ratio obtaining module 2018 and a size adjusting module 2020, wherein:

the scale ratio obtaining module 2018 is configured to, when the ratio corresponding to a node in the node tree changes, obtain the changed scale ratio, the node tree being composed of the outer container, the inner container, the animation layer, the drawable-element attribute group, and the animation drawing data in the drawable-element attribute group;

the size adjusting module 2020 is configured to adjust the size of the cached animation render data according to the scale ratio; or

the scale ratio obtaining module 2018 is further configured to obtain an input scale ratio;

the size adjusting module 2020 is further configured to adjust the size of the cached animation render data according to the input scale ratio.

In the above embodiment, pre-rendering the animation drawing data of the vector graphic avoids the cost of converting animation drawing data into animation render data at playback time, effectively reducing rendering time and helping improve playback smoothness.
In one embodiment, the data obtaining module 2014 is further configured to, when the animation file includes multiple animation layers and the vector graphics contained in the animation layers are the same but of different sizes, obtain the animation render data corresponding to the vector graphic with the largest size;

the data caching module 2006 is further configured to cache the animation render data corresponding to the vector graphic with the largest size.

In the above embodiment, when the same vector graphic appears in multiple layers, the animation render data corresponding to the largest vector graphic is cached. On one hand, this avoids caching three copies of the animation render data simultaneously, reducing cache usage; on the other hand, caching the render data at the largest size avoids stretching the vector graphic at display time and the resulting loss of image sharpness.

In one embodiment, the data caching module 2006 is further configured to: when the animation file includes a pre-compose attribute group, determine the animation region in the pre-compose attribute group that contains first target animation render data; when the size of the non-animation-render-data region contained in the animation region reaches a preset condition, determine the smallest animation region containing second target animation render data, the first target animation render data being a part of the second target animation render data; determine the intersection region between the obtained animation region and the smallest animation region; and determine the intersection region as the animation region of the pre-compose attribute group.

In the above embodiment, when the animation region of a pre-compose animation attribute group contains a large invalid area, the animation region needs to be re-determined so that the animation render data is contained in the region while the invalid area is minimized, which avoids the extra computation rendering might otherwise consume and avoids scanning the invalid area when presenting to screen, reducing the on-screen time.
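The largest-size caching described above can be sketched as follows (a minimal illustration; the names and the stand-in rasterizer are hypothetical, and a real implementation would render into an off-screen surface):

```python
def cache_vector_texture(rasterize, content_size, usage_scales):
    """Pre-render a vector graphic once, at the largest scale at which it
    is used in any layer, and cache the resulting texture. Smaller usages
    then downscale this texture instead of stretching a smaller one."""
    max_scale = max(usage_scales)
    width = int(content_size[0] * max_scale)
    height = int(content_size[1] * max_scale)
    texture = rasterize(width, height)   # stand-in for off-screen rendering
    return texture, max_scale

# same 100x50 graphic used in three layers at scales 0.5, 2.0, and 1.0
tex, s = cache_vector_texture(lambda w, h: ("tex", w, h), (100, 50), [0.5, 2.0, 1.0])
print(tex, s)  # ('tex', 200, 100) 2.0
```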
Figure 22 shows an internal structure diagram of a computer device in one embodiment. The computer device may specifically be the terminal 110 in Figure 1. As shown in Figure 22, the computer device includes a processor, a memory, a network interface, an input apparatus, and a display screen connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement the animation rendering method. The internal memory may also store a computer program that, when executed by the processor, causes the processor to perform the animation rendering method. The display screen of the computer device may be a liquid crystal display or an electronic ink display; the input apparatus may be a touch layer covering the display screen, a key, trackball, or touchpad provided on the housing of the computer device, or an external keyboard, touchpad, mouse, or the like.

Those skilled in the art will understand that the structure shown in Figure 22 is merely a block diagram of a partial structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.

In one embodiment, the animation rendering apparatus provided by the present application may be implemented in the form of a computer program that can run on the computer device shown in Figure 22. The memory of the computer device may store the program modules constituting the animation rendering apparatus, for example the file obtaining module 2002, the determining module 2004, the data caching module 2006, the data reading module 2008, and the animation rendering module 2010 shown in Figure 20. The computer program constituted by the program modules causes the processor to perform the steps of the animation rendering method of the embodiments of the present application described in this specification.

For example, the computer device shown in Figure 22 may perform S202 through the file obtaining module 2002 of the animation rendering apparatus shown in Figure 20; S204 through the determining module 2004; S206 through the data caching module 2006; S208 through the data reading module 2008; and S210 through the animation rendering module 2010.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the above animation rendering method. The steps of the animation rendering method here may be the steps of the animation rendering method of each of the above embodiments.

In one embodiment, a computer-readable storage medium is provided, storing a computer program that, when executed by a processor, causes the processor to perform the steps of the above animation rendering method. The steps of the animation rendering method here may be the steps of the animation rendering method of each of the above embodiments.

A person of ordinary skill in the art will understand that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the program may be stored in a non-volatile computer-readable storage medium, and when executed may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features of the above embodiments are described; however, as long as a combination of these technical features is not contradictory, it should be considered within the scope of this specification.

The above embodiments express only several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the patent scope of the present application. It should be noted that a person of ordinary skill in the art may make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of the present application patent shall be subject to the appended claims.

Claims (20)

  1. An animation rendering method, performed by a computer device, the method comprising:
    obtaining an animation file in a target format;
    when decoding the animation file, determining, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition;
    caching initial animation drawing data of the animation drawing data interval;
    when animation drawing data corresponding to a to-be-played frame during playback of the animation file satisfies the static condition, reading the cached initial animation drawing data corresponding to the to-be-played frame; and
    performing animation rendering according to the read initial animation drawing data.
  2. The method according to claim 1, wherein the determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition comprises:
    decoding the animation file to obtain an animation file object;
    reading animation drawing data in the animation file object; and
    determining, from the read animation drawing data, an animation drawing data interval that satisfies the static condition.
  3. The method according to claim 1, wherein the animation file comprises at least one animation layer; each animation layer comprises at least two animation attribute groups; each animation attribute group comprises at least two animation attribute values; and the determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition comprises:
    when decoding the animation file, determining, from decoded animation drawing data, animation attribute value intervals in the animation attribute group that satisfy the static condition;
    taking an intersection of the animation attribute value intervals as a group drawing data interval of the animation attribute group; and
    determining an intersection of the group drawing data intervals as the animation drawing data interval of the animation layer.
  4. The method according to claim 3, further comprising:
    when there is no intersection between the animation attribute value intervals, caching initial animation attribute values of the animation attribute value intervals; and
    when the animation drawing data corresponding to the to-be-played frame during playback of the animation file does not satisfy the static condition and there is no intersection between the animation attribute value intervals,
    reading the cached initial animation attribute values corresponding to the to-be-played frame;
    obtaining, from the animation file object decoded from the animation file, animation attribute values that correspond to the to-be-played frame and do not satisfy the static condition; and
    performing animation rendering according to the read initial animation attribute values and the obtained animation attribute values.
  5. The method according to claim 3, further comprising:
    when there is an intersection between the animation attribute value intervals but no intersection between the group drawing data intervals, caching initial group drawing data of the group drawing data intervals; and
    when the animation drawing data corresponding to the to-be-played frame during playback of the animation file does not satisfy the static condition,
    reading the cached initial group drawing data corresponding to the to-be-played frame;
    obtaining, from the animation file object decoded from the animation file, group drawing data that corresponds to the to-be-played frame and does not satisfy the static condition; and
    performing animation rendering according to the read initial group drawing data and the obtained group drawing data.
  6. The method according to claim 3, wherein when the animation attribute group is a drawable-element attribute group, the animation attribute group comprises at least two drawable elements; each drawable element comprises at least two animation attribute values; and the method further comprises:
    determining an intersection of the animation attribute value intervals as an element interval of the drawable element; and
    computing an intersection between the element intervals of the drawable elements;
    wherein the taking an intersection of the animation attribute value intervals as a group drawing data interval of the animation attribute group comprises:
    determining the intersection between the element intervals as the group drawing data interval of the animation attribute group.
  7. The method according to claim 1, wherein the reading, when the animation drawing data corresponding to the to-be-played frame during playback of the animation file satisfies the static condition, the cached initial animation drawing data corresponding to the to-be-played frame comprises:
    when the animation file is played in more than one application, determining a playback progress corresponding to each application; and
    when animation drawing data of a to-be-played frame corresponding to the playback progress satisfies the static condition,
    reading cached initial animation drawing data that corresponds to the playback progress and is shared by the more than one application.
  8. The method according to claim 1, wherein the animation file includes a vector graphic; and the method further comprises:
    obtaining animation drawing data for the vector graphic decoded from the animation file;
    performing off-screen rendering on the animation drawing data to obtain animation render data;
    caching the animation render data; and
    when the to-be-played frame during playback of the animation file is a vector graphic animation frame, reading the cached animation render data corresponding to the to-be-played frame.
  9. The method according to claim 8, wherein the performing off-screen rendering on the animation drawing data to obtain animation render data comprises:
    determining a size of an outer container used to display the vector graphic;
    determining a scale ratio of a size of the animation drawing data relative to the size of the outer container;
    creating an off-screen buffer; and
    in the off-screen buffer, performing animation rendering on the animation drawing data according to the size of the outer container and the scale ratio to obtain the animation render data.
  10. The method according to claim 9, wherein the determining a scale ratio of a size of the animation drawing data relative to the size of the outer container comprises:
    determining a first size ratio between an inner container hosting the animation frame and the outer container;
    determining a second size ratio between the animation layer and the inner container;
    determining a third size ratio between animation drawing data in a drawable-element attribute group and the animation layer; and
    determining the scale ratio according to the first size ratio, the second size ratio, and the third size ratio.
  11. The method according to claim 10, further comprising:
    when a ratio corresponding to a node in a node tree changes, obtaining the changed scale ratio, the node tree being composed of the outer container, the inner container, the animation layer, the drawable-element attribute group, and the animation drawing data in the drawable-element attribute group; and
    adjusting a size of the cached animation render data according to the scale ratio; or
    obtaining an input scale ratio, and adjusting the size of the cached animation render data according to the input scale ratio.
  12. The method according to claim 8, further comprising:
    when the animation file includes multiple animation layers and the vector graphics contained in the animation layers are the same but of different sizes,
    obtaining animation render data corresponding to the vector graphic with the largest size;
    wherein the caching the animation render data comprises:
    caching the animation render data corresponding to the vector graphic with the largest size.
  13. The method according to any one of claims 1 to 12, wherein after the performing animation rendering according to the read initial animation drawing data, the method further comprises:
    when the animation file includes a pre-compose attribute group, determining an animation region in the pre-compose attribute group that contains first target animation render data;
    when a size of a non-animation-render-data region contained in the animation region reaches a preset condition, determining a smallest animation region containing second target animation render data, the first target animation render data being a part of the second target animation render data;
    determining an intersection region between the obtained animation region and the smallest animation region; and
    determining the intersection region as the animation region of the pre-compose attribute group.
  14. An animation rendering apparatus, the apparatus comprising:
    a file obtaining module, configured to obtain an animation file in a target format;
    a determining module, configured to, when the animation file is decoded, determine, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition;
    a data caching module, configured to cache initial animation drawing data of the animation drawing data interval;
    a data reading module, configured to, when animation drawing data corresponding to a to-be-played frame during playback of the animation file satisfies the static condition, read the cached initial animation drawing data corresponding to the to-be-played frame; and
    an animation rendering module, configured to perform animation rendering according to the read initial animation drawing data.
  15. A computer-readable storage medium storing a computer program that, when executed by a processor, causes the processor to perform the following steps:
    obtaining an animation file in a target format;
    when decoding the animation file, determining, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition;
    caching initial animation drawing data of the animation drawing data interval;
    when animation drawing data corresponding to a to-be-played frame during playback of the animation file satisfies the static condition, reading the cached initial animation drawing data corresponding to the to-be-played frame; and
    performing animation rendering according to the read initial animation drawing data.
  16. The computer-readable storage medium according to claim 15, wherein when the computer program is executed by the processor to perform the step of determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition, the processor specifically performs the following steps:
    decoding the animation file to obtain an animation file object;
    reading animation drawing data in the animation file object; and
    determining, from the read animation drawing data, an animation drawing data interval that satisfies the static condition.
  17. The computer-readable storage medium according to claim 15, wherein the animation file comprises at least one animation layer; each animation layer comprises at least two animation attribute groups; each animation attribute group comprises at least two animation attribute values; and
    when the computer program is executed by the processor to perform the step of determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition, the processor specifically performs the following steps:
    when decoding the animation file, determining animation attribute value intervals in the animation attribute group that satisfy the static condition;
    taking an intersection of the animation attribute value intervals as a group drawing data interval of the animation attribute group; and
    determining an intersection of the group drawing data intervals as the animation drawing data interval of the animation layer.
  18. A computer device, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the following steps:
    obtaining an animation file in a target format;
    when decoding the animation file, determining, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition;
    caching initial animation drawing data of the animation drawing data interval;
    when animation drawing data corresponding to a to-be-played frame during playback of the animation file satisfies the static condition, reading the cached initial animation drawing data corresponding to the to-be-played frame; and
    performing animation rendering according to the read initial animation drawing data.
  19. The computer device according to claim 18, wherein when the computer program is executed by the processor to perform the step of determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition, the processor specifically performs the following steps:
    decoding the animation file to obtain an animation file object;
    reading animation drawing data in the animation file object; and
    determining, from the read animation drawing data, an animation drawing data interval that satisfies the static condition.
  20. The computer device according to claim 18, wherein the animation file comprises at least one animation layer; each animation layer comprises at least two animation attribute groups; each animation attribute group comprises at least two animation attribute values; and
    when the computer program is executed by the processor to perform the step of determining, when decoding the animation file, from decoded animation drawing data, an animation drawing data interval that satisfies a static condition, the processor specifically performs the following steps:
    when decoding the animation file, determining animation attribute value intervals in the animation attribute group that satisfy the static condition;
    taking an intersection of the animation attribute value intervals as a group drawing data interval of the animation attribute group; and
    determining an intersection of the group drawing data intervals as the animation drawing data interval of the animation layer.
PCT/CN2020/095013 2019-06-11 2020-06-09 Animation rendering method and apparatus, computer-readable storage medium and computer device WO2020248951A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP20823436.9A EP3985612A4 (en) 2019-06-11 2020-06-09 METHOD AND DEVICE FOR PLAYING ANIMATION, COMPUTER READABLE STORAGE MEDIUM AND COMPUTER DEVICE
JP2021563216A JP7325535B2 (ja) 2019-06-11 2020-06-09 アニメーションレンダリング方法、装置、コンピュータ読み取り可能な記憶媒体、及びコンピュータ機器
US17/379,998 US11783522B2 (en) 2019-06-11 2021-07-19 Animation rendering method and apparatus, computer-readable storage medium, and computer device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910501994.8A CN112070864A (zh) 2019-06-11 2019-06-11 动画渲染方法、装置、计算机可读存储介质和计算机设备
CN201910501994.8 2019-06-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/379,998 Continuation US11783522B2 (en) 2019-06-11 2021-07-19 Animation rendering method and apparatus, computer-readable storage medium, and computer device

Publications (1)

Publication Number Publication Date
WO2020248951A1 true WO2020248951A1 (zh) 2020-12-17

Family

ID=73658396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/095013 WO2020248951A1 (zh) 2019-06-11 2020-06-09 动画渲染方法、装置、计算机可读存储介质和计算机设备

Country Status (5)

Country Link
US (1) US11783522B2 (zh)
EP (1) EP3985612A4 (zh)
JP (1) JP7325535B2 (zh)
CN (1) CN112070864A (zh)
WO (1) WO2020248951A1 (zh)





Also Published As

Publication number Publication date
EP3985612A1 (en) 2022-04-20
US11783522B2 (en) 2023-10-10
EP3985612A4 (en) 2022-08-03
JP2022535669A (ja) 2022-08-10
JP7325535B2 (ja) 2023-08-14
CN112070864A (zh) 2020-12-11
US20210350601A1 (en) 2021-11-11

