CN114241094A - Animation drawing method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN114241094A
CN114241094A (application CN202111543512.9A)
Authority
CN
China
Prior art keywords
animation
target
frame
animations
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111543512.9A
Other languages
Chinese (zh)
Inventor
姜雪纯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202111543512.9A priority Critical patent/CN114241094A/en
Publication of CN114241094A publication Critical patent/CN114241094A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T13/00 Animation
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F9/00 Arrangements for program control, e.g. control units
            • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
              • G06F9/46 Multiprogramming arrangements
                • G06F9/48 Program initiating; program switching, e.g. by interrupt
                  • G06F9/4806 Task transfer initiation or dispatching
                    • G06F9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
                      • G06F9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
                • G06F9/50 Allocation of resources, e.g. of the central processing unit [CPU]
                  • G06F9/5005 Allocation of resources to service a request
                    • G06F9/5027 Allocation of resources to service a request, the resource being a machine, e.g. CPUs, servers, terminals
                      • G06F9/5038 Allocation of resources considering the execution order of a plurality of tasks, e.g. taking priority or time dependency constraints into consideration
          • G06F2209/00 Indexing scheme relating to G06F9/00
            • G06F2209/50 Indexing scheme relating to G06F9/50
              • G06F2209/5011 Pool

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of computer processing, and in particular to an animation drawing method, an animation drawing device, a storage medium, and an electronic device. The animation drawing method comprises: in response to drawing instructions for a plurality of target animations in a graphical user interface, obtaining animation drawing information of the target animations, wherein the animation drawing information comprises the animation position of each target animation in the graphical user interface and the animation resources of that target animation; and asynchronously drawing the plurality of target animations on the graphical user interface, according to their animation drawing information, through an animation playing container that includes an asynchronous thread processing mechanism. The animation drawing method provided by the present disclosure thereby achieves asynchronous drawing of SVGA animations.

Description

Animation drawing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer processing, and in particular, to an animation rendering method, apparatus, storage medium, and electronic device.
Background
Animations often need to be played in many graphical user interface scenes. For example, in a live-streaming scene, animations for services such as seat frames, pendants, and stamps can be played on a user seat in a voice room, and all of them are displayed using SVGA.
In the prior art, SVGA animations are played using SVGAImageView. However, SVGAImageView must run on the main UI thread, which imposes a large performance cost on the main thread.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure aims to provide an animation drawing method, an animation drawing device, a storage medium, and an electronic device, so as to achieve asynchronous drawing of SVGA animations.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the embodiments of the present disclosure, there is provided an animation drawing method, comprising: in response to drawing instructions for a plurality of target animations in a graphical user interface, obtaining animation drawing information of the plurality of target animations, wherein the animation drawing information comprises the animation position of each target animation in the graphical user interface and the animation resources of that target animation; and asynchronously drawing the plurality of target animations on the graphical user interface, according to their animation drawing information, through an animation playing container that includes an asynchronous thread processing mechanism.
According to some embodiments of the present disclosure, based on the foregoing solution, asynchronously drawing the plurality of target animations on the graphical user interface according to their animation drawing information through the animation playing container comprising the asynchronous thread processing mechanism includes: constructing an animation agent class corresponding to each target animation; using each animation agent class to create, according to the animation drawing information of its corresponding target animation, the drawing instructions for that target animation; and asynchronously executing the drawing instructions of each target animation through the animation playing container comprising the asynchronous thread processing mechanism, so as to asynchronously draw the plurality of target animations on the graphical user interface.
According to some embodiments of the present disclosure, based on the foregoing scheme, obtaining the animation drawing information of the plurality of target animations includes obtaining the animation position of a first target animation in the graphical user interface, the first target animation being any one of the plurality of target animations. Obtaining the animation position of the first target animation in the graphical user interface comprises: determining a target associated object corresponding to the first target animation based on the drawing instruction of the first target animation; acquiring the position information of the target associated object in the graphical user interface, and acquiring the relative position information of the first target animation and the target associated object in the graphical user interface; and determining the animation position of the first target animation in the graphical user interface according to the position information and the relative position information.
According to some embodiments of the present disclosure, based on the foregoing solution, after obtaining the animation position of the first target animation in the graphical user interface, the method further includes: storing the position information of the target associated object corresponding to the first target animation and the relative position information of the first target animation and the target associated object into an animation proxy class corresponding to the first target animation; and updating the position information of the target associated object corresponding to the first target animation in the animation proxy class when the position information of the target associated object in the graphical user interface changes.
According to some embodiments of the present disclosure, based on the foregoing scheme, after obtaining the animation resource of the target animation, the method further includes: converting animation resources of the target animation into display frames by using the drawable resource class; and storing the display frame of the target animation into an animation proxy class corresponding to the target animation.
According to some embodiments of the present disclosure, based on the foregoing solution, the animation proxy class stores the animation position and display frames corresponding to a target animation, and the drawing instructions comprise a single-frame drawing request for each display frame of each target animation. Using each animation agent class to create the drawing instructions of its corresponding target animation according to that animation's drawing information comprises: using the animation agent class corresponding to a first target animation to create, according to the animation drawing information of the first target animation, a single-frame drawing request for each display frame of the first target animation, the first target animation being any one of the plurality of target animations. This in turn comprises: extracting the current display frame from the display frames of the first target animation stored in its animation agent class, and rewriting the drawing logic of the current display frame to obtain the current drawing frame of the first target animation; extracting the current drawing position of the current drawing frame based on the animation position of the first target animation stored in its animation agent class; creating a single-frame drawing request for the current drawing frame according to the current drawing frame and the current drawing position, and sending the single-frame drawing request to the animation playing container; and, when the animation playing container finishes drawing the current drawing frame based on the single-frame drawing request, repeating the creating and sending of single-frame drawing requests until the single-frame drawing requests of all display frames of the first target animation have been sent.
According to some embodiments of the present disclosure, based on the foregoing solution, asynchronously executing each drawing instruction through the animation playing container comprising the asynchronous thread processing mechanism includes sequentially executing, through the animation playing container, the single-frame drawing request of each display frame of the first target animation. This comprises: in response to the single-frame drawing request of the current drawing frame of the first target animation sent by its animation agent class, extracting, through the animation playing container, the current drawing frame and the current drawing position corresponding to the single-frame drawing request; deleting the display frame previously drawn at the current drawing position, and drawing the current drawing frame of the first target animation at the current drawing position; and repeating this drawing process in sequence until all display frames of the first target animation have been drawn.
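The single-frame protocol summarized above (send one request, let the container draw it, then send the next on completion) can be sketched in plain Java. This is an illustrative model only; the names (FrameLoop, Container, sendAll) are assumptions, not classes of this disclosure, and a real implementation would invoke the completion callback from the container's asynchronous thread.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the single-frame protocol above: the delegate sends one draw
// request at a time and only sends the next after the container reports the
// current frame drawn, until every display frame has been requested. All
// names are illustrative assumptions, not classes of this disclosure.
class FrameLoop {
    interface Container {
        void draw(String frame, Runnable onDone); // onDone fires when drawn
    }

    static List<String> sendAll(List<String> frames, Container container) {
        List<String> sent = new ArrayList<>();
        new Runnable() {
            int next = 0;
            public void run() {
                if (next >= frames.size()) return; // all frames requested
                String frame = frames.get(next++);
                sent.add(frame);
                container.draw(frame, this); // repeat on completion
            }
        }.run();
        return sent;
    }
}
```

With a container that reports completion immediately, the requests go out strictly in frame order, matching the repeat-until-sent behaviour described above.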
According to a second aspect of the embodiments of the present disclosure, there is provided an animation drawing device including: the response module is used for responding to drawing instructions of a plurality of target animations in the graphical user interface and acquiring animation drawing information of the plurality of target animations; wherein the animation drawing information comprises an animation position of the target animation in the graphical user interface and animation resources of the target animation; and the drawing module is used for asynchronously drawing the target animations on the graphical user interface according to the animation drawing information of the target animations through an animation playing container comprising an asynchronous thread processing mechanism.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements an animation rendering method as in the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the animation drawing method as in the above embodiments.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
According to the technical solutions provided by some embodiments of the present disclosure, when drawing instructions for a plurality of target animations in a graphical user interface are received, the animation drawing information of each target animation is first obtained, and the asynchronous thread processing mechanism of the animation playing container then draws the plurality of target animations according to that information. On the one hand, drawing the target animations on an asynchronous thread reduces UI-thread overhead and alleviates page stutter; on the other hand, the asynchronous thread pool of the animation playing container enables asynchronous drawing of a plurality of target animations, reducing the overhead of position measurement and animation drawing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 schematically illustrates a flow diagram of an animation drawing method in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a graphical user interface of a voice room in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates the position relationship between an animation playing container and a voice room in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow diagram of an animation drawing method in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates the composition of an animation drawing device in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a computer-readable storage medium in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates the structure of a computer system of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The technical solution of the present application can be applied to animation playing scenes in the interface of a terminal device, such as a mobile phone, a computer, or a handheld terminal. The animation can be an SVGA animation: SVGA is an animation format compatible with Android, iOS, and the web, and can realize various animations, such as a pendant animation or a seat-frame animation in a live-streaming scene.
Implementation details of the technical solution of the embodiments of the present disclosure are set forth in detail below.
Fig. 1 schematically illustrates a flow diagram of an animation rendering method in an exemplary embodiment of the present disclosure. As shown in fig. 1, the animation drawing method includes steps S1 to S2:
step S1, responding to the drawing instruction of the multiple target animations in the graphical user interface, and obtaining animation drawing information of the multiple target animations; the animation drawing information comprises an animation position of the target animation in the graphical user interface and animation resources of the target animation;
and step S2, drawing the multiple target animations asynchronously on the graphical user interface according to the animation drawing information of the multiple target animations through the animation playing container comprising the asynchronous thread processing mechanism.
According to the technical solutions provided by some embodiments of the present disclosure, when drawing instructions for a plurality of target animations in a graphical user interface are received, the animation drawing information of each target animation is first obtained, and the asynchronous thread processing mechanism of the animation playing container then draws the plurality of target animations according to that information. On the one hand, drawing the target animations on an asynchronous thread reduces UI-thread overhead and alleviates page stutter; on the other hand, the asynchronous thread pool of the animation playing container enables asynchronous drawing of a plurality of target animations, reducing the overhead of position measurement and animation drawing.
Hereinafter, each step of the animation rendering method in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In step S1, in response to a drawing instruction of a plurality of target animations in the graphical user interface, acquiring animation drawing information of the plurality of target animations; the animation drawing information comprises the animation position of the target animation in the graphical user interface and animation resources of the target animation.
Specifically, target animations often need to be drawn in a graphical user interface. For example, in a live-streaming scene, a voice room includes a plurality of user seats, and each seat may simultaneously display several service functions such as a seat frame, a pendant, or a stamp, all of which are displayed as SVGA animations. The target animation may therefore be an SVGA animation that needs to be drawn at a seat.
FIG. 2 schematically illustrates a graphical user interface of a voice room in an exemplary embodiment of the present disclosure. As shown in FIG. 2, the live scene comprises nine seats 201-209, among which seat 202 and seat 209 carry a pendant service. Different pendant services have different SVGA animation drawing positions: the pendant of seat 202 is drawn above and below seat 202, while the pendant of seat 209 is drawn above and to the right of seat 209.
Thus, different seats may require different SVGA animations. When a seat in the voice room needs to display a service, the live platform client generates a drawing instruction for the seat animation by combining the target seat and the service, for example the "hat" pendant animation of target seat 209 in FIG. 2.
Because different seats need different target animations drawn, the live platform client generates a separate drawing instruction for each target animation, and the live platform background server responds to the plurality of drawing instructions by obtaining the animation drawing information of the target animation corresponding to each instruction.
The animation drawing information may include the animation position and the animation resources to be drawn. The animation resources are the animation content data to be drawn, such as seat frames, pendants, or stamps; different services of the same seat can correspond to different animation resources. The animation position is the coordinate position at which the animation needs to be drawn on the graphical user interface.
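As a minimal illustration, the animation drawing information just described (an animation position plus an animation resource) could be modeled as a simple value class. The field names here are assumptions made for the sketch, not structures taken from this disclosure.

```java
// Hypothetical model of the "animation drawing information" described above:
// an animation position in the graphical user interface plus the animation
// resource to draw. Field names are illustrative, not from this disclosure.
class AnimDrawInfo {
    final int x;           // animation position: x coordinate in the interface
    final int y;           // animation position: y coordinate in the interface
    final String resource; // animation resource, e.g. the SVGA content to draw

    AnimDrawInfo(int x, int y, String resource) {
        this.x = x;
        this.y = y;
        this.resource = resource;
    }
}
```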
In step S2, the plurality of target animations are asynchronously drawn on the graphical user interface according to the animation drawing information of the plurality of target animations by the animation playback container including the asynchronous thread processing mechanism.
In one embodiment of the present disclosure, a video container, AudioHallAnimTextureView (the animation playing container), is developed in advance and arranged on the upper layer of the graphical user interface.
Fig. 3 schematically illustrates the position relationship between the animation playing container and a voice room in an exemplary embodiment of the present disclosure. Referring to FIG. 3, 301 is a voice room, the circles 302 indicate the seats of the room, and the animation playing container 303 is disposed on the upper layer of the voice room 301.
The animation playing container is a video container: it can pack files of multiple formats together into one file, and video playing software can open the container's package file for playback.
Specifically, AudioHallAnimTextureView is implemented based on TextureView (texture view), which supports asynchronous rendering of a content stream. The drawing of SVGA animations is delegated to AudioHallAnimTextureView, through which asynchronous drawing of the SVGA animations can be realized.
In addition, AudioHallAnimTextureView also stores the AnimDrawableDelegate instances whose drawing has been delegated to it, so that multiple SVGA animations can be managed in a unified manner.
The animation playing container contains a Handler (asynchronous thread processing mechanism) constructed on an asynchronous thread. The Handler processes asynchronous messages: after receiving a drawing instruction, it performs the drawing operation on the thread on which the Handler was created, so that one video container can handle the playback of multiple SVGA animations at the same time.
Compared with the prior art, in which each SVGA animation played with SVGAImageView requires its own SVGAImageView video container, the method provided by the present disclosure uses only a single AudioHallAnimTextureView animation playing container, and this one view can process the playback of multiple SVGA animations simultaneously. When there are many SVGA animations, this reduces the views' overhead in measurement, layout, and drawing, and avoids page stutter.
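The single-container mechanism described above can be approximated on a plain JVM by one worker thread that drains draw commands from a queue, standing in for the Handler created on the asynchronous thread. This is a hedged sketch under that substitution; all class and method names are assumptions.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

// Plain-JVM approximation of the mechanism above: a single container owns one
// worker thread (standing in for the Handler's asynchronous thread) and drains
// draw commands for many animations from one queue. All names are assumptions.
class AsyncAnimContainer {
    private final BlockingQueue<Runnable> commands = new LinkedBlockingQueue<>();
    final List<String> drawLog = new CopyOnWriteArrayList<>();
    private final Thread worker = new Thread(() -> {
        try {
            while (true) commands.take().run(); // draw off the UI thread
        } catch (InterruptedException e) {
            // container shut down; exit the draw loop
        }
    });

    AsyncAnimContainer() { worker.start(); }

    // Analogous to posting a draw message to the Handler.
    void post(String animationId) {
        commands.add(() -> drawLog.add("drew " + animationId));
    }

    // Let pending draws finish, then stop the worker thread.
    void shutdown() throws InterruptedException {
        commands.add(worker::interrupt);
        worker.join();
    }
}
```

Because the queue is FIFO, commands for several animations are served in order by the one worker, which is the property that lets a single container replace one view per animation.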
In an embodiment of the present disclosure, after receiving a drawing instruction, animation drawing information corresponding to the instruction may be obtained, and animation drawing is performed according to animation resource data and animation position information through an asynchronous thread.
Compared with the prior art, in which animation drawing with SVGAImageView must be performed on the main thread, completing the animation drawing on an asynchronous thread created by the AudioHallAnimTextureView animation playing container reduces UI-thread overhead and alleviates page stutter.
In one embodiment of the present disclosure, for step S2, asynchronously drawing the plurality of target animations on the graphical user interface according to the animation drawing information of the plurality of target animations through the animation play container including the asynchronous thread processing mechanism, includes:
step S21, constructing animation agent classes corresponding to a plurality of target animations respectively;
step S22, creating drawing instructions corresponding to the target animations according to the animation drawing information of the target animations corresponding to the animation agents respectively by using the animation agents;
step S23, asynchronously executing the drawing instruction corresponding to each target animation through the animation playing container including the asynchronous thread processing mechanism, so as to asynchronously draw the multiple target animations on the graphical user interface.
Next, step S21 to step S23 will be explained in detail.
In step S21, animation agent classes corresponding to the plurality of target animations are constructed.
Specifically, after a drawing instruction for a target animation is received, an AnimDrawableDelegate (animation agent class) corresponding to that target animation is created, which manages the entire drawing process of the target animation. The animation agent class follows the notion of a class in programming: a class is an abstract concept, an abstraction of a kind of thing.
In order to manage animation drawing, the animation agent class needs to store in advance the animation position and display frames of its corresponding target animation. Therefore, after the animation agent class is created in step S21, the animation position and display frames of the target animation also need to be stored into the AnimDrawableDelegate.
Animation positions in animation agent class
The animation position of the target animation can be determined by the target associated object corresponding to the target animation. Generally, the animation can be drawn for the associated object on the graphical user interface, so the position of the animation drawing is related to the position of the associated object, and also related to the relative position between the animation and the associated object.
Taking the graphical user interface shown in fig. 3 as an example, a plurality of seats, that is, associated objects, are laid out in one voice room, each seat has its corresponding coordinates, and the target animation is drawn for the seat, so when determining the animation position, it is necessary to determine the position of the target seat corresponding to the animation drawing instruction, and it is also necessary to acquire the relative position of the SVGA animation and the seat, for example, drawing on the left or right side.
Therefore, further, for any first target animation among the plurality of target animations, obtaining the animation position of the first target animation in the graphical user interface comprises: determining the target associated object corresponding to the first target animation based on the drawing instruction of the first target animation; acquiring the position information of the target associated object in the graphical user interface, and acquiring the relative position information of the first target animation and the target associated object in the graphical user interface; and determining the animation position of the first target animation in the graphical user interface according to the position information and the relative position information.
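The position derivation just described (animation position equals the associated object's position plus the animation's relative offset) reduces to simple coordinate addition, sketched below with assumed names:

```java
// Sketch of the position derivation described above: the animation's absolute
// position in the interface is the associated object's (seat's) position plus
// the animation's offset relative to that object. Names are assumptions.
class Position {
    final int x, y;
    Position(int x, int y) { this.x = x; this.y = y; }
}

class AnimPositionResolver {
    // animation position = seat position + (dx, dy) relative offset
    static Position resolve(Position seat, int dx, int dy) {
        return new Position(seat.x + dx, seat.y + dy);
    }
}
```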
The position information of the target associated object corresponding to the target animation in the graphical user interface can be obtained in two ways.
In the first, the position information of all associated objects in the graphical user interface is stored in advance, and the position information of the target object is extracted directly when a drawing instruction is received. With this approach, drawing an animation only requires acquiring, from the drawing instruction, the relative position between the animation and its associated object; the animation position can then be calculated from the stored object positions. This reduces the overhead of page measurement to a certain extent and optimizes page performance.
In the second, the position information of the associated object is acquired in real time when the drawing instruction is received. Since the position information is not stored in advance, both the position of the associated object and the relative position of the target animation with respect to it must be acquired upon receiving the drawing instruction, and the animation position is then calculated from the acquired information.
It should be noted that both the position information of the target associated object and the relative position between the target animation and the target associated object need to be stored in the pre-created animation agent class AnimDrawableDelegate, through which drawing management of the target animation is then implemented.
Thus, after the animation position of the target animation in the graphical user interface is obtained and the animation proxy class has been constructed, the method further comprises: storing the position information of the target associated object corresponding to the first target animation, together with the relative position information of the first target animation and the target associated object, into the animation proxy class corresponding to the first target animation.
In one embodiment of the present disclosure, the method further comprises: and when the position information of the target associated object in the graphical user interface changes, updating the position information of the target associated object corresponding to the first target animation in the animation proxy class.
Specifically, the layout of the associated objects in the graphical user interface may change. When it does, the position of each associated object in the layout changes as well, so the position information stored in the animation agent class must be updated in real time to ensure that the animation position is calculated correctly and the target animation is drawn accurately and without errors.
For example, in the voice room shown in fig. 3, when the layout of the seats changes, the animation agent class is notified in time and the latest position information of the seats is updated synchronously.
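The caching-and-update behavior of the proxy can be sketched as a small class. This is an illustrative model only; the class name, method names, and tuple representation are assumptions, not the patent's actual API.

```python
# Minimal sketch of an animation proxy that caches the associated object's
# position and the animation's relative offset, and refreshes the cached
# position when the layout changes. All names are illustrative assumptions.

class AnimDelegateSketch:
    def __init__(self, object_pos, relative_pos):
        self.object_pos = object_pos      # position of the associated object
        self.relative_pos = relative_pos  # animation offset from that object

    def on_layout_changed(self, new_object_pos):
        # Called when the seat layout changes, so that subsequent frames
        # are drawn at the correct, up-to-date position.
        self.object_pos = new_object_pos

    def draw_position(self):
        return (self.object_pos[0] + self.relative_pos[0],
                self.object_pos[1] + self.relative_pos[1])

delegate = AnimDelegateSketch((120, 300), (40, 0))
delegate.on_layout_changed((200, 300))  # the seat moved in the new layout
```

After the layout notification, `draw_position()` reflects the new seat position without re-measuring the page.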
② Display frames in the animation agent class
For the display frames of the target animation, a separate tool is needed to convert the animation resources into display frames in advance; the display frames are then stored in the animation proxy class.
Further, after the animation resources of the target animation are obtained and the animation agent class is built, the method further comprises: converting the animation resources of the target animation into display frames by using a drawable resource class; and storing the display frames of the target animation into the animation proxy class corresponding to the target animation.
The drawable resource class is the CcSVGADrawable (frame animation) module. This module performs frame-animation calculation on the acquired animation resources of the seat animation, converting them into Drawable display frames that a View can render on screen. Like the animation proxy class, it is a "class" in the programming sense.
After a display frame is calculated, the CcSVGADrawable module stores the current display frame in the AnimDrawableDelegate animation agent class for subsequent animation rendering by the agent class.
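The conversion-and-storage step above can be sketched as follows. Plain strings stand in for real Drawable frames, and a dict stands in for the proxy's storage; all names are illustrative assumptions rather than the patent's implementation.

```python
# Sketch of the frame-conversion step: a drawable-resource helper converts
# an animation resource into per-frame display frames and hands the current
# one to the proxy's storage. Strings stand in for Drawable objects.

class FrameConverterSketch:
    def __init__(self, resource_frames):
        self.frames = list(resource_frames)  # frames computed from the resource
        self.index = 0

    def next_display_frame(self):
        # Return the current frame and advance to the next one.
        frame = self.frames[self.index]
        self.index += 1
        return frame

converter = FrameConverterSketch(["frame0", "frame1", "frame2"])
proxy_store = {}                                    # stand-in for the proxy class
proxy_store["current_frame"] = converter.next_display_frame()
```

Each time a new frame is computed, the proxy's stored "current frame" is overwritten, mirroring the notification described above.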
In step S22, a drawing instruction corresponding to each target animation is created by each animation agent class based on the animation drawing information of the target animation corresponding to that agent class.
The AnimDrawableDelegate animation agent class stores the position where the SVGA animation needs to be displayed and the display frames to be played. When the SVGA is drawn, the agent class creates a drawing instruction for each frame and delegates the drawing work by sending the instruction to the AudioHalAnimTextureView playing container, informing it that the SVGA should be drawn at the animation position.
In addition, since one target animation comprises a plurality of frames, creating the drawing instruction of the target animation amounts to creating a single-frame drawing request for each display frame in the target animation.
Further, in step S22, the proxying process described for the plurality of animation agent classes is considered for a single animation agent class, that is: creating, by the animation agent class corresponding to the first target animation and according to the animation drawing information of the first target animation, a single-frame drawing request for each display frame of the first target animation specifically comprises the following steps:
step S221, extracting a current display frame based on the display frame of the first target animation stored in the animation agent class corresponding to the first target animation, and rewriting the drawing logic of the current display frame to obtain a current drawing frame of the first target animation;
step S222, extracting the current drawing position of the current drawing frame of the first target animation based on the animation position of the first target animation stored in the animation agent class corresponding to the first target animation;
step S223, creating a single-frame drawing request corresponding to the current drawing frame in the first target animation according to the current drawing frame and the current drawing position of the first target animation, and sending the single-frame drawing request to an animation playing container;
step S224, when the animation playing container completes the drawing of the current drawing frame in the first target animation based on the single-frame drawing request, the above process of creating and sending the single-frame drawing request is repeated until the sending of the single-frame drawing requests of all the display frames of the first target animation is completed.
In step S221, since the view container used for animation rendering in the present disclosure is the pre-developed AudioHalAnimTextureView animation playing container, the drawing logic of the Drawable display frame obtained from the CcSVGADrawable module needs to be rewritten so that the frame can be drawn by the AudioHalAnimTextureView playing container.
Therefore, the current display frame is extracted from the display frames stored in the animation agent class, and its drawing logic is rewritten to obtain a current drawing frame that the animation playing container can draw. It should be noted that the current display frame carries the drawing logic produced by the CcSVGADrawable module, while the current drawing frame carries the rewritten drawing logic readable by the animation playing container; the content of the two frames is the same, only the drawing logic differs.
In step S222, the AnimDrawableDelegate animation agent class extracts the animation position of the target animation acquired in advance as the current drawing position of the current drawing frame, that is, the position at which the animation is to be drawn.
In step S223, a single-frame drawing request is created from the current drawing frame and the current drawing position and sent to the animation playing container, thereby delegating the drawing of the current display frame to the AudioHalAnimTextureView animation playing container.
In step S224, the animation playing container executes the single-frame drawing request and feeds the drawing result back to the animation agent class. If a drawing-success indication is received, the above steps are repeated for the next frame of the animation: the next display frame is extracted, its drawing logic is rewritten to obtain the current drawing frame, and a new single-frame drawing request is created. This cycle continues until the last key frame of the SVGA has been processed, visually forming the animation effect.
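Steps S221 through S224 can be sketched as a simple per-frame loop. The trivial tagging function stands in for the real logic rewriting, and all names are illustrative assumptions.

```python
# Sketch of steps S221–S224: for each stored display frame, rewrite its
# drawing logic into a form the playing container can draw, pair it with
# the draw position, and emit one single-frame request.

def rewrite_drawing_logic(display_frame):
    # Stand-in for adapting the frame's drawing logic to the container.
    return "drawable:" + display_frame

def single_frame_requests(display_frames, position):
    requests = []
    for frame in display_frames:          # S221: extract and rewrite the frame
        requests.append({                 # S223: create the single-frame request
            "frame": rewrite_drawing_logic(frame),
            "pos": position,              # S222: current drawing position
        })
    return requests                       # S224: one request per display frame

requests = single_frame_requests(["f0", "f1", "f2"], (160, 300))
```

In the patent's flow the requests are sent one at a time, each after the previous frame reports drawing success; the list here simply collects them for illustration.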
In step S23, the drawing instructions corresponding to the target animations are asynchronously executed by the animation playing container including the asynchronous thread processing mechanism, so as to asynchronously draw the target animations on the graphical user interface.
During animation drawing, since the AnimDrawableDelegate animation agent class delegates the animation content to the AudioHalAnimTextureView animation playing container, the drawing itself is completed by the playing container. Specifically, the playing container is responsible for executing the drawing instructions of a plurality of different target animations, and the drawing instruction of each target animation comprises single-frame drawing requests for a plurality of different display frames.
Further, when the animation playing container executes the single-frame drawing requests, step S23, in which the animation playing container executes the single-frame drawing request of each display frame of the first target animation in sequence, specifically comprises the following steps:
step S231, in response to a single-frame drawing request of a current drawing frame of the first target animation, which is sent by an animation agent class corresponding to the first target animation, extracting the current drawing frame and the current drawing position corresponding to the single-frame drawing request through an animation playing container;
step S232, deleting the display frame corresponding to the current drawing position, and drawing the current drawing frame of the first target animation at the current drawing position;
in step S233, the above-mentioned drawing process of the current drawn frame is sequentially repeated until all the display frames of the first target animation are drawn.
Specifically, the animation playing container responds to a single-frame drawing request sent by the animation agent class, and the single-frame drawing request comprises a frame to be drawn and a position.
The Canvas control in the view container clears the existing Drawable at the current drawing position, and the current drawing frame obtained from the single-frame drawing request is drawn at that position on the Canvas.
Based on the method, when an SVGA animation needs to be drawn, an AnimDrawableDelegate animation agent class is created specifically to manage the drawing of that SVGA animation. On the one hand, the agent class stores the information needed for drawing the target animation, such as the display frames and the animation position, and can determine the display position of the SVGA animation, its start and end, and the Drawable frame of the SVGA currently being displayed; on the other hand, it delegates the drawing work to the animation playing container AudioHalAnimTextureView, informing the container, for each frame, which frame of the SVGA animation should be drawn at which position.
Meanwhile, one AudioHalAnimTextureView animation playing container simultaneously serves a plurality of AnimDrawableDelegate animation agent classes, so the playing container can draw a plurality of SVGA animations at the same time, reducing the overhead on the UI thread and alleviating page stuttering.
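The asynchronous mechanism can be sketched with a request queue drained by a worker thread, so that drawing work from several proxies stays off the UI thread. Python's `queue` and `threading` stand in for the Android machinery; all names are illustrative assumptions.

```python
# Sketch of the playing container's asynchronous mechanism: single-frame
# requests from several proxies are queued and executed on a worker thread
# rather than the UI thread.

import queue
import threading

class PlaybackContainerSketch:
    def __init__(self):
        self.requests = queue.Queue()
        self.drawn = []                 # record of executed requests
        threading.Thread(target=self._worker, daemon=True).start()

    def submit(self, request):          # called by any animation proxy
        self.requests.put(request)

    def _worker(self):
        while True:
            request = self.requests.get()
            self.drawn.append(request)  # stand-in for drawing on the Canvas
            self.requests.task_done()

container = PlaybackContainerSketch()
container.submit({"anim": "a", "frame": "f0"})
container.submit({"anim": "b", "frame": "f0"})
container.requests.join()               # wait until both frames are "drawn"
```

Because one queue serves all proxies, requests from several SVGA animations interleave on the same worker, which models how one playing container draws multiple animations concurrently.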
FIG. 4 schematically illustrates a flowchart of an animation drawing method in an exemplary embodiment of the disclosure. The animation drawing process is described in detail below with reference to fig. 4. The specific flow is as follows:
and S401, acquiring the relative position of the SVGA animation and the target seat.
Step S402, acquiring the position of a target seat; can be extracted from animdawabledelegate.
And step S403, obtaining SVGA animation resources.
Step S404, constructing an animDrawableDelegate animation agent class;
step S405, calculating, by the CcSVGADrawable, the frame Drawable that currently needs to be drawn, i.e. calculating it from the animation resources;
step S406, notifying the AnimDrawableDelegate, i.e. storing the calculated current display frame Drawable in the AnimDrawableDelegate.
Step S407, notifying the AudioHalAnimTextureView, i.e. the AnimDrawableDelegate sends a single-frame drawing request according to the current drawing frame and the current drawing position.
Step S408, clearing the picture at the position according to the current drawing position;
step S409, drawing a current drawing frame on a Canvas;
step S410, judging whether the current drawing frame is the last frame; if not, executing step S405, and repeatedly calculating and drawing; if yes, go to step S411 to finish drawing.
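The overall loop of fig. 4 can be sketched as follows. A dict keyed by position stands in for the real Canvas, and all names are illustrative assumptions.

```python
# Sketch of the overall loop of fig. 4: for each frame, compute the drawable
# (S405), clear whatever occupies the draw position (S408), draw the new
# frame (S409), and repeat until the last frame (S410/S411).

def play_animation(frames, position):
    canvas = {}
    for frame in frames:             # S405: current frame to draw
        canvas.pop(position, None)   # S408: clear the picture at the position
        canvas[position] = frame     # S409: draw the current frame
    return canvas                    # S411: drawing finished

final_canvas = play_animation(["f0", "f1", "f2"], (160, 300))
```

After the loop ends, only the last frame remains at the draw position, matching the clear-then-draw cycle the flowchart describes.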
Fig. 5 schematically illustrates a composition diagram of an animation rendering apparatus in an exemplary embodiment of the present disclosure, and as shown in fig. 5, the animation rendering apparatus 500 may include a response module 501 and a rendering module 502. Wherein:
a response module 501, configured to respond to a drawing instruction of a plurality of target animations in a graphical user interface, to obtain animation drawing information of the plurality of target animations; the animation drawing information comprises an animation position of the target animation in the graphical user interface and animation resources of the target animation;
the drawing module 502 is configured to draw the multiple target animations asynchronously on the graphical user interface according to animation drawing information of the multiple target animations via an animation playing container including an asynchronous thread processing mechanism.
According to an exemplary embodiment of the present disclosure, the drawing module 502 includes a construction unit, an instruction unit, and a drawing unit, the construction unit is configured to construct animation agent classes corresponding to a plurality of target animations, respectively, and the instruction unit is configured to create a drawing instruction corresponding to each target animation by using each animation agent class according to animation drawing information of the target animation corresponding to the animation agent class, respectively; the drawing unit is used for asynchronously executing drawing instructions corresponding to the target animations through an animation playing container comprising an asynchronous thread processing mechanism so as to asynchronously draw the target animations on the graphical user interface.
According to an exemplary embodiment of the present disclosure, the response module 501 is further configured to determine a target associated object corresponding to the first target animation based on the drawing instruction of the first target animation; acquire position information of the target associated object in the graphical user interface, and acquire relative position information of the first target animation and the target associated object in the graphical user interface; and determine the animation position of the first target animation in the graphical user interface according to the position information and the relative position information.
According to an exemplary embodiment of the disclosure, the response module 501 is further configured to, after obtaining an animation position of the first target animation in the graphical user interface, store position information of a target associated object corresponding to the first target animation and relative position information of the first target animation and the target associated object into an animation proxy class corresponding to the first target animation; and when the position information of the target associated object in the graphical user interface changes, updating the position information of the target associated object corresponding to the first target animation in the animation proxy class.
According to an exemplary embodiment of the present disclosure, the response module 501 is further configured to, after obtaining the animation resource of the target animation, convert the animation resource of the target animation into a display frame by using a drawable resource class; and storing the display frame of the target animation into the animation proxy class corresponding to the target animation.
According to the exemplary embodiment of the disclosure, the animation proxy class stores animation positions and display frames corresponding to the target animation; the drawing instruction comprises a single-frame drawing request of each display frame in each target animation; the instruction unit is also used for extracting a current display frame based on the display frame of the first target animation stored in the animation agent class corresponding to the first target animation, and rewriting the drawing logic of the current display frame to obtain a current drawing frame of the first target animation; extracting the current drawing position of the current drawing frame of the first target animation based on the animation position of the first target animation stored in the animation agent class corresponding to the first target animation; creating a single-frame drawing request corresponding to the current drawing frame in the first target animation according to the current drawing frame and the current drawing position of the first target animation, and sending the single-frame drawing request to an animation playing container; when the animation playing container finishes the drawing of the current drawing frame in the first target animation based on the single-frame drawing request, the process of creating and sending the single-frame drawing request is repeated until the single-frame drawing requests of all display frames of the first target animation are sent.
According to an exemplary embodiment of the present disclosure, the drawing unit is further configured to, in response to a single-frame drawing request of a current drawing frame of the first target animation, sent by the animation agent class corresponding to the first target animation, extract, by the animation playing container, the current drawing frame and the current drawing position corresponding to the single-frame drawing request; deleting the display frame corresponding to the current drawing position, and drawing the current drawing frame of the first target animation at the current drawing position; and repeating the drawing process of the current drawing frame in sequence until all the display frames of the first target animation are drawn.
The details of each module in the animation rendering device 500 are described in detail in the corresponding animation rendering method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, there is also provided a storage medium capable of implementing the above-described method. Fig. 6 schematically illustrates a schematic diagram of a computer-readable storage medium in an exemplary embodiment of the disclosure, and as shown in fig. 6, a program product 600 for implementing the above method according to an embodiment of the disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a mobile phone. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided. Fig. 7 schematically shows a structural diagram of a computer system of an electronic device in an exemplary embodiment of the disclosure.
It should be noted that the computer system 700 of the electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for system operation are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An Input/Output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, the processes described below with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU)701, performs various functions defined in the system of the present disclosure.
It should be noted that the computer readable medium shown in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
As another aspect, the present disclosure also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An animation drawing method for providing a graphical user interface through a terminal device, comprising:
responding to drawing instructions of a plurality of target animations in the graphical user interface, and acquiring animation drawing information of the plurality of target animations; wherein the animation drawing information comprises an animation position of the target animation in the graphical user interface and animation resources of the target animation;
and asynchronously drawing the target animations on the graphical user interface according to the animation drawing information of the target animations through an animation playing container comprising an asynchronous thread processing mechanism.
2. The animation rendering method according to claim 1, wherein the asynchronously rendering the plurality of target animations on the gui according to the animation rendering information of the plurality of target animations via an animation playback container including an asynchronous thread processing mechanism comprises:
constructing animation agent classes corresponding to the target animations respectively;
creating drawing instructions corresponding to the target animations by using the animation agent classes according to the animation drawing information of the target animations corresponding to the animation agent classes;
and asynchronously executing the drawing instruction corresponding to each target animation through an animation playing container comprising an asynchronous thread processing mechanism so as to asynchronously draw the plurality of target animations on the graphical user interface.
3. The animation drawing method according to claim 2, wherein the acquiring animation drawing information of the plurality of target animations comprises: acquiring an animation position of a first target animation in the graphical user interface; wherein the first target animation is any one of the plurality of target animations;
the acquiring of the animation position of the first target animation in the graphical user interface comprises:
determining a target associated object corresponding to the first target animation based on the drawing instruction of the first target animation;
acquiring position information of the target associated object in the graphical user interface, and acquiring relative position information of the first target animation and the target associated object in the graphical user interface;
and determining the animation position of the first target animation in the graphical user interface according to the position information and the relative position information.
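Claim 3 derives the animation's absolute position from two pieces of data: the associated object's own position in the interface, and the animation's offset relative to that object. A minimal sketch of that composition (the function name and coordinate convention are assumptions, not from the patent):

```python
def animation_position(object_position, relative_position):
    """Absolute animation position = associated object's position in the
    interface + the animation's offset relative to that object."""
    ox, oy = object_position
    rx, ry = relative_position
    return (ox + rx, oy + ry)

# E.g. an effect anchored 0 px right / 12 px above its associated object:
print(animation_position((120, 300), (0, -12)))  # (120, 288)
```

Anchoring to an associated object this way means the animation tracks the object automatically: only the object's position needs updating when it moves, as claim 4 then exploits.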
4. The animation drawing method according to claim 3, further comprising, after acquiring the animation position of the first target animation in the graphical user interface:
storing the position information of the target associated object corresponding to the first target animation and the relative position information of the first target animation and the target associated object into an animation proxy class corresponding to the first target animation;
and updating the position information of the target associated object corresponding to the first target animation in the animation proxy class when the position information of the target associated object in the graphical user interface changes.
5. The animation drawing method according to claim 2, wherein after acquiring the animation resources of the target animation, the method further comprises:
converting the animation resources of the target animation into display frames by using a drawable resource class;
and storing the display frame of the target animation into an animation proxy class corresponding to the target animation.
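Claim 5 has each proxy cache the decoded display frames of its animation so drawing never has to touch the raw resource again. The patent's "drawable resource class" suggests Android's `Drawable`; the platform-neutral sketch below stands in a trivial decode step for it, and `AnimationProxy` is a hypothetical name:

```python
class AnimationProxy:
    """Per-animation proxy that caches decoded display frames (claim 5)."""

    def __init__(self, animation_id):
        self.animation_id = animation_id
        self.display_frames = []

    def load_frames(self, animation_resource):
        # Stand-in for conversion via a drawable resource class: each raw
        # entry of the animation resource becomes one cached display frame.
        self.display_frames = [f"decoded:{raw}" for raw in animation_resource]

proxy = AnimationProxy("coin")
proxy.load_frames(["frame0.png", "frame1.png"])
print(len(proxy.display_frames))  # 2
```

Caching the frames on the proxy keeps the per-frame drawing path (claim 6) free of resource decoding.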
6. The animation drawing method according to claim 2, wherein each animation proxy class stores the animation position and the display frames corresponding to its target animation; the drawing instruction comprises a single-frame drawing request for each display frame of each target animation; and the creating, by using each animation proxy class, of the drawing instruction for the target animation corresponding to that animation proxy class according to the animation drawing information of that target animation comprises:
creating, by using the animation proxy class corresponding to a first target animation, a single-frame drawing request for each display frame of the first target animation according to the animation drawing information of the first target animation; wherein the first target animation is any one of the plurality of target animations;
the creating of the single-frame drawing request for each display frame of the first target animation comprises:
extracting a current display frame from the display frames of the first target animation stored in the animation proxy class corresponding to the first target animation, and rewriting the drawing logic of the current display frame to obtain a current drawing frame of the first target animation; and
extracting a current drawing position of the current drawing frame of the first target animation from the animation position of the first target animation stored in the animation proxy class corresponding to the first target animation;
creating a single-frame drawing request corresponding to the current drawing frame in the first target animation according to the current drawing frame and the current drawing position of the first target animation, and sending the single-frame drawing request to the animation playing container;
and when the animation playing container finishes drawing the current drawing frame of the first target animation based on the single-frame drawing request, repeating the processes of creating and sending a single-frame drawing request until the single-frame drawing requests of all display frames of the first target animation have been sent.
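Claim 6 describes a frame-by-frame handshake on the proxy side: send one single-frame drawing request, wait until the container reports it finished, then create and send the next, until every display frame has been requested. A minimal synchronous sketch of that loop (all names hypothetical; the container's completion callback is modeled as a blocking `draw` call):

```python
class SingleFrameRequest:
    """One request to draw one frame at one position (claim 6)."""
    def __init__(self, animation_id, frame, position):
        self.animation_id = animation_id
        self.frame = frame
        self.position = position

def send_all_frames(animation_id, display_frames, position, container):
    """Create and send one single-frame request per display frame,
    advancing only after the container finishes the previous frame."""
    sent = 0
    for frame in display_frames:
        request = SingleFrameRequest(animation_id, frame, position)
        container.draw(request)  # returns once this frame is drawn
        sent += 1
    return sent

class RecordingContainer:
    """Test double standing in for the animation playing container."""
    def __init__(self):
        self.completed = []
    def draw(self, request):
        self.completed.append(request.frame)

container = RecordingContainer()
n = send_all_frames("coin", ["f0", "f1", "f2"], (10, 20), container)
print(n, container.completed)  # 3 ['f0', 'f1', 'f2']
```

Issuing at most one outstanding request per animation keeps each animation's frames strictly ordered while still letting the container interleave requests from different animations.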
7. The animation drawing method according to claim 6, wherein the asynchronously executing each drawing instruction through the animation playing container comprising the asynchronous thread processing mechanism comprises:
sequentially executing the single-frame drawing requests of the display frames of the first target animation through the animation playing container;
the sequentially executing of the single-frame drawing requests of the display frames of the first target animation through the animation playing container comprises:
in response to a single-frame drawing request for the current drawing frame of the first target animation sent by the animation proxy class corresponding to the first target animation, extracting, through the animation playing container, the current drawing frame and the current drawing position corresponding to the single-frame drawing request;
deleting the display frame corresponding to the current drawing position, and drawing the current drawing frame of the first target animation at the current drawing position;
and repeating the drawing process of the current drawing frame in sequence until all the display frames of the first target animation are drawn.
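On the container side, claim 7 executes the requests in order and, for each one, first deletes whatever frame is currently shown at the target position before drawing the new frame, so successive frames replace rather than stack. A sketch keyed by position (the dict-as-canvas representation is purely illustrative):

```python
class PlayingContainer:
    """Executes single-frame requests in order; each new frame replaces
    the frame previously drawn at the same position (claim 7)."""

    def __init__(self):
        self.canvas = {}  # position -> currently drawn frame

    def execute(self, position, frame):
        self.canvas.pop(position, None)  # delete the old frame, if any
        self.canvas[position] = frame    # draw the current drawing frame

container = PlayingContainer()
for frame in ["f0", "f1", "f2"]:
    container.execute((10, 20), frame)
print(container.canvas)  # {(10, 20): 'f2'}
```

Only the latest frame survives at any position, which is the delete-then-draw behavior the claim specifies.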
8. An animation drawing apparatus for providing a graphical user interface through a terminal device, comprising:
a response module, configured to acquire, in response to drawing instructions for a plurality of target animations in the graphical user interface, animation drawing information of the plurality of target animations; wherein the animation drawing information comprises an animation position of each target animation in the graphical user interface and animation resources of each target animation;
and a drawing module, configured to asynchronously draw the plurality of target animations on the graphical user interface according to the animation drawing information of the plurality of target animations through an animation playing container comprising an asynchronous thread processing mechanism.
9. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the animation drawing method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the animation drawing method as claimed in any one of claims 1 to 7.
CN202111543512.9A 2021-12-16 2021-12-16 Animation drawing method and device, storage medium and electronic equipment Pending CN114241094A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111543512.9A CN114241094A (en) 2021-12-16 2021-12-16 Animation drawing method and device, storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN114241094A true CN114241094A (en) 2022-03-25

Family

ID=80757323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111543512.9A Pending CN114241094A (en) 2021-12-16 2021-12-16 Animation drawing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114241094A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024046284A1 (en) * 2022-09-01 2024-03-07 北京字跳网络技术有限公司 Drawing animation generation method and apparatus, and device, readable storage medium and product


Similar Documents

Publication Publication Date Title
CN109460233B (en) Method, device, terminal equipment and medium for updating native interface display of page
CN108876887B (en) Rendering method and device
US20220277481A1 (en) Panoramic video processing method and apparatus, and storage medium
CN109377554B (en) Large three-dimensional model drawing method, device, system and storage medium
CN110047121B (en) End-to-end animation generation method and device and electronic equipment
WO2021146930A1 (en) Display processing method, display processing apparatus, electronic device and storage medium
CN110555900B (en) Rendering instruction processing method and device, storage medium and electronic equipment
US11095957B2 (en) Method and apparatus for publishing information, and method and apparatus for processing information
US20190080017A1 (en) Method, system, and device that invokes a web engine
CN110047119B (en) Animation generation method and device comprising dynamic background and electronic equipment
WO2022033131A1 (en) Animation rendering method based on json data format
US20190005156A1 (en) Data flow visualization system
WO2017129105A1 (en) Graphical interface updating method and device
CN114241094A (en) Animation drawing method and device, storage medium and electronic equipment
CN112947905A (en) Picture loading method and device
WO2023202361A1 (en) Video generation method and apparatus, medium, and electronic device
CN112492399B (en) Information display method and device and electronic equipment
CN110647273B (en) Method, device, equipment and medium for self-defined typesetting and synthesizing long chart in application
CN109672931B (en) Method and apparatus for processing video frames
CN110147283B (en) Display content switching display method, device, equipment and medium
CN114237398B (en) Method and device for generating room small map based on illusion engine and storage medium
CN114489910B (en) Video conference data display method, device, equipment and medium
CN114247138B (en) Image rendering method, device and equipment and storage medium
CN108804088A (en) Protocol processes method and apparatus
CN109600558B (en) Method and apparatus for generating information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination