CN115460435A - Special effect animation playing method and device, storage medium and computer equipment - Google Patents


Info

Publication number
CN115460435A
CN115460435A (application CN202211029820.4A)
Authority
CN
China
Prior art keywords
data
special effect
animation
basic
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211029820.4A
Other languages
Chinese (zh)
Inventor
张凯
沈志铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202211029820.4A priority Critical patent/CN115460435A/en
Publication of CN115460435A publication Critical patent/CN115460435A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present application disclose a special-effect animation playing method and apparatus, a storage medium, and a computer device. The method comprises the following steps: the client receives basic animation data and special effect data sent by the server according to special effect trigger information; parses the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data; updates the target data at the corresponding field positions with the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data, thereby playing an animation with at least one special effect. With this method, the client can play a special effect animation containing at least one special effect.

Description

Special effect animation playing method and device, storage medium and computer equipment
Technical Field
The application relates to the technical field of computers, in particular to a special-effect animation playing method and device, a storage medium and computer equipment.
Background
In some live-streaming scenarios, animations are often added during the broadcast to make the stream more engaging and to increase interactivity between the broadcast and its viewers.
However, some animation resources do not support adding different types of special effects, nor do they play correctly on some devices, so they can only be used in a limited range of scenarios.
Disclosure of Invention
The embodiment of the application provides a special-effect animation playing method and device, a storage medium and computer equipment. The special effect animation playing method can play one or more special effect animations.
In a first aspect, an embodiment of the present application provides a special effect animation playing method, including:
receiving basic animation data and special effect data which are sent by a server according to the special effect trigger information;
parsing the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data;
updating the at least one type of target data to special effect data at the corresponding field position, so as to obtain special effect animation data containing at least one special effect;
and playing the special effect animation data to realize the playing of at least one special effect animation.
In a second aspect, an embodiment of the present application provides a special effect animation playback device, including:
the receiving module is used for receiving basic animation data and special effect data which are sent by the server according to the special effect trigger information;
the parsing module is used for parsing the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data;
the updating module is used for updating at least one kind of target data into special effect data in a field position corresponding to the at least one kind of target data so as to obtain special effect animation data containing at least one kind of special effect;
and the playing module is used for playing the special effect animation data so as to play at least one special effect animation.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium storing a plurality of instructions, the instructions being suitable for being loaded by a processor to perform the steps in the special-effect animation playing method provided in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements, when executing the program, the steps in the special-effect animation playing method provided in the embodiment of the present application.
In the embodiments of the present application, a client receives basic animation data and special effect data sent by a server according to special effect trigger information; parses the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data; updates the target data at the corresponding field positions with the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data, thereby playing an animation with at least one special effect. By parsing out the target data in the basic animation data and then replacing it with special effect data containing at least one special effect, a special effect animation containing at least one special effect can be played on the client.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a scene schematic diagram of special-effect animation playing provided in an embodiment of the present application.
Fig. 2 is a first flowchart of a special effect animation playing method according to an embodiment of the present application.
Fig. 3 is a second flowchart of a special-effect animation playing method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a first data structure of basic animation data provided in an embodiment of the present application.
Fig. 5 is a schematic diagram of a second data structure of basic animation data provided in an embodiment of the present application.
FIG. 6 is a schematic diagram of a special effect animation scene including multiple special effects according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a special effect animation playback device according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a computer device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
In some live-streaming scenarios, animations are often added during the broadcast to make the stream more engaging and to increase interactivity between the broadcast and its viewers.

However, some animation resources do not support adding different types of special effects, nor do they play correctly on some devices, so they can only be used in a limited range of scenarios.

For example, some animation resources do not support custom changes based on a user's information, such as adapting the animation to the user's name or usage duration, in order to generate a special effect animation specific to that user.
In order to solve the foregoing technical problem, embodiments of the present application provide a method, an apparatus, and a storage medium for playing special-effect animation. The special effect animation playing method can play one or more special effect animations.
Referring to fig. 1, fig. 1 is a scene schematic diagram of special-effect animation playing according to an embodiment of the present application.
In scenarios such as live streaming and games, corresponding special effect animations are often shown on the client to increase interactivity during the broadcast or to make the game more engaging. These special effect animations are typically delivered by the server to the client and then played locally by the client.
As shown in fig. 1, the server may be any of various servers, and the client may be an electronic device such as a mobile phone, computer, or television; the client may run on the electronic device as a web page or as an application.
In some embodiments, the server may obtain special effect trigger information corresponding to the client; for example, the client may determine the special effect trigger information locally and then send it to the server.
The server may also obtain non-private data about the user corresponding to the client, such as the user's number of followers, game play time, or achievements, and then generate the special effect trigger information corresponding to the client from this data.
The client can receive the basic animation data and the special effect data sent by the server according to the special effect trigger information. Basic animation data can be understood as animation data shared by one or more clients, into which various kinds of special effect data can be inserted. The data format of the basic animation data may be MP4.
After acquiring the basic animation data, the client can parse it to determine the field position corresponding to at least one type of target data in the basic animation data. The client then updates the target data at the corresponding field positions with the special effect data to obtain special effect animation data containing at least one special effect. Finally, the client plays the special effect animation data, thereby playing an animation containing one or more special effects locally on the client.
For more detailed understanding of the method for playing special-effect animation provided by the embodiment of the present application, please refer to fig. 2, and fig. 2 is a first flowchart of the method for playing special-effect animation provided by the embodiment of the present application. The special effect animation playing method can comprise the following steps:
110. and receiving the basic animation data and the special effect data which are sent by the server according to the special effect trigger information.
In some embodiments, the server may send the basic animation data and the special effect data to the client according to the special effect trigger information. Basic animation data can be understood as animation data shared by a plurality of clients, into which various kinds of special effect data can be inserted. The data format of the basic animation data may be MP4.

Special effect data can be understood as data that can be inserted into the basic animation data. It contains at least one special effect and may include various types of effects, such as dynamic effects, static effects, and text effects.

In some embodiments, the client may receive a Content Delivery Network (CDN) address corresponding to the basic animation data, returned by the server according to the special effect trigger information, and then obtain the basic animation data from that address.

For example, the client can download the basic animation data held on the content distribution server via the CDN address.

In some embodiments, the special effect data may include basic data, the data type of the basic data, a fill value for the special effect corresponding to the basic data, a preset insertion position corresponding to the basic data, and so on. Basic data can be understood as the element making up a special effect, such as video data, picture data, or text data. The data type is the concrete type of that element, such as video, picture, or text. The fill value may describe the fill range, direction, size, and position of the special effect; for example, it may place a special effect picture at a certain position on the client's screen background and set the picture's size.
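As a concrete illustration, the special effect data described above can be sketched as a simple record holding its four components. This is only a sketch: the patent does not publish a schema, so the field names `basic_data`, `data_type`, `fill_value` (the fill/padding value), and `insert_position` are hypothetical names for the components it lists.

```python
from dataclasses import dataclass

# Illustrative sketch only: field names are assumptions, not the patent's schema.
@dataclass
class SpecialEffectData:
    basic_data: bytes       # the effect element itself, e.g. image/video/text bytes
    data_type: str          # "video", "picture", or "text"
    fill_value: dict        # placement info: position, size, direction, etc.
    insert_position: int    # preset insertion position naming a target field

effect = SpecialEffectData(
    basic_data="Congratulations!".encode("utf-8"),
    data_type="text",
    fill_value={"x": 100, "y": 40, "width": 320, "height": 60},
    insert_position=1,
)
```

The fill value here carries the on-screen placement, matching the example of positioning and sizing a special effect picture against the client's screen background.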
120. Parse the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data.
In some implementations, after the client retrieves the base animation data, the client may parse the base animation data. Specifically, the client may perform preprocessing on the basic animation data to obtain a play task queue corresponding to the basic animation data, and then load the play task queue into a player corresponding to the basic animation data.
For example, the client may analyze the basic data to obtain data corresponding to each frame of picture, generate a play task queue according to the data corresponding to each frame of picture, and load the play task queue into the player, where the player may analyze the data corresponding to each frame of picture to determine at least one type of target data in the data corresponding to each frame of picture and a field position corresponding to the at least one type of target data.
In some embodiments, in the basic animation data, the data corresponding to each frame of picture is provided with target data and a field position corresponding to the target data. For example, the target data may be a part of all corresponding fields in a frame of picture, and then the corresponding positions of the part of fields in all fields are determined, so as to obtain the field positions corresponding to the target data.
In some embodiments, the basic animation data may contain more than one type of target data. For example, in the data corresponding to one frame, target data 1, target data 2, and target data 3 may each occupy their own fields at different field positions, so that the frame's data corresponds to three types of target data.
130. Update the at least one type of target data to special effect data at the corresponding field positions to obtain special effect animation data containing at least one special effect.
In some embodiments, the client may determine basic data and a preset insertion position corresponding to at least one special effect in the special effect data, then determine a target field position corresponding to the preset insertion position in a field position corresponding to at least one target data according to the preset insertion position, and finally insert the basic data into the target field position to obtain special effect animation data including at least one special effect.
For example, in the one-frame basic animation data, there are target data 1, target data 2, and target data 3, where the target field position corresponding to the target data 1 is number 1, the target field position corresponding to the target data 2 is number 2, and the target field position corresponding to the target data 3 is number 3.
The special effect data includes basic data 1, basic data 2, and basic data 3, a preset insertion position corresponding to the basic data 1 is an insertion position with number 1, a preset insertion position corresponding to the basic data 2 is an insertion position with number 2, and a preset insertion position corresponding to the basic data 3 is an insertion position with number 3.
The client may insert the basic data 1 into the target field position corresponding to the target data 1 to replace the target data 1, insert the basic data 2 into the target field position corresponding to the target data 2 to replace the target data 2, and insert the basic data 3 into the target field position corresponding to the target data 3 to replace the target data 3. Therefore, special effect animation data containing various special effects is obtained.
The client can also obtain a filling value and a data type in basic data corresponding to at least one special effect, and then insert the filling value and the data type into the position of the target field to obtain special effect animation data containing at least one special effect.
For example, the client may insert the fill value and data type corresponding to basic data 1 at target field position 1, thereby obtaining special effect animation data containing the special effect corresponding to basic data 1.
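The replacement in step 130 can be sketched as follows. The dictionary layout and key names are assumptions for illustration; the patent only specifies that each effect's preset insertion position selects the matching target field position, whose target data is then replaced.

```python
# Minimal sketch of step 130, assuming parsed frame data is a mapping from
# numbered field positions to target data, and each effect names the field
# position it replaces. All key names here are hypothetical.
def apply_effects(frame_fields, effects):
    """Replace target data at matching field positions with effect data."""
    updated = dict(frame_fields)
    for effect in effects:
        pos = effect["insert_position"]
        if pos in updated:  # only replace fields the base animation reserved
            updated[pos] = (effect["basic_data"], effect["data_type"],
                            effect["fill_value"])
    return updated

frame = {1: "target_1", 2: "target_2", 3: "target_3"}
effects = [
    {"insert_position": 1, "basic_data": "avatar.png",
     "data_type": "picture", "fill_value": {}},
    {"insert_position": 3, "basic_data": "sparkle.mp4",
     "data_type": "video", "fill_value": {}},
]
result = apply_effects(frame, effects)
# field 2 keeps its original target data; fields 1 and 3 now hold effect data
```

Note that a field position with no matching effect simply retains its original target data, so the same base animation serves clients that trigger different effect combinations.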
140. And playing the special effect animation data to realize the playing of at least one special effect animation.
In some embodiments, after obtaining the special effect animation data, the client may play the special effect animation data through the player, wherein the special effect animation data includes a special effect animation of at least one special effect. For example, the special effect animation may include a static picture, a dynamic video, a text, and the like.
For example, in a live-streaming scenario, after a streamer reaches a certain follower milestone, the streamer's avatar and a banner reading "Congratulations on reaching XX followers" are shown, while dynamic effects such as falling flowers or twinkling stars play on the screen.
That is, in the special effect animation data in the present application, special effect animation of one or more special effects can be realized.
In some embodiments, the special effect animation data contains multiple frames. The client may determine the start frame number and end frame number corresponding to at least one special effect in the special effect animation data, and then play that special effect according to those frame numbers. For example, if the special effect animation data contains 600 frames with a start frame of 100 and an end frame of 550, the special effects are played only from frame 100 to frame 550.
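The frame gating described here amounts to a simple inclusive range check per rendered frame; a minimal sketch (the function name is illustrative):

```python
def effect_active(frame_index, start_frame, end_frame):
    """Return True if the special effect should be rendered on this frame."""
    return start_frame <= frame_index <= end_frame

# With 600 frames total, a start frame of 100 and an end frame of 550,
# the effect is drawn only on frames 100..550 inclusive.
visible = [i for i in range(1, 601) if effect_active(i, 100, 550)]
```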
In the embodiments of the present application, a client receives basic animation data and special effect data sent by a server according to special effect trigger information; parses the basic animation data to determine at least one type of target data in the basic animation data and the field position corresponding to each type of target data; updates the target data at the corresponding field positions with the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data, thereby playing an animation with at least one special effect. By parsing out the target data in the basic animation data and then replacing it with special effect data containing at least one special effect, a special effect animation containing at least one special effect can be played on the client.
Referring to fig. 3, fig. 3 is a second flow chart illustrating a method for playing a special-effect animation according to an embodiment of the present application. The special effect animation playing method can comprise the following steps:
201. and when the special effect triggering information is determined, sending the special effect triggering information to the server.
In some embodiments, after the user satisfies a certain condition in a game or live-streaming scenario, the client determines special effect trigger information according to that condition. For example, when a streamer's follower count reaches a preset milestone, reaching that milestone serves as the trigger condition, and the client generates the special effect trigger information. Likewise, in a game scenario, when the number of enemies defeated by the user reaches a certain number, that count serves as the trigger condition.
The client sends the special effect triggering information to the server, and the server obtains the special effect triggering information.
202. Receive the basic animation data and special effect data sent by the server according to the special effect trigger information.
In some embodiments, the server may send the basic animation data and the special effect data to the client according to the special effect trigger information. Basic animation data can be understood as animation data shared by a plurality of clients, into which various kinds of special effect data can be inserted. The data format of the basic animation data may be MP4.

Special effect data can be understood as data that can be inserted into the basic animation data. It contains at least one special effect and may include various types of effects, such as dynamic effects, static effects, and text effects.

In some embodiments, the client may receive a Content Delivery Network (CDN) address corresponding to the basic animation data, returned by the server according to the special effect trigger information, and then obtain the basic animation data from that address.

For example, the client can download the basic animation data held on the content distribution server via the CDN address.

In some embodiments, the special effect data may include basic data, the data type of the basic data, a fill value for the special effect corresponding to the basic data, a preset insertion position corresponding to the basic data, and so on. Basic data can be understood as the element making up a special effect, such as video data, picture data, or text data. The data type is the concrete type of that element, such as video, picture, or text. The fill value may describe the fill range, direction, size, and position of the special effect; for example, it may place a special effect picture at a certain position on the client's screen background and set the picture's size.
203. Preprocess the basic animation data to obtain the play task queue corresponding to the basic animation data.
The client preprocesses the basic animation data, reading and parsing it to obtain the play task queue corresponding to its multiple frames.
For example, the client may analyze the basic data to obtain data corresponding to each frame of picture, and then generate the play task queue according to the data corresponding to each frame of picture.
204. Load the play task queue into the player corresponding to the basic animation data.
After obtaining the play task queue, the client loads it into the player. Each frame in the play task queue is subsequently parsed by the player.
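Steps 203 and 204 can be sketched as follows: split the base animation into per-frame tasks, queue them in playback order, and let the player consume them. The frame-splitting itself is a stand-in here, since the patent does not specify how the container is parsed; the task layout is hypothetical.

```python
from collections import deque

# Sketch of steps 203-204: build a per-frame play task queue and hand it to
# the player. Task fields ("frame", "data") are illustrative assumptions.
def build_play_queue(frames):
    """Wrap each frame's data in a play task and queue them in order."""
    queue = deque()
    for index, frame_data in enumerate(frames):
        queue.append({"frame": index, "data": frame_data})
    return queue

queue = build_play_queue(["frame0-bytes", "frame1-bytes", "frame2-bytes"])
first_task = queue.popleft()  # the player consumes tasks in playback order
```

A FIFO queue preserves frame order, which is what lets the player then parse each frame's data in sequence in step 205.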
205. Parse, via the player, the data corresponding to each frame in the play task queue to determine at least one type of target data in each frame's data and the field position corresponding to each type of target data.
In some embodiments, the data format of the basic animation data is MP4, and the at least one type of target data is stored in a special effect container within the basic animation data.
Referring to fig. 4, fig. 4 is a schematic diagram of a first data structure of basic animation data according to an embodiment of the present application.
The basic animation data may be data in MP4 format. All data in an MP4 file is contained in boxes; that is, an MP4 file is composed of a number of boxes, each with a type and a length. A box can be understood as a block of data objects. A box that contains another box is called a container box.
An MP4 file in the prior art includes several types of boxes, such as the Ftyp Box, Moov Box, Uuid Box, and Mdat Box. An MP4 file first has one and only one box of the "ftyp" type, which serves as the MP4 format flag and contains some information about the file, such as the file type. There is then one and only one box of the "moov" type, which is a container box whose sub-boxes contain the spatio-temporal information of the audiovisual data. The media data of the MP4 file is contained in boxes of the "mdat" type, which is also a container box; there may be several such boxes, or none at all.
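The box layout described above (a 4-byte big-endian size followed by a 4-byte ASCII type code) can be sketched in a few lines of Python. This is a simplified illustration: the `make_box` helper and the sample file are assumptions for the example, and 64-bit extended sizes are ignored.

```python
import struct

def make_box(box_type, payload):
    # Minimal box: 4-byte big-endian total size + 4-byte ASCII type + payload.
    return struct.pack(">I", 8 + len(payload)) + box_type.encode("ascii") + payload

def iter_boxes(data):
    # Walk the top-level boxes of an MP4 byte stream (simplified:
    # 64-bit sizes and size == 0 are not handled).
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack(">I4s", data[offset:offset + 8])
        yield box_type.decode("ascii"), offset, size
        offset += size

sample = make_box("ftyp", b"isom") + make_box("moov", b"") + make_box("mdat", b"\x00" * 4)
types = [box_type for box_type, _, _ in iter_boxes(sample)]
```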
In the present application, the special effect data needs to be stored in a separate container, so a Ccms Box (special effect container) is added to the data format of the existing MP4 file; this special effect container is used for storing the special effect data.
Referring to fig. 5, fig. 5 is a diagram illustrating a second data structure of basic animation data according to an embodiment of the present application.
The special effect container (Ccms Box) has a plurality of sub-containers, each of which can store the element of a corresponding special effect. For example, sub-container 1 (Box) may store the element of a first special effect, such as text. Sub-container 2 (Box) may store the element of a second special effect, such as a picture. Sub-container 3 (Box) may store the element of a third special effect, such as a motion video. That is, a plurality of special effect elements may be inserted into the special effect container.
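Using the same size-plus-type box convention, the nesting of sub-containers inside the special effect container can be sketched as below. The type codes `"ccms"`, `"sub1"`, `"sub2"`, and `"sub3"` are illustrative assumptions; the patent does not specify the exact four-character codes.

```python
import struct

def make_box(box_type, payload):
    # Minimal box: 4-byte big-endian total size + 4-byte ASCII type + payload.
    return struct.pack(">I", 8 + len(payload)) + box_type.encode("ascii") + payload

# Three sub-containers, one special effect element each: text, picture, video.
sub1 = make_box("sub1", b"congratulations text")
sub2 = make_box("sub2", b"<picture bytes>")
sub3 = make_box("sub3", b"<video bytes>")

# The special effect container is itself a box whose payload is the sub-boxes.
ccms = make_box("ccms", sub1 + sub2 + sub3)
```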
When the player plays, a plurality of special effect elements in the special effect container can be read, so that different special effects can be played, and the fusion playing of various special effects can be realized.
The basic animation data is provided with the special effect container, and the initial data in the special effect container is the target data. This can be understood as follows: when a developer develops the basic animation resource, the target data is stored separately in the special effect container so that it can subsequently be replaced with the special effect data, thereby achieving the purpose of playing, on the client, a special effect animation of at least one special effect.
It should be noted that the basic animation data is applicable to a plurality of users, to a plurality of electronic devices, and to a plurality of system platforms, and after the basic animation data is downloaded by a client on a corresponding computer device, corresponding special effect data can be inserted, so as to generate special effect animation data.
In some embodiments, the client may parse, by the player, the data corresponding to each frame of picture in the play task queue to determine at least one target data in the data corresponding to each frame of picture and a field position corresponding to the at least one target data.
For example, in one frame of the basic animation data, there are target data 1, target data 2, and target data 3, where the target field position corresponding to target data 1 is number 1, the target field position corresponding to target data 2 is number 2, and the target field position corresponding to target data 3 is number 3.
In some embodiments, the client may sequentially parse the fields corresponding to the basic animation data, and determine the target data and the field position corresponding to the target data according to the field marking information corresponding to the target data.
For example, the client may read each box in the basic animation data according to the field sequence, thereby determining the field corresponding to the target data, and then determining the field position corresponding to the target data.
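The sequential scan described above can be sketched as a single pass over the fields: a field carrying the target-marker information identifies target data, and its index gives the field position. The field layout and the `is_target` flag are assumptions made for this example.

```python
def find_target_fields(fields):
    # Scan fields in order; a field whose marking information flags it as a
    # placeholder is target data, and its index is the field position.
    targets = {}
    for position, field in enumerate(fields):
        if field.get("is_target"):
            targets[position] = field["name"]
    return targets

fields = [
    {"name": "header", "is_target": False},
    {"name": "target_1", "is_target": True},
    {"name": "audio", "is_target": False},
    {"name": "target_2", "is_target": True},
]
positions = find_target_fields(fields)
```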
206. And determining basic data and a preset insertion position corresponding to at least one special effect in the special effect data.
For example, the special effect data includes basic data 1, basic data 2, and basic data 3, the preset insertion position corresponding to the basic data 1 is the insertion position of number 1, the preset insertion position corresponding to the basic data 2 is the insertion position of number 2, and the preset insertion position corresponding to the basic data 3 is the insertion position of number 3.
207. And determining a target field position corresponding to the preset insertion position in the field positions corresponding to the at least one type of target data according to the preset insertion position.
For example, the client may insert basic data 1 into the target field position corresponding to target data 1 to replace target data 1, insert basic data 2 into the target field position corresponding to target data 2 to replace target data 2, and insert basic data 3 into the target field position corresponding to target data 3 to replace target data 3, thereby obtaining special effect animation data containing multiple special effects.
208. And inserting the basic data into the target field position to obtain special effect animation data containing at least one special effect.
The client can also obtain a filling value and a data type in basic data corresponding to at least one special effect, and then insert the filling value and the data type into the position of the target field to obtain special effect animation data containing at least one special effect. The data type may be a specific data type such as video, picture, text. The filling value may be a filling range, a direction, a size, a position, and the like corresponding to the special effect, for example, a special effect picture may be set at a certain position on a screen background of the client, the size of the special effect picture, and the like.
For example, the client may insert the filling range and the data type corresponding to basic data 1 into target field position 1, so as to obtain special effect animation data in which the special effect corresponding to basic data 1 is displayed within a certain filling range.
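The replacement step can be sketched as follows: each placeholder field is overwritten with the basic data, data type, and filling value whose preset insertion position matches that field. The function name and the dictionary layout of an effect are illustrative assumptions.

```python
def insert_effects(frame_fields, effects):
    # Replace each placeholder target field with the effect (basic data,
    # data type, filling value) whose preset insertion position matches it.
    result = list(frame_fields)
    for position, effect in effects.items():
        result[position] = effect
    return result

frame = ["header", "<target 1>", "<target 2>", "trailer"]
merged = insert_effects(frame, {
    1: {"data": "avatar.png", "type": "picture", "filling": {"x": 0, "y": 0}},
    2: {"data": "Congratulations!", "type": "text", "filling": {"x": 50, "y": 80}},
})
```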
209. And determining a starting frame number and an ending frame number corresponding to at least one special effect in the special effect animation data.
For example, if the special effect animation data corresponds to 600 frames, the starting frame number is the 100th frame, and the ending frame number is the 550th frame, then the at least one special effect starts playing at the 100th frame and finishes playing at the 550th frame.
210. And playing the special effect animation of at least one special effect according to the starting frame number and the ending frame number.
The client can play special effect animation data through the player and play at least one special effect animation from the starting frame number to the ending frame number.
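Gating an effect by its starting and ending frame numbers reduces to a simple range check per frame. This is a minimal sketch; the function name is an assumption.

```python
def effect_active(frame_number, start_frame, end_frame):
    # An effect plays from its starting frame number through its ending one.
    return start_frame <= frame_number <= end_frame

# 600-frame animation: the effect starts at frame 100 and ends at frame 550.
assert effect_active(100, 100, 550)
assert effect_active(550, 100, 550)
assert not effect_active(99, 100, 550)
assert not effect_active(551, 100, 550)
```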
For example, in a live streaming scene, after a user reaches a certain fan milestone, the user's avatar and a banner reading "Congratulations on reaching XX fans" are shown, and meanwhile some dynamic effects, such as "scattering flowers" and "twinkling stars", are shown on the screen.
In some embodiments, when each frame of the special effect animation data contains a plurality of special effects, the client may play a special effect animation corresponding to the plurality of special effects in each frame of the frame.
Referring to fig. 6, fig. 6 is a schematic diagram of a special effect animation scene including multiple special effects according to an embodiment of the present application.
As shown in fig. 6, a special effect containing three elements is included. Special effect A may be the avatar corresponding to the user, i.e., a static special effect. Special effect B may be a rotating little sun, i.e., a dynamic special effect. Special effect C is a congratulatory phrase corresponding to the user, namely "Congratulations to XXX on reaching 100,000 fans", i.e., a text special effect.
That is to say, in one frame of picture, the client can play three special effects, thereby realizing the fused playing of multiple special effects. Moreover, because the special effect animation data is produced from basic animation data in the MP4 format, and data in the MP4 format has good compatibility, the special effect animation data can be played on various types of electronic devices; for example, the playing of the special effect animation of multiple special effects is not affected by the hardware resolutions of the various electronic devices.
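Rendering several special effects within one frame of picture amounts to collecting every effect whose frame range covers the current frame. A minimal sketch, with illustrative effect names taken from the fig. 6 example:

```python
def effects_for_frame(frame_number, effects):
    # Collect every effect whose start/end frame range covers this frame,
    # so several effects (avatar, animation, text) render in one picture.
    return [e["name"] for e in effects
            if e["start"] <= frame_number <= e["end"]]

effects = [
    {"name": "avatar", "start": 0, "end": 599},                # static effect A
    {"name": "rotating sun", "start": 50, "end": 400},         # dynamic effect B
    {"name": "congratulation text", "start": 50, "end": 400},  # text effect C
]
active = effects_for_frame(100, effects)
```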
In the embodiment of the application, when the special effect trigger information is determined, the client sends the special effect trigger information to the server, then receives the basic animation data and the special effect data sent by the server according to the special effect trigger information, preprocesses the basic animation data to obtain the play task queue corresponding to the basic animation data, and loads the play task queue into the player corresponding to the basic animation data.
And analyzing the data corresponding to each frame of picture in the play task queue through the player to determine at least one target data in the data corresponding to each frame of picture and a field position corresponding to the at least one target data. And determining basic data and a preset insertion position corresponding to at least one special effect in the special effect data, and determining a target field position corresponding to the preset insertion position in the field position corresponding to at least one target data according to the preset insertion position.
Finally, the basic data is inserted into the target field position to obtain special effect animation data containing at least one special effect. And determining a starting frame number and an ending frame number corresponding to at least one special effect in the special effect animation data. And playing the special effect animation of at least one special effect according to the starting frame number and the ending frame number. Thereby realizing the playing of one or more special effect animations.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a special-effect animation playback device according to an embodiment of the present application. The special effect animation playback apparatus 300 may include:
the receiving module 310 is configured to receive basic animation data and special effect data sent by the server according to the special effect trigger information.
The receiving module 310 is further configured to receive a content distribution network address corresponding to the basic animation data returned by the server according to the special effect trigger information;
and acquiring basic animation data according to the content distribution network address.
The parsing module 320 is configured to parse the basic animation data to determine a field position corresponding to at least one target data in the basic animation data.
The analysis module 320 is further configured to determine basic data and a preset insertion position corresponding to at least one special effect from the special effect data;
determining a target field position corresponding to a preset insertion position in the field positions corresponding to at least one type of target data according to the preset insertion position;
and inserting the basic data into the target field position to obtain special effect animation data containing at least one special effect.
The parsing module 320 is further configured to obtain a padding value and a data type in the basic data corresponding to at least one special effect;
and inserting the filling value and the data type into the position of the target field to obtain special effect animation data containing at least one special effect.
An updating module 330, configured to update at least one target data into special effect data in a field position corresponding to the at least one target data, so as to obtain special effect animation data including at least one special effect.
The updating module 330 is further configured to, before analyzing the basic animation data to determine a field position corresponding to at least one target data in the basic animation data, preprocess the basic animation data to obtain a play task queue corresponding to the basic animation data; and loading the play task queue to a player corresponding to the basic animation data.
The updating module 330 is further configured to analyze, by the player, the data corresponding to each frame of picture in the play task queue to determine at least one target data in the data corresponding to each frame of picture and a field position corresponding to the at least one target data.
The updating module 330 is further configured to sequentially analyze the fields corresponding to the basic animation data, and determine the target data and the field position corresponding to the target data according to the field mark information corresponding to the target data.
The playing module 340 is configured to play the special effect animation data to implement playing of at least one special effect animation.
The playing module 340 is further configured to determine a starting frame number and an ending frame number corresponding to at least one special effect in the special-effect animation data;
and playing the special effect animation of at least one special effect according to the starting frame number and the ending frame number.
The playing module 340 is further configured to play a special effect animation corresponding to a plurality of special effects in each frame of picture when each frame of picture in the special effect animation data contains the plurality of special effects.
In the embodiment of the present application, the client receives the basic animation data and the special effect data sent by the server according to the special effect trigger information; analyzes the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data; then, at the field position corresponding to the at least one target data, updates the at least one target data into the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data to realize the playing of at least one special effect animation. In the embodiment of the present application, the target data in the basic animation data is parsed out, and the special effect data containing at least one special effect then replaces the target data, so that the special effect animation containing the at least one special effect is played on the client.
Correspondingly, an embodiment of the present application further provides a computer device, where the computer device may be a terminal or a server, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like. As shown in fig. 8, fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer apparatus 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and operable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device configuration illustrated in the figure does not constitute a limitation of the computer device, which may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components may be used.
The processor 401 is a control center of the computer apparatus 400, connects various parts of the entire computer apparatus 400 using various interfaces and lines, performs various functions of the computer apparatus 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby integrally monitoring the computer apparatus 400.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
receiving basic animation data and special effect data which are sent by a server according to the special effect trigger information;
analyzing the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data;
updating at least one target data into special effect data in a field position corresponding to at least one target data to obtain special effect animation data containing at least one special effect;
and playing the special effect animation data to realize the playing of at least one special effect animation.
The processor 401 is further configured to perform:
receiving a content distribution network address corresponding to basic animation data returned by the server according to the special effect trigger information;
and acquiring basic animation data according to the content distribution network address.
The processor 401 is further configured to perform:
determining basic data and a preset insertion position corresponding to at least one special effect in the special effect data;
determining a target field position corresponding to a preset insertion position in field positions corresponding to at least one type of target data according to the preset insertion position;
and inserting the basic data into the target field position to obtain special effect animation data containing at least one special effect.
The processor 401 is further configured to perform:
acquiring a filling value and a data type in basic data corresponding to at least one special effect;
and inserting the filling value and the data type into the position of the target field to obtain special effect animation data containing at least one special effect.
The processor 401 is further configured to perform:
before analyzing the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data, preprocessing the basic animation data to obtain the play task queue corresponding to the basic animation data;
and loading the play task queue to a player corresponding to the basic animation data.
The processor 401 is further configured to perform:
analyzing the data corresponding to each frame of picture in the play task queue through the player to determine at least one target data in the data corresponding to each frame of picture and a field position corresponding to the at least one target data.
The processor 401 is further configured to perform:
and sequentially analyzing the fields corresponding to the basic animation data, and determining the target data and the field positions corresponding to the target data according to the field mark information corresponding to the target data.
The processor 401 is further configured to perform:
determining a starting frame number and an ending frame number corresponding to at least one special effect in special effect animation data;
and playing the special effect animation of at least one special effect according to the starting frame number and the ending frame number.
The processor 401 is further configured to perform:
when each frame of picture in the special effect animation data contains a plurality of special effects, the special effect animation corresponding to the plurality of special effects is played in each frame of picture.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 8, the computer device 400 further includes: touch-sensitive display screen 403, radio frequency circuit 404, audio circuit 405, input unit 406 and power 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power source 407. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 8 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 403 can be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, according to which corresponding programs are executed. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401, and can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the touch panel transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display screen 403 may also serve as a part of the input unit 406 to implement an input function.
The rf circuit 404 may be used for transceiving rf signals to establish wireless communication with a network device or other computer device via wireless communication, and for transceiving signals with the network device or other computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. The audio circuit 405 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data; the audio data is then processed by the processor 401 and sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earbud jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Optionally, the power source 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, and the like through the power management system. The power supply 407 may also include one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, or any other component.
Although not shown in fig. 8, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
As can be seen from the above, in the embodiment of the present application, the client receives the basic animation data and the special effect data sent by the server according to the special effect trigger information; analyzes the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data; then, at the field position corresponding to the at least one target data, updates the at least one target data into the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data to realize the playing of at least one special effect animation. In the embodiment of the present application, the target data in the basic animation data is parsed out, and the special effect data containing at least one special effect then replaces the target data, so that the special effect animation containing the at least one special effect is played on the client.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of computer programs are stored, where the computer programs can be loaded by a processor to execute steps in any one of the special effect animation playing methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
receiving basic animation data and special effect data which are sent by a server according to the special effect trigger information;
analyzing the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data;
updating at least one target data into special effect data in a field position corresponding to at least one target data to obtain special effect animation data containing at least one special effect;
and playing the special effect animation data to realize the playing of at least one special effect animation.
The computer program may further perform:
receiving a content distribution network address corresponding to basic animation data returned by the server according to the special effect trigger information;
and acquiring basic animation data according to the content distribution network address.
The computer program may further perform:
determining basic data and a preset insertion position corresponding to at least one special effect in the special effect data;
determining a target field position corresponding to a preset insertion position in field positions corresponding to at least one type of target data according to the preset insertion position;
and inserting the basic data into the target field position to obtain special effect animation data containing at least one special effect.
The computer program may further perform:
acquiring a filling value and a data type in basic data corresponding to at least one special effect;
and inserting the filling value and the data type into the position of the target field to obtain special effect animation data containing at least one special effect.
The computer program may further execute:
before analyzing the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data, preprocessing the basic animation data to obtain the play task queue corresponding to the basic animation data;
and loading the play task queue to a player corresponding to the basic animation data.
The computer program may further execute:
analyzing the data corresponding to each frame of picture in the play task queue through the player to determine at least one target data in the data corresponding to each frame of picture and a field position corresponding to the at least one target data.
The computer program may further execute:
and sequentially analyzing the fields corresponding to the basic animation data, and determining the target data and the field positions corresponding to the target data according to the field mark information corresponding to the target data.
The computer program may further perform:
determining a starting frame number and an ending frame number corresponding to at least one special effect in special effect animation data;
and playing the special effect animation of at least one special effect according to the starting frame number and the ending frame number.
The computer program may further perform:
when each frame of picture in the special effect animation data contains a plurality of special effects, the special effect animation corresponding to the plurality of special effects is played in each frame of picture.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
In the embodiment of the present application, the client receives the basic animation data and the special effect data sent by the server according to the special effect trigger information; analyzes the basic animation data to determine at least one target data in the basic animation data and a field position corresponding to the at least one target data; then, at the field position corresponding to the at least one target data, updates the at least one target data into the special effect data to obtain special effect animation data containing at least one special effect; and plays the special effect animation data to realize the playing of at least one special effect animation. In the embodiment of the present application, the target data in the basic animation data is parsed out, and the special effect data containing at least one special effect then replaces the target data, so that the special effect animation containing the at least one special effect is played on the client.
The storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or the like.
Since the computer program stored in the storage medium can execute the steps of any special effect animation playing method provided in the embodiments of the present application, it can achieve the beneficial effects of any such method, which are detailed in the foregoing embodiments and are not repeated here.
The special effect animation playing method and apparatus, storage medium, and computer device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the embodiments is intended only to help understand the method and its core idea. Meanwhile, those skilled in the art may, following the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.
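Claim 10 below places the target data in a special effect container of MP4-format basic animation data. As a concrete illustration, the following sketch scans the top-level boxes of an MP4 byte stream (4-byte big-endian size followed by a 4-byte type) to locate such a container; the box name `vapc` and the JSON payload are hypothetical examples, not specified by this application.

```python
import struct

def find_box(mp4_bytes: bytes, box_type: bytes):
    """Scan top-level MP4 boxes and return the payload of the first box whose
    type matches, or None if no such box exists."""
    offset = 0
    while offset + 8 <= len(mp4_bytes):
        size, btype = struct.unpack_from(">I4s", mp4_bytes, offset)
        if size < 8:  # malformed or 64-bit-size box; out of scope for this sketch
            break
        if btype == box_type:
            return mp4_bytes[offset + 8: offset + size]
        offset += size
    return None

# Toy stream: an empty 'free' box followed by a hypothetical effect container.
payload = b'{"effects": []}'
stream = (struct.pack(">I4s", 8, b"free")
          + struct.pack(">I4s", 8 + len(payload), b"vapc") + payload)
```

A real implementation would also handle 64-bit box sizes and nested boxes, but the size/type framing shown is the standard ISO base media file format layout.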

Claims (13)

1. A special effect animation playing method, characterized by comprising:
receiving basic animation data and special effect data sent by a server according to special effect trigger information;
parsing the basic animation data to determine at least one type of target data in the basic animation data and a field position corresponding to the at least one type of target data;
updating the at least one type of target data to the special effect data at the field position corresponding to the at least one type of target data, to obtain special effect animation data containing at least one special effect; and
playing the special effect animation data, so as to play the special effect animation of the at least one special effect.
2. The special effect animation playing method according to claim 1, wherein the receiving basic animation data sent by a server according to special effect trigger information comprises:
receiving a content distribution network address, corresponding to the basic animation data, returned by the server according to the special effect trigger information; and
acquiring the basic animation data according to the content distribution network address.
3. The special effect animation playing method according to claim 1, wherein the updating the at least one type of target data to the special effect data at the field position corresponding to the at least one type of target data, to obtain special effect animation data containing at least one special effect, comprises:
determining, in the special effect data, basic data and a preset insertion position corresponding to the at least one special effect;
determining, according to the preset insertion position, a target field position corresponding to the preset insertion position among the field positions corresponding to the at least one type of target data; and
inserting the basic data into the target field position to obtain the special effect animation data containing the at least one special effect.
4. The special effect animation playing method according to claim 3, wherein the inserting the basic data into the target field position to obtain the special effect animation data containing the at least one special effect comprises:
acquiring a fill value and a data type in the basic data corresponding to the at least one special effect; and
inserting the fill value and the data type into the target field position to obtain the special effect animation data containing the at least one special effect.
5. The special effect animation playing method according to claim 1, wherein before the parsing the basic animation data to determine at least one type of target data in the basic animation data and a field position corresponding to the at least one type of target data, the method further comprises:
preprocessing the basic animation data to obtain a play task queue corresponding to the basic animation data; and
loading the play task queue into a player corresponding to the basic animation data.
6. The special effect animation playing method according to claim 5, wherein the parsing the basic animation data to determine at least one type of target data in the basic animation data and a field position corresponding to the at least one type of target data comprises:
parsing, by the player, data corresponding to each frame of picture in the play task queue to determine at least one type of target data in the data corresponding to each frame of picture and a field position corresponding to the at least one type of target data.
7. The special effect animation playing method according to claim 1, wherein the parsing the basic animation data to determine at least one type of target data in the basic animation data and a field position corresponding to the at least one type of target data comprises:
sequentially parsing fields of the basic animation data, and determining the target data and the field position corresponding to the target data according to field mark information corresponding to the target data.
8. The special effect animation playing method according to any one of claims 1 to 7, wherein the playing the special effect animation data, so as to play the special effect animation of the at least one special effect, comprises:
determining a starting frame number and an ending frame number corresponding to the at least one special effect in the special effect animation data; and
playing the special effect animation of the at least one special effect according to the starting frame number and the ending frame number.
9. The special effect animation playing method according to any one of claims 1 to 7, wherein the playing the special effect animation data, so as to play the special effect animation of the at least one special effect, comprises:
when each frame of picture in the special effect animation data contains a plurality of special effects, playing the special effect animations corresponding to the plurality of special effects in each frame of picture.
10. The special effect animation playing method according to any one of claims 1 to 7, wherein a data format of the basic animation data is the MP4 format, and the at least one type of target data is provided in a special effect container of the basic animation data.
11. A special effect animation playing apparatus, characterized by comprising:
a receiving module, configured to receive basic animation data and special effect data sent by a server according to special effect trigger information;
a parsing module, configured to parse the basic animation data to determine at least one type of target data in the basic animation data and a field position corresponding to the at least one type of target data;
an updating module, configured to update the at least one type of target data to the special effect data at the field position corresponding to the at least one type of target data, to obtain special effect animation data containing at least one special effect; and
a playing module, configured to play the special effect animation data, so as to play the special effect animation of the at least one special effect.
12. A computer-readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the special effect animation playing method according to any one of claims 1 to 10.
13. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the special effect animation playing method according to any one of claims 1 to 10.
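The insertion described in claims 3 and 4 — writing a fill value and a data type into a target field position of the animation data — can be sketched at the byte level as follows. The record layout (a 1-byte type tag followed by a 4-byte big-endian value) is an illustrative assumption; the claims do not fix a particular encoding.

```python
import struct

def insert_effect_record(base_stream: bytes, target_pos: int,
                         data_type: int, fill_value: int) -> bytes:
    """Insert a (data type, fill value) record at the target field position of
    the base animation byte stream, returning the patched stream."""
    record = struct.pack(">BI", data_type, fill_value)  # 1-byte type, 4-byte value
    return base_stream[:target_pos] + record + base_stream[target_pos:]

# Insert a hypothetical type-7 record into a 4-byte toy stream at offset 2.
patched = insert_effect_record(b"\x00\x01\x02\x03", target_pos=2,
                               data_type=7, fill_value=0xABCD)
```

The bytes before and after the target field position are preserved unchanged; only the new record is spliced in between them.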
CN202211029820.4A 2022-08-25 2022-08-25 Special effect animation playing method and device, storage medium and computer equipment Pending CN115460435A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211029820.4A CN115460435A (en) 2022-08-25 2022-08-25 Special effect animation playing method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211029820.4A CN115460435A (en) 2022-08-25 2022-08-25 Special effect animation playing method and device, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
CN115460435A true CN115460435A (en) 2022-12-09

Family

ID=84299978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211029820.4A Pending CN115460435A (en) 2022-08-25 2022-08-25 Special effect animation playing method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN115460435A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108093307A (en) * 2017-12-29 2018-05-29 广州酷狗计算机科技有限公司 Obtain the method and system of played file
CN111277854A (en) * 2020-03-04 2020-06-12 网易(杭州)网络有限公司 Display method and device of virtual live broadcast room, electronic equipment and storage medium
CN112308947A (en) * 2019-07-25 2021-02-02 腾讯科技(深圳)有限公司 Animation generation method and device and storage medium
CN113254393A (en) * 2021-04-07 2021-08-13 互影科技(北京)有限公司 Interactive video packaging method and device and electronic equipment
CN113485617A (en) * 2021-07-02 2021-10-08 广州博冠信息科技有限公司 Animation display method and device, electronic equipment and storage medium


Non-Patent Citations (2)

Title
WELKIN XIE: "Probably the first of its kind on the web — reading the source code of the VAP video animation framework on iOS", pages 1 - 16, Retrieved from the Internet <URL:https://welkinx.com/2021/07/03/vap/> *
XU HONGBIN: "YYEVA — a transparent MP4 player supporting dynamic elements", Retrieved from the Internet <URL:https://zhuanlan.zhihu.com/p/552081291?utm_id=0> *

Similar Documents

Publication Publication Date Title
US20170161931A1 (en) Adapting content to augmented reality virtual objects
WO2020010819A1 (en) Live broadcast room-based data interaction method and device, terminal, and storage medium
CN110795056B (en) Method, device, terminal and storage medium for adjusting display parameters
CN111408136A (en) Game interaction control method, device and storage medium
CN113076048B (en) Video display method and device, electronic equipment and storage medium
CN112995759A (en) Interactive service processing method, system, device, equipment and storage medium
US20180373736A1 (en) Method and apparatus for storing resource and electronic device
CN103975313A (en) Information processing system, electronic device, image file reproduction method and generation method
CN112969093B (en) Interactive service processing method, device, equipment and storage medium
CN111240777A (en) Dynamic wallpaper generation method and device, storage medium and electronic equipment
CN113485617A (en) Animation display method and device, electronic equipment and storage medium
CN114816308B (en) Information partition display method and related equipment
CN112911320A (en) Live broadcast method and device, computer equipment and storage medium
CN104808975A (en) Filter realizing method and device of camera application
CN111209065A (en) Dynamic wallpaper setting method and device, storage medium and electronic equipment
CN109635131B (en) Multimedia content list display method, pushing method, device and storage medium
CN111818279A (en) Subtitle generating method, display method and interaction method
CN114095793A (en) Video playing method and device, computer equipment and storage medium
CN112181442B (en) Unloading page display method, device, terminal, server and storage medium
CN112732250A (en) Interface processing method, device and storage medium
CN115460435A (en) Special effect animation playing method and device, storage medium and computer equipment
CN114640876B (en) Multimedia service video display method, device, computer equipment and storage medium
CN115225971A (en) Video progress adjusting method and device, computer equipment and storage medium
CN112188268B (en) Virtual scene display method, virtual scene introduction video generation method and device
CN114816600B (en) Session message display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination