CN115272536A - Animation playing method and device and electronic equipment - Google Patents

Animation playing method and device and electronic equipment

Info

Publication number
CN115272536A
Authority
CN
China
Prior art keywords
playing
file
animation
information
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211169659.0A
Other languages
Chinese (zh)
Inventor
张昱辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Leyuyou Network Technology Co ltd
Original Assignee
Shenzhen Leyuyou Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Leyuyou Network Technology Co ltd filed Critical Shenzhen Leyuyou Network Technology Co ltd
Priority to CN202211169659.0A priority Critical patent/CN115272536A/en
Publication of CN115272536A publication Critical patent/CN115272536A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an animation playing method and apparatus, and an electronic device. First, a target animation file and a pre-constructed project file associated with it are acquired; the target animation file comprises a plurality of animation original files. The project file is then parsed to obtain first playing information, which represents the playing information of the target animation file. Finally, the target animation file is played according to the first playing information, based on the plurality of animation original files. The method plays multiple animation original files of different formats in an orderly way according to the pre-constructed project file, and allows pictures or identifiers to be added when the target animation file is played by modifying the first playing information in the project file rather than the animation original files themselves, avoiding the time spent re-recording the animation original files and reducing time cost.

Description

Animation playing method and device and electronic equipment
Technical Field
The present application relates to the field of animation technologies, and in particular, to an animation playing method and apparatus, and an electronic device.
Background
At present, when playing animation in a player, files in multiple formats such as svga, gif, and mp4 are often used. A conventional solution is to combine multiple decoders in the player, detect the format of each file to be played, and play it with the corresponding decoder. However, if picture identifiers need to be added to existing playing files, the videos must be re-recorded, which makes production time excessively long.
Disclosure of Invention
In view of the above, an object of the present application is to provide a method and an apparatus for playing an animation, and an electronic device.
Based on the above purpose, the present application provides an animation playing method, including:
acquiring a target animation file and a pre-constructed project file associated with the target animation file; the target animation file comprises a plurality of animation original files;
parsing the project file to obtain first playing information; the first playing information is used for representing the playing information of the target animation file;
and playing the target animation file according to the first playing information based on the plurality of animation original files.
Optionally, the construction of the project file includes:
acquiring attribute information of a plurality of animation original files;
determining the first playing information based on the attribute information and the association information among the plurality of animation original files;
and recording the first playing information to generate the project file.
Optionally, the attribute information at least includes a name, a call path, a duration, and an occupied memory size of the animation original file.
Optionally, the first playing information includes a total number of playing layers, a total playing time length, and a plurality of second playing information of the target animation file; the second playing information is used for representing the playing information of each animation original file.
Optionally, the second playing information at least includes a playing start time, a playing end time, the playing layer where the animation original file is located during playing, a playing position in the playing layer, a playing size, and a display state.
Optionally, the total playing duration includes a total playing frame number.
Optionally, the recording the first playing information and generating the project file include:
recording the first playing information in a json format;
and carrying out binary conversion on the first playing information to generate the project file.
Optionally, the playing the target animation file according to the first playing information based on the plurality of animation original files includes:
calling the animation original file based on the calling path of each animation original file;
and decoding the animation original file through a decoder matched with the animation original file, and playing the target animation file according to the first playing information.
Based on the same inventive concept, the present disclosure also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable by the processor, wherein the processor implements the method as described above when executing the computer program.
As can be seen from the above, in the animation playing method and apparatus and the electronic device provided by the application, a target animation file and a pre-constructed project file associated with it are first acquired, the target animation file comprising a plurality of animation original files; the project file is then parsed to obtain first playing information, which represents the playing information of the target animation file; finally, the target animation file is played according to the first playing information based on the plurality of animation original files. The method plays multiple animation original files of different formats in an orderly way according to the pre-constructed project file, and allows pictures or identifiers to be added when the target animation file is played by modifying the first playing information in the project file rather than the animation original files themselves, avoiding the time spent re-recording the animation original files and reducing time cost. In addition, the method handles the playing of the target animation file flexibly, making it convenient for a user to adjust the playing order of the animation original files while adding or deleting picture and text information.
Drawings
To illustrate the technical solutions of the present application or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an animation playing method according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a project file construction method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an animation playback device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described in detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that technical terms or scientific terms used in the embodiments of the present application should have a general meaning as understood by those having ordinary skill in the art to which the present application belongs, unless otherwise defined. The use of "first," "second," and similar terms in the embodiments of the present application is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
As described in the background, an existing player is usually configured with only audio or subtitles when playing a video and does not support displaying additional pictures or vector graphics; if a user wants to add explanatory identifiers to a video, the video must be recorded again, which is inconvenient and time-consuming. In view of this, the present application provides an animation playing method that effectively solves these problems: a user can add marker symbols or explanatory text over a video without changing the original video, giving the user a flexible and convenient way to achieve the expected playing effect.
Embodiments of the present application are described in detail below with reference to the accompanying drawings.
The application provides an animation playing method which, with reference to fig. 1, comprises the following steps:
Step 102, acquiring a target animation file and a pre-constructed project file associated with the target animation file; the target animation file includes a plurality of animation original files.
The target animation file is the animation file to be played, and it comprises a plurality of animation original files. An animation original file may be a picture, a video, or text, and the formats of multiple videos may be the same or different. When the target animation file is played, all of the animation original files it contains are played. The project file associated with the target animation file is obtained at the same time as the target animation file; in this embodiment the project file takes the form of a table that records the related information of the target animation file, so that the target animation file can be played according to that information.
It should be noted that when the player acquires the target animation file and the project file, it actually receives a packed file containing both, in zip format. After acquiring the packed file, the player decompresses it to obtain the target animation file and the project file.
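The unpacking step described above can be sketched as follows. This is an illustration only: the name `project.json` and the zip layout are assumptions, not taken from the patent.

```python
import zipfile

def unpack_animation_package(package_path: str, out_dir: str) -> dict:
    """Decompress a zip package containing the animation original files
    plus the associated project file, and separate the two."""
    with zipfile.ZipFile(package_path) as zf:
        zf.extractall(out_dir)
        names = zf.namelist()
    # The project file is identified by an assumed naming convention.
    project = [n for n in names if n.endswith("project.json")]
    originals = [n for n in names if n not in project]
    return {"project_file": project[0] if project else None,
            "originals": originals}
```

After this call, the player would hand the project file to its file decoder and keep the originals for the per-format decoders.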
Step 104, analyzing the project file to obtain first playing information; the first playing information is used for representing the playing information of the target animation file.
Specifically, the project file stores the related information of the target animation file in table form and is parsed by a corresponding decoder to obtain the first playing information. The decoder may be a composite decoder that can both parse the project file and decode the animation original files. The first playing information records in detail the playing duration of the target animation file and the playing time and playing position of each animation original file.
Step 106, playing the target animation file according to the first playing information based on the plurality of animation original files.
Playing the target animation file according to the first playing information means playing the animation original files one by one, or playing multiple animation original files simultaneously, i.e., with overlapping playing durations. The specific playing mode can be written into the first playing information, and the target animation file is played accordingly to achieve the playing effect the user expects.
In some embodiments, referring to fig. 2, the construction of the project file includes the following steps:
step 202, obtaining attribute information of a plurality of animation original files.
Specifically, the attribute information at least includes the name, call path, duration, and occupied memory size of the animation original file. The name uniquely identifies the animation original file and distinguishes it from the others; the call path records the storage location of the animation original file, through which it can be called; the duration represents the total playing duration of the animation original file, for example 10 s or 1 min; the occupied memory size represents the space occupied by the animation original file, for example 300 kB or 2 MB. By acquiring the attribute information of the animation original files, the playing information of each animation original file when the target animation file is played can be further determined.
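The four attributes listed above can be modeled as a simple record; the field names below are illustrative, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class OriginalFileAttributes:
    """Attribute information recorded for each animation original file."""
    name: str          # unique identifier among the originals
    call_path: str     # storage location used to load the file
    duration_s: float  # total playing duration, e.g. 10 s or 1 min
    size_bytes: int    # occupied memory size, e.g. 300 kB or 2 MB

attrs = OriginalFileAttributes("intro", "assets/intro.mp4", 10.0, 300 * 1024)
```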
Step 204, determining the first playing information based on the attribute information and the association information among the plurality of animation original files.
Specifically, the attribute information of the plurality of animation original files is obtained through step 202, and the first playing information is determined by combining the association information between the animation original files. The association information in this embodiment represents the playing logic relationships among the animation original files, for example the sequential order during playing, the front-back occlusion relationships required during playing, and the identification associations between animation original files.
And step 206, recording the first playing information and generating the project file.
After the association relationships of the plurality of animation original files are analyzed, the first playing information of the target animation file is determined; the attribute information of the animation original files and the playing information of each of them can be recorded in the first playing information. The first playing information is digitized and recorded in a table to generate the project file. It should be noted that each animation original file can be numbered with a unique id that identifies it, and each id is associated with its corresponding playing information in the project file, so that the project file can be parsed quickly later.
In some embodiments, the first playing information includes a total number of playing layers, a total playing time length, and a plurality of second playing information of the target animation file; the second playing information is used for representing the playing information of each animation original file.
Specifically, the target animation file is divided into a plurality of playing layers during playback. The playing layers represent the occlusion relationships of different animation original files: the first playing layer is the bottom layer, with the second, third, and subsequent layers stacked from bottom to top, so the second playing layer can occlude the first and the third can occlude the second. This embodiment does not limit the total number of playing layers, which can be set according to actual playing requirements. Each playing layer corresponds to the playing of one animation original file, and an upper playing layer can cover a lower one; if a text identifier needs to be added on top of a video, the identifier can be placed in a playing layer above the layer in which the video is played.
The total playing duration represents the total playing time of the target animation file and integrates the playing durations of the animation original files; each animation original file can be set to play once or in a loop, and the total playing duration is the sum of the durations after all animation original files are played according to the preset playing rules. Once the total number of playing layers and the total playing duration are determined, the second playing information of each animation original file must also be determined, that is, on which playing layer each animation original file plays, its playing order, and so on. After the first playing information is determined, the target animation file can be played according to the first playing information.
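The bottom-to-top occlusion rule above amounts to painting layers in ascending order, so that a later layer draws over an earlier one. A minimal sketch, with assumed names:

```python
def composite_order(layer_of):
    """Return animation-original ids in paint order: layer 1 (bottom)
    first, so items painted later occlude those painted earlier.
    `layer_of` maps an id to its playing layer number."""
    return [fid for fid, layer in sorted(layer_of.items(), key=lambda kv: kv[1])]

# A caption on layer 3 is drawn over a logo on layer 2 and a video on layer 1.
order = composite_order({"caption": 3, "video": 1, "logo": 2})
```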
In some embodiments, the second playing information at least includes a playing start time, a playing end time, the playing layer where the animation original file is played, a playing position in the playing layer, a playing size, and a display state of the animation original file.
Specifically, each piece of second playing information corresponds to the playing information of one animation original file and includes its playing start time and playing end time in the target animation file; for example, if the total duration of the target animation file is 50 min, a given animation original file may start playing at minute 20 and stop at minute 40. The playing layer of each animation original file must also be determined; for example, the target animation file may have 5 playing layers in total, with the animation original file played on the 3rd. Within that layer, the playing position of the animation original file can be set, i.e., the specific playing area it occupies, such as the upper-left or lower-right corner of the playing layer; when the position needs to be located precisely, a coordinate system can be established on the playing layer and the coordinate values of the animation original file determined in it. In addition, the playing size can be determined during playback: the animation original file can be enlarged or reduced relative to its original size, with the scaling chosen according to the user's actual requirements.
The display state mainly concerns animation original files that are pictures or identifiers. For example, if a line segment is drawn at a certain position in a playing layer, its color, thickness, and line type constitute its display state; if a line of text is displayed in a certain area of a playing layer, its font, color, and font size constitute its display state. The user can constrain the display state of an animation original file according to the actual identification requirements, and adding or modifying it is flexible and convenient.
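Taken together, the second playing information fields described above can be modeled as one record per animation original file. The field names and the free-form `display_state` dict are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class SecondPlayingInfo:
    """Per-original playing record: timing, layer, placement, and state."""
    start_time_s: float   # e.g. minute 20 of a 50-minute target file
    end_time_s: float     # e.g. minute 40
    layer: int            # e.g. layer 3 of 5 (higher layers occlude lower)
    position: tuple       # (x, y) in a layer-local coordinate system
    scale: float          # 1.0 = original size; enlarge or reduce as needed
    display_state: dict = field(default_factory=dict)  # color, font, line type...

caption = SecondPlayingInfo(0.0, 100.0, 2, (10, 10), 1.0,
                            {"font": "sans", "color": "red", "size": 14})
```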
In some embodiments, the total duration of play comprises a total number of frames played.
In this embodiment, the target animation file is played frame by frame, and the total number of played frames of the target animation file is determined so as to determine the starting and ending frame numbers, i.e., the playing interval, of each animation original file. When the attribute information of an animation original file is acquired, its duration is obtained, which also gives its playing frame count; from this, the playing start and stop frame numbers of each animation original file within the target animation file are determined. For example, the playing interval of a given animation original file may run from frame 2 to frame 50.
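Frame-based scheduling reduces to an interval test: at each frame, play every original whose interval contains that frame. A sketch under assumed names:

```python
def active_originals(schedule, frame):
    """Given each original's (start_frame, end_frame) interval, return the
    ids that should be on screen at `frame` (bounds inclusive)."""
    return [fid for fid, (start, end) in schedule.items()
            if start <= frame <= end]

# The example above: one original playing from frame 2 to frame 50.
schedule = {"clip": (2, 50), "title": (1, 100)}
```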
In some embodiments, the recording the first playing information and generating the project file includes: recording the first playing information in a json format; and carrying out binary conversion on the first playing information to generate the project file.
Specifically, the project file may be generated by describing each playing layer: taking each playing layer, or the animation original file played on it, as an id, the information related to each id is written in the project file table in the same row as the id; that is, each row of data in the project file corresponds to one playing layer or one animation original file. The second playing information of the animation original file, such as its playing position, playing start time, and playing end time, is written in the same row, and the total playing duration of the target animation file and the total number of playing layers are written at the start of the project file. This combined information, i.e., the first playing information, is described in json format and recorded in the project file. To reduce the memory the project file occupies, the first playing information is then converted to binary; some parameter types can be customized during the conversion, for example packing fields in the type-duration-id order of the animation original file. The binary conversion reduces the size of the project file and allows it to be parsed quickly.
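The binary conversion step can be sketched with a fixed-width packing in the type-duration-id order mentioned above. The field widths (1-byte type, 4-byte duration in frames, 2-byte id) are assumptions for illustration; the patent does not specify them.

```python
import json
import struct

def to_binary(record: dict) -> bytes:
    """Pack one per-layer record as little-endian type-duration-id."""
    return struct.pack("<BIH", record["type"], record["duration"], record["id"])

# A json row from the project file, converted to its compact binary form.
record = json.loads('{"type": 1, "duration": 100, "id": 7}')
blob = to_binary(record)
```

The fixed layout is what lets the player parse the project file quickly: each record sits at a known offset with no json parsing at play time.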
The following is an example of the information written in a specific project file. According to this project file, the target animation file has 3 playing layers in total and a total playing duration of 100 frames. A video is displayed in a certain area of the first playing layer, starting at frame 10 and ending at frame 50; a piece of text, with specified font, color, and font size, is displayed in a certain area of the second playing layer, starting at frame 1 and ending at frame 100; and on the third playing layer two points are defined and a straight line of specified color, line width, and line type is drawn between them, starting at frame 1 and ending at frame 100.
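Under assumed key names (the example above specifies the content, not a schema), that project file could be written in json, before binary conversion, as:

```python
import json

# Key names are illustrative; only the values come from the worked example.
project = {
    "total_layers": 3,
    "total_frames": 100,
    "layers": [
        {"id": 1, "type": "video", "region": [0, 0, 640, 360],
         "start_frame": 10, "end_frame": 50},
        {"id": 2, "type": "text", "region": [20, 380, 600, 40],
         "font": "sans", "color": "white", "font_size": 18,
         "start_frame": 1, "end_frame": 100},
        {"id": 3, "type": "line", "points": [[0, 420], [640, 420]],
         "color": "red", "width": 2, "line_type": "solid",
         "start_frame": 1, "end_frame": 100},
    ],
}
encoded = json.dumps(project)
```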
In some embodiments, the playing the target animation file according to the first playing information based on the plurality of animation primitive files includes: calling the animation original file based on the calling path of each animation original file; decoding the animation original file through a decoder matched with the animation original file, and playing the target animation file according to the first playing information.
Specifically, when the target animation file is played according to the first playing information obtained by parsing the project file, each animation original file must be called and decoded, and different types of animation original files require corresponding decoders; for example, the decoders for video, audio, or pictures can be existing open-source decoders. The call path of each animation original file is stored in the first playing information, through which the animation original file can be obtained and called.
In this embodiment, a composite player decodes and parses the target animation file and the project file. The composite player integrates multiple open-source decoders, which can decode video, audio, or pictures, and also integrates a file decoder that can parse the project file. After the composite player obtains the target animation file and the project file, it parses the first playing information through the file decoder and, through the call paths in the first playing information, calls each animation original file in turn to decode and play it.
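The decoder-dispatch part of the composite player can be sketched as a registry keyed by file extension. The decoders here are stand-in callables, not real codec bindings:

```python
import os

class CompositePlayer:
    """Route each animation original file to a matching decoder."""
    def __init__(self):
        self._decoders = {}

    def register(self, ext, decoder):
        """Associate a file extension (e.g. '.mp4') with a decoder."""
        self._decoders[ext] = decoder

    def decode(self, path):
        """Pick the decoder by extension, as the composite player does
        after reading the call path from the first playing information."""
        ext = os.path.splitext(path)[1].lower()
        if ext not in self._decoders:
            raise ValueError(f"no decoder registered for {ext}")
        return self._decoders[ext](path)

player = CompositePlayer()
player.register(".mp4", lambda p: ("video", p))
player.register(".gif", lambda p: ("image", p))
```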
It should be noted that the method of the embodiment of the present application may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the multiple devices may only perform one or more steps of the method of the embodiment, and the multiple devices interact with each other to complete the method.
It should be noted that the above describes some embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Based on the same inventive concept, corresponding to the method of any embodiment, the application also provides an animation playing device.
Referring to fig. 3, the animation playback apparatus includes:
an obtaining module 302 configured to obtain a target animation file and a project file associated with the target animation file; the target animation file comprises a plurality of animation original files;
the analysis module 304 is configured to analyze the engineering file to obtain first playing information and second playing information; the first playing information is used for representing the playing information of the target animation file, and the second playing information is used for representing the playing information of each animation original file;
a playing module 306 configured to play the target animation file according to the first playing information and the second playing information based on the plurality of animation original files.
In some embodiments, the construction of the project file comprises:
acquiring attribute information of a plurality of animation original files;
determining the first playing information based on the attribute information and the association information among the plurality of animation original files;
and recording the first playing information to generate the project file.
In some embodiments, the attribute information at least includes a name, a call path, a duration, and an occupied memory size of the animation original file.
In some embodiments, the first playing information includes a total number of playing layers, a total playing time length, and a plurality of second playing information of the target animation file; the second playing information is used for representing the playing information of each animation original file.
In some embodiments, the second playing information at least includes a playing start time, a playing end time, the playing layer where the animation original file is located during playing, a playing position in the playing layer, a playing size, and a display state of the animation original file.
In some embodiments, the total playing duration is a total playing frame number.
In some embodiments, the recording the first playing information and generating the project file includes:
recording the first playing information in a json format;
and carrying out binary conversion on the first playing information to generate the project file.
In some embodiments, the play module 306 is further configured to:
calling the animation original file based on the calling path of each animation original file;
and decoding the animation original file through a decoder matched with the animation original file, and playing the target animation file according to the first playing information.
For convenience of description, the above apparatus is described as being divided into modules by function, which are described separately. Of course, when the present application is implemented, the functions of the modules may be realized in the same one or more pieces of software and/or hardware.
The apparatus of the foregoing embodiment is used to implement the corresponding animation playing method in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to any of the above embodiments, the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the animation playing method according to any of the above embodiments is implemented.
Fig. 4 is a schematic diagram illustrating a more specific hardware structure of an electronic device according to this embodiment, where the device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 are communicatively coupled to each other within the device via bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present disclosure.
The memory 1020 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs; when the technical solutions provided by the embodiments of the present specification are implemented by software or firmware, the relevant program codes are stored in the memory 1020 and called for execution by the processor 1010.
The input/output interface 1030 is used for connecting an input/output module to input and output information. The input/output module may be configured as a component within the device (not shown in the figure) or may be externally connected to the device to provide the corresponding functions. The input devices may include a keyboard, a mouse, a touch screen, a microphone, and various sensors, and the output devices may include a display, a speaker, a vibrator, an indicator light, and the like.
The communication interface 1040 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can communicate in a wired manner (e.g., USB, network cable) or in a wireless manner (e.g., mobile network, Wi-Fi, Bluetooth).
Bus 1050 includes a path that transfers information between various components of the device, such as processor 1010, memory 1020, input/output interface 1030, and communication interface 1040.
It should be noted that although the above-mentioned device only shows the processor 1010, the memory 1020, the input/output interface 1030, the communication interface 1040 and the bus 1050, in a specific implementation, the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the above embodiment is used to implement the corresponding animation playing method in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to any of the above-mentioned embodiment methods, the present application further provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to execute the animation playback method according to any of the above-mentioned embodiments.
Computer-readable media of the present embodiments include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The computer instructions stored in the storage medium of the foregoing embodiment are used to enable the computer to execute the animation playing method according to any embodiment, and have the beneficial effects of the corresponding method embodiment, which are not described herein again.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the spirit of the present application, features of the above embodiments or of different embodiments may also be combined, steps may be implemented in any order, and many other variations of the different aspects of the embodiments of the present application exist as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the application. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the application, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the application are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that the embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, the discussed embodiments may be used with other memory architectures, such as dynamic RAM (DRAM).
The present embodiments are intended to embrace all such alternatives, modifications, and variations as fall within the broad scope of the appended claims. Therefore, any omissions, modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the embodiments of the present application are intended to be included within the scope of the present application.

Claims (10)

1. An animation playing method, comprising:
acquiring a target animation file and a pre-constructed project file associated with the target animation file; the target animation file comprises a plurality of animation original files;
analyzing the engineering file to obtain first playing information; the first playing information is used for representing the playing information of the target animation file;
and playing the target animation file according to the first playing information based on the plurality of animation original files.
2. The method of claim 1, wherein the building of the project file comprises:
acquiring attribute information of a plurality of animation original files;
determining the first playing information based on the attribute information and the association information among the plurality of animation original files;
and recording the first playing information to generate the project file.
3. The method according to claim 2, wherein the attribute information at least includes a name, a call path, a duration, and an occupied memory size of the animation original file.
4. The method according to claim 1, wherein the first playing information includes a total number of playing layers, a total playing time length and a plurality of second playing information of the target animation file; the second playing information is used for representing the playing information of each animation original file.
5. The method according to claim 4, wherein the second playing information at least includes a playing start time, a playing end time, the playing layer where the animation original file is located during playing, a playing position in the playing layer, a playing size, and a display state of the animation original file.
6. The method of claim 4, wherein the total duration of play comprises a total number of frames of play.
7. The method of claim 2, wherein the recording the first playback information and generating the project file comprises:
recording the first playing information in a json format;
and carrying out binary conversion on the first playing information to generate the project file.
8. The method according to claim 3, wherein said playing the target animation file according to the first playing information based on the plurality of animation original files comprises:
calling the animation original file based on the calling path of each animation original file;
decoding the animation original file through a decoder matched with the animation original file, and playing the target animation file according to the first playing information.
9. An animation playback apparatus, comprising:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is configured to acquire a target animation file and a project file related to the target animation file; the target animation file comprises a plurality of animation original files;
the analysis module is configured to analyze the engineering file to obtain first playing information and second playing information; the first playing information is used for representing the playing information of the target animation file, and the second playing information is used for representing the playing information of each animation original file;
and the playing module is configured to play the target animation file according to the first playing information and the second playing information based on the plurality of animation original files.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the program.
CN202211169659.0A 2022-09-26 2022-09-26 Animation playing method and device and electronic equipment Pending CN115272536A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211169659.0A CN115272536A (en) 2022-09-26 2022-09-26 Animation playing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN115272536A true CN115272536A (en) 2022-11-01

Family

ID=83756246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211169659.0A Pending CN115272536A (en) 2022-09-26 2022-09-26 Animation playing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN115272536A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080122849A1 (en) * 2005-06-02 2008-05-29 Tencent Technology (Shenzhen) Company Limited Method for displaying animation and system thereof
CN112348928A (en) * 2020-11-25 2021-02-09 北京沃东天骏信息技术有限公司 Animation synthesis method, animation synthesis device, electronic device, and medium
CN113538633A (en) * 2021-07-23 2021-10-22 北京达佳互联信息技术有限公司 Animation playing method and device, electronic equipment and computer readable storage medium
WO2022193141A1 (en) * 2021-03-16 2022-09-22 华为技术有限公司 Multimedia file playing method and related apparatus


Similar Documents

Publication Publication Date Title
US10939069B2 (en) Video recording method, electronic device and storage medium
US10891032B2 (en) Image reproduction apparatus and method for simultaneously displaying multiple moving-image thumbnails
US11189320B2 (en) System and methods for concatenating video sequences using face detection
CN112181554B (en) Interactive interface display method, device, electronic device and storage medium
CN111818123A (en) Network front-end remote playback method, device, equipment and storage medium
CN109672902A (en) A kind of video takes out frame method, device, electronic equipment and storage medium
US9176607B2 (en) Input/output apparatus for displaying superposed images including a handwritten image
CN105589667B (en) Method and device for capturing display image of display equipment
CN110727825A (en) Animation playing control method, device, server and storage medium
CN113741753A (en) Revocation method, electronic device, storage medium, and computer program product
CN111565336B (en) Video playing method and device
WO2022227329A1 (en) Media file generation method and device, and media file playback method and device
RU2679562C1 (en) Method of video playback and device
CN115272536A (en) Animation playing method and device and electronic equipment
CN108831510B (en) Method, device, terminal and storage medium for dotting audio and video files
CN114339289B (en) Video playing processing method
US20070101270A1 (en) Method and system for generating a presentation file for an embedded system
JP2020509624A (en) Method and apparatus for determining a time bucket between cuts in audio or video
CN114979531A (en) Double-recording method for android terminal to support real-time voice recognition
CN104462249B (en) Webpage loading method and device
US20230377606A1 (en) Video editing projects using single bundled video files
CN111050106B (en) Video playback method, device and computer storage medium
CN111813994B (en) Data processing and file playback method and device based on interactive whiteboard
CN115643442A (en) Audio and video converging recording and playing method, device, equipment and storage medium
KR20090078198A (en) Apparatus for processing moving image ancillary information using script and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20221101