CN112652039A - Animation segmentation data acquisition method, segmentation method, device, equipment and medium - Google Patents

Animation segmentation data acquisition method, segmentation method, device, equipment and medium

Info

Publication number
CN112652039A
CN112652039A (application CN202011536058.XA)
Authority
CN
China
Prior art keywords
animation
segment
current
frame
segmentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011536058.XA
Other languages
Chinese (zh)
Inventor
包炎
赵男
施一东
胡婷婷
刘超
师锐
李鑫培
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011536058.XA
Publication of CN112652039A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/7867 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, title and artist information, manually generated time, location and usage information, user ratings

Abstract

The invention discloses an animation segmentation data acquisition method, an animation segmentation method, and a corresponding device, equipment, and medium. The animation segment data acquisition method comprises: acquiring, in real time, segment data set by a user during animation drawing, wherein the segment data comprises a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment; and importing the segment data into a corresponding configuration file according to a segment data export command. The configuration file can then be used to solve the problem that existing animation segmentation methods have low accuracy.

Description

Animation segmentation data acquisition method, segmentation method, device, equipment and medium
Technical Field
Embodiments of the invention relate to the technical field of computer programs, and in particular to an animation segmentation data acquisition method, an animation segmentation method, and a corresponding device, equipment, and medium.
Background
At present, animations on the market are created in DCC (Digital Content Creation) software and then split in the engine according to the engine's animation time axis and the actual frame numbers in the DCC tool.
However, in most cases the animation is split in the engine by a planner or an engine-side animator who need not have been involved in drawing the animation and therefore may not understand it well. As a result, splitting errors can occur: the segment actually produced in the engine may differ from the segment the person who drew the animation intended. Furthermore, setting errors often occur when animation segments are later modified in the engine. The existing animation segmentation approach therefore suffers from low accuracy.
Disclosure of Invention
The invention provides an animation segmentation data acquisition method, an animation segmentation method, and a corresponding device, equipment, and medium, so as to solve the problem that existing animation segmentation methods have low accuracy.
In a first aspect, an embodiment of the present invention provides an animation segment data obtaining method, executed by a processor of an animation rendering device, including:
acquiring segmentation data set by a user in an animation drawing process in real time, wherein the segmentation data comprises a start frame identifier, an end frame identifier, a playing time length and special effect data of each animation frame segment;
and according to the segment data export command, importing the segment data into a corresponding configuration file.
In a second aspect, an embodiment of the present invention further provides an animation segmentation method, which is performed by an animation engine, and includes:
acquiring a drawn target animation and the configuration file, generated by the animation segment data acquisition method described above, that corresponds to the target animation, wherein the configuration file comprises segment data of the target animation;
and completing a segmentation operation on the target animation according to the segment data, so as to update the target animation.
In a third aspect, an embodiment of the present invention further provides an animation segment data obtaining apparatus, where the apparatus includes:
an acquisition module, configured to acquire, in real time, segment data set by a user during animation drawing, wherein the segment data comprises a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment;
and the importing module is used for importing the segmented data into the corresponding configuration file according to the segmented data exporting command.
In a fourth aspect, an embodiment of the present invention further provides an animation segmentation apparatus, including:
an acquisition module, configured to acquire a drawn target animation and the configuration file, generated by the animation segment data acquisition method, that corresponds to the target animation, wherein the configuration file comprises segment data of the target animation;
and the segmentation module is used for finishing the segmentation operation of the target animation according to the segmentation data so as to update the target animation.
In a fifth aspect, an embodiment of the present invention further provides a computer device, where the computer device includes:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement an animation segmentation method as provided by embodiments of the invention.
In a sixth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used to perform an animation segmentation method as provided by embodiments of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
Segment data set by a user during animation drawing is acquired in real time, the segment data comprising a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment; the segment data is then imported into a corresponding configuration file according to a segment data export command. Because the configuration file containing the segment data is generated while the user (the person drawing the animation) draws the animation, the animation engine can use it to complete the segmentation operation automatically. Compared with the prior art, in which an engine-side animator segments the animation by experience, this greatly improves both the efficiency and the accuracy of animation segmentation.
Drawings
To illustrate the technical solutions of the exemplary embodiments of the present invention more clearly, the drawings used in describing the embodiments are briefly introduced below. Obviously, the described drawings show only some of the embodiments of the invention, not all of them, and a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an animation segment data obtaining method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an animation segment data acquisition apparatus according to a second embodiment of the present invention;
FIG. 3 is a flowchart illustrating an animation segmentation method according to a third embodiment of the present invention;
FIG. 4 is a flowchart illustrating an animation segmentation method according to a fourth embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an animation segmentation apparatus according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a computer device according to a sixth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described through embodiments with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flowchart of an animation segment data obtaining method according to an embodiment of the present invention, where the method is executed by a processor of an animation rendering device. The present embodiment is applicable to a case where a configuration file for an animation segment is generated based on segment data set by a user in an animation drawing process. The method can be executed by an animation segment data acquisition device, which can be implemented by hardware and/or software, and specifically comprises the following steps:
s110, segment data set by a user in the animation drawing process is obtained in real time, and the segment data comprises a starting frame identifier, an ending frame identifier, playing time and special effect data of each animation frame segment.
In the present embodiment, the processor of the animation rendering device can be understood as the processor of an electronic device that runs an animation drawing program. The electronic device may be a computer, a tablet computer, or a smartphone; the animation drawing program may be Digital Content Creation (DCC) software, which provides the animation drawing function to the user. The user draws the animation through the DCC interface, and once drawing is finished, the drawn target animation can be obtained by reading that interface.
It is to be understood that the target animation may generate at least one animation frame segment by segmentation. An animation frame segment refers to an animation that is composed of a series of temporally successive image frames. The animation frame segments are generated based on animation segments in the animation.
The segment data includes segment information of the target animation, such as a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment. The start frame identifier is the identifier of the first animation frame in the animation frame segment; the end frame identifier is the identifier of the last animation frame in the animation frame segment. In one embodiment, the start frame identifier and the end frame identifier are the frame sequence numbers of the start frame and the end frame, respectively, such as No. 1 or No. 9.
In the present embodiment, the playing time length is the duration for which each animation frame segment is played. It can be understood that the playing time length of an animation frame segment equals the time difference between its start frame and its end frame when the target animation is played.
The special effect data describes the playing effect of each animation frame segment. The playing effect may be at least one of loop playing, variable-speed playing, a visual special effect, and a sound special effect. Variable-speed playing means speeding up or slowing down the playback rate of the animation frame segment, for example to 0.5, 1.25, or 2 times the normal rate. A visual special effect may set lighting, color grading, or background, apply a painting style, or enlarge, flicker, or decorate a particular object in the animation frame segment. A sound special effect adds a preset sound, such as an explosion, footsteps, or a breaking sound, to the animation frame segment.
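By way of illustration only, the following Python sketch shows one possible in-memory representation of the segment data described above; the class and field names (SegmentData, SpecialEffectData, loop_count, and so on) are assumptions of this sketch and are not prescribed by the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class SpecialEffectData:
        """Playing-effect settings for one animation frame segment (illustrative fields)."""
        loop: bool = False              # whether the segment is played in a loop
        loop_count: int = 0             # number of loop repetitions, if looping
        speed: float = 1.0              # playback rate, e.g. 0.5, 1.25 or 2.0
        visual_effect: str = ""         # e.g. "painting_style", "highlight", "flicker"
        sound_effect: str = ""          # e.g. "explosion", "footsteps", "breaking"
        effect_duration_s: float = 0.0  # how long the effect is applied within the segment

    @dataclass
    class SegmentData:
        """Segment data recorded for one animation frame segment while the user draws."""
        segment_id: str         # animation frame segment identifier
        start_frame: int        # start frame identifier (frame sequence number)
        end_frame: int          # end frame identifier (frame sequence number)
        play_duration_s: float  # playing time length of the segment
        effects: SpecialEffectData = field(default_factory=SpecialEffectData)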
And S120, importing the segmented data into a corresponding configuration file according to the segmented data export command.
The segment data export command can be generated when it is detected that the user triggers a corresponding export operation; it is used to export the segment data configured by the user into a file with a specific format, namely the configuration file, so that the exported configuration file can be read when the animation is subsequently segmented. The format of the configuration file may be ini, json, fbx, or the like. It can be understood that the configuration file needs to be stored in association with the corresponding animation; it may be stored in a memory of the animation drawing device or on a server that interacts with the animation drawing device, so that the animation engine can obtain both the configuration file and the corresponding animation.
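Continuing the sketch above, and again only as a non-authoritative illustration, the export step might look as follows once the user's export operation is detected; the function name export_segments and the JSON layout are assumptions of this sketch (an ini or fbx side-car file would follow the same pattern).

    import json
    from dataclasses import asdict

    def export_segments(segments, config_path="target_animation.segments.json"):
        """Write the segment data collected in real time to a configuration file."""
        payload = {"segments": [asdict(seg) for seg in segments]}  # SegmentData instances
        with open(config_path, "w", encoding="utf-8") as f:
            json.dump(payload, f, ensure_ascii=False, indent=2)
        return config_path

    # Triggered when the segment data export command is received:
    # export_segments(collected_segments)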
According to the technical solution of this embodiment, segment data set by the user during animation drawing is acquired in real time, the segment data comprising a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment, and the segment data is imported into the corresponding configuration file according to the segment data export command. Because the configuration file containing the segment data is generated while the user (the person drawing the animation) draws the animation, the animation engine can use it to complete the segmentation operation automatically. Compared with the prior art, in which an engine-side animator segments the animation by experience, this greatly improves both the efficiency and the accuracy of animation segmentation.
In one embodiment, the animation segment data acquisition method comprises the following concrete steps:
(1) acquiring the DCC animation resource, i.e. the animation drawn by the user in the DCC software;
(2) recording the key splitting time points of the animation. A key splitting time point is the splitting time point corresponding to each animation frame segment and can be represented by the start frame identifier of that segment. This step further includes recording the key splitting time points in a data file (namely the configuration file); the file format may be ini, json, or fbx, and the data file includes the split start frame (the frame corresponding to the start frame identifier), the split end frame (the frame corresponding to the end frame identifier), whether to loop, the special effect playing time, and so on. The special effect playing time is the length of time for which the playing effect is applied within the animation frame segment; for example, if an animation frame segment has a playing time length of 5 s and its playing effect is a painting-style special effect, the special effect playing time is 3 s.
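For the example just given (a segment with a playing time length of 5 s and a 3 s painting-style special effect), the exported data file might contain content along the following lines; every key name and value is illustrative only.

    # Illustrative content of the exported data file (shown as a Python literal):
    example_config = {
        "segments": [
            {
                "segment_id": "seg_01",
                "start_frame": 1,        # split start frame
                "end_frame": 120,        # split end frame
                "play_duration_s": 5.0,  # playing time length
                "effects": {
                    "loop": False,                      # whether to loop
                    "visual_effect": "painting_style",  # playing effect
                    "effect_duration_s": 3.0            # special effect playing time
                }
            }
        ]
    }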
Example two
Fig. 2 is a schematic structural diagram of an animation segment data obtaining apparatus according to a second embodiment of the present invention, which is applicable to a case where a corresponding configuration file is generated based on segment data set by a user during an animation drawing process, so that an animation is segmented based on the configuration file after the drawing is completed. The device specifically includes: an acquisition module 210 and an import module 220.
An obtaining module 210, configured to obtain, in real time, segment data set by a user in an animation drawing process, where the segment data includes a start frame identifier, an end frame identifier, a play duration, and special effect data of each animation frame segment;
the importing module 220 is configured to import the segment data into the corresponding configuration file according to the segment data export command.
According to the technical solution of the animation segment data acquisition apparatus provided by this embodiment of the invention, the acquisition module acquires, in real time, segment data set by the user during animation drawing, the segment data comprising a start frame identifier, an end frame identifier, a playing time length, and special effect data of each animation frame segment, and the importing module imports the segment data into the corresponding configuration file according to the segment data export command. Because the configuration file containing the segment data is generated while the user (the person drawing the animation) draws the animation, the animation engine can use it to complete the segmentation operation automatically. Compared with the prior art, in which an engine-side animator segments the animation by experience, this greatly improves both the efficiency and the accuracy of animation segmentation.
The animation segment data acquisition device provided by the embodiment of the invention can execute the animation segment data acquisition method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that the units and modules included in the above apparatus are merely divided according to functional logic; other divisions are possible as long as the corresponding functions can be realized. In addition, the specific names of the functional units are only intended to distinguish them from one another and do not limit the protection scope of the embodiments of the invention.
EXAMPLE III
Fig. 3 is a flowchart illustrating an animation segmentation method according to a third embodiment of the present invention, where the animation segmentation method is executed by an animation engine. The embodiment is applicable to the situation that the drawn animation is segmented to update the animation based on the configuration file generated by the user in the animation drawing process. The method may be performed by an animation segmentation apparatus, which may be implemented by hardware and/or software, and specifically comprises the steps of:
s310, acquiring the drawn target animation and the configuration file in the animation segment data acquisition method corresponding to the target animation.
In the present embodiment, the animation engine refers to a core component of some interactive real-time image applications, and can implement operations such as animation segmentation.
The user obtains a configuration file through an interface of the animation drawing software, wherein the configuration file comprises segment data which is automatically generated by the animation drawing software in the animation drawing process and/or manually set by an animation designer.
In one embodiment, the user obtains the configuration file from a server connected to the animation software, which is also connected to the animation engine. That is, when the user segments the target animation using the animation engine, the user may download the corresponding configuration file from the server directly through the download instruction.
S320, completing the segmentation operation of the target animation according to the segmentation data to update the target animation.
After the configuration file containing the segment data has been acquired and the user inputs an animation segmentation instruction, the animation engine automatically segments the target animation according to the instruction and the configuration file; when segmentation finishes, the segmented target animation, that is, the updated target animation, is generated.
Specifically, according to the start frame identifier and the end frame identifier of each animation frame segment, an animation segment can be positioned in the target animation, and the animation segment is the main content played in the animation frame segment, that is, the animation frame segment is generated by the animation segment. The animation segment is composed of a start frame, an end frame, and all intermediate frames between the start frame and the end frame.
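As a non-authoritative illustration, an engine-side tool might read the configuration file and locate each animation segment roughly as follows; the sketch assumes the JSON layout used in the earlier examples and assumes that frame identifiers can be used directly as indices into the frame list of the target animation.

    import json

    def load_segment_config(config_path):
        """Read the configuration file exported by the drawing-side tool."""
        with open(config_path, "r", encoding="utf-8") as f:
            return json.load(f)["segments"]

    def locate_segment(target_frames, start_frame, end_frame):
        """Return the start frame, the end frame and all intermediate frames of one segment.
        `target_frames` is assumed to be a list of frames indexed by frame identifier."""
        return target_frames[start_frame:end_frame + 1]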
According to the technical solution of this embodiment, the drawn target animation and the configuration file, generated by the animation segment data acquisition method, that corresponds to the target animation are acquired, and the segmentation operation on the target animation is completed according to the segment data so as to update the target animation. Segmentation is thus performed on the basis of a predetermined configuration file, and an animator no longer needs to segment the animation manually, which solves the low accuracy of the existing animation segmentation approach while also enabling fast segmentation.
Example four
Fig. 4 is a flowchart of an animation segmentation method according to a fourth embodiment of the present invention. On the basis of the third embodiment, this embodiment further refines the step of "completing the segmentation operation of the target animation according to the segment data to update the target animation". Explanations of terms that are the same as or correspond to those in the above embodiments are omitted. Referring to fig. 4, the animation segmentation method provided by this embodiment includes the following steps:
s410, acquiring the drawn target animation and the configuration file in the animation segment data acquisition method corresponding to the target animation.
And S420, reading the current animation frame segment identification, and the current start frame identification and the current end frame identification corresponding to the current animation frame segment identification from the segment data according to a preset reading sequence.
The preset reading sequence is used for reading the segment data corresponding to each animation frame segment in the segment data according to the specified sequence. It will be appreciated that the segment data corresponding to each animation frame segment includes a corresponding animation frame segment identifier to facilitate distinguishing the segment data of different animation frame segments. The current animation frame segment identification refers to identification corresponding to the read segment data of the current animation frame segment.
S430, positioning an animation frame corresponding to the current starting frame identification in the target animation to be used as a starting animation frame, and positioning an animation frame corresponding to the current ending frame identification in the target animation to be used as an ending animation frame.
The current start frame identifier and the current end frame identifier may be frame numbers or corresponding time information in the target animation. And positioning the corresponding starting animation frame and ending animation frame in the target animation according to the current starting frame identifier and the current ending frame identifier. It is understood that, according to the starting animation frame and the ending animation frame, the animation segment corresponding to the current animation frame segment can be determined in the target animation.
S440, adding the current animation frame segment identification to animation frame segments formed by the starting animation frame, the ending animation frame and all animation frames between the starting animation frame and the ending animation frame to generate a current animation frame segment.
And taking the starting animation frame, the ending animation frame and all animation frames between the starting animation frame and the ending animation frame as a current animation frame segment, and adding the current animation frame segment identification to the current animation frame segment to generate a current animation frame segment.
By performing the above steps in sequence, each animation frame segment is split out of the target animation and given its corresponding animation frame segment identifier, thereby completing all segmentation operations on the target animation.
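A minimal sketch of steps S420-S440, under the same assumptions as the earlier examples (the target animation is a list of frames indexable by frame identifier, and the segment data is a list of dictionaries read in its stored order), might look like this:

    def split_target_animation(target_frames, segments):
        """Split the target animation into identified animation frame segments (S420-S440)."""
        animation_frame_segments = {}
        for entry in segments:               # preset reading order: the stored list order
            seg_id = entry["segment_id"]     # current animation frame segment identifier
            start = entry["start_frame"]     # current start frame identifier
            end = entry["end_frame"]         # current end frame identifier
            # start animation frame, end animation frame and all frames in between
            frames = target_frames[start:end + 1]
            animation_frame_segments[seg_id] = frames  # tag the segment with its identifier
        return animation_frame_segments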
Optionally, the special effect data includes a cycle number of a cycle frame segment, and after the current animation frame segment is generated, the method further includes: extracting the cycle number of the current animation frame segment from the segment data; and adding the cycle times to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is played circularly for the cycle times.
Here, the cycle number is the number of times the current animation frame segment is played in a loop. Specifically, after the current animation frame segment is generated, its number of loop playbacks is set according to the cycle number, and the current animation frame segment is updated to the looped version. For example, if the cycle number is 3, the updated current animation frame segment consists of the original current animation frame segment repeated 3 times. In this embodiment, the cycle number is added to the current animation frame segment to update it, so that when the target animation is played, the updated current animation frame segment is played in a loop for the cycle number of times; looping of each split animation frame segment based on a preset cycle number is thereby realized during animation segmentation.
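By way of illustration, applying the cycle number could be sketched as follows; repeating the frame list is only one possible way to realize the looped playback described above.

    def apply_loop_count(segment_frames, loop_count):
        """Update the current animation frame segment so that it plays `loop_count` times."""
        if loop_count <= 1:
            return segment_frames
        return segment_frames * loop_count  # e.g. loop_count == 3 -> segment repeated 3 times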
Optionally, after the generating the current animation frame segment, the method further includes: extracting the playing time length of the current animation frame segment from the segment data; and adding the playing time length to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is circularly played for the playing time length.
Specifically, when the playing time length of the current animation frame segment is longer than the interval between its start animation frame and its end animation frame, the current animation frame segment can be played in a loop within that playing time. For example, if the interval between the start animation frame and the end animation frame of the current animation frame segment is 8 s and the playing time length is 16 s, the playing time length of 16 s is added to the current animation frame segment, and the segment is updated to the version that is played through twice.
In the present embodiment, the playing time length of the current animation frame segment is extracted from the segment data; the playing time length is added to the current animation frame section to update the current animation frame section, so that when the target animation is played, the updated current animation frame section is played circularly for the playing time length, and the purpose that when the animation is split, the animation frame sections are circularly processed based on the set time length is achieved.
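A corresponding sketch for the playing-time-length case, using the 8 s / 16 s example above, is given below; the integer division is an assumption of this sketch for playing time lengths that are not exact multiples of the interval.

    def apply_play_duration(segment_frames, interval_duration_s, play_duration_s):
        """Update the segment so that it fills the configured playing time length.
        E.g. an 8 s interval with a 16 s playing time length yields two repetitions."""
        if play_duration_s <= interval_duration_s:
            return segment_frames
        repeats = int(play_duration_s // interval_duration_s)
        return segment_frames * repeats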
Optionally, the extracting the loop times of the current animation frame segment from the segment data includes: detecting whether the segment data contains a loop identification of the current animation frame segment; and if so, extracting the cycle number of the current animation frame segment from the segment data.
And if the segment data contains the cycle identifier of the current animation frame segment, extracting the corresponding cycle times. In another embodiment, it may also be determined whether loop processing needs to be performed on the current animation frame segment according to a value of the loop identifier, for example, when the loop identifier is 0, it indicates that the current animation frame segment does not loop, and when the loop identifier is 1, it indicates that the current animation frame segment loops. It is to be understood that, upon detecting that the loop identification of the current animation frame segment is included in the segment data, the number of loops of the current animation frame segment may also be determined based on the play time length.
In this embodiment, whether the segment data contains the loop identifier of the current animation frame segment is detected; if so, the cycle number of the current animation frame segment is extracted from the segment data, so that the cycle number of the current animation frame segment is determined accurately.
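The detection of the loop identifier might be sketched as follows; the keys loop and loop_count follow the illustrative layout used earlier, and falling back to the playing time length is the alternative mentioned above.

    def extract_loop_count(entry, interval_duration_s=None):
        """Return the cycle number of the current segment, or None if it is not looped."""
        effects = entry.get("effects", {})
        if not effects.get("loop"):          # loop identifier absent or 0: no looping
            return None
        if effects.get("loop_count"):
            return effects["loop_count"]
        if interval_duration_s:              # alternatively, derive it from the playing time length
            return int(entry["play_duration_s"] // interval_duration_s)
        return 1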
Optionally, the segment data further includes a time interval between adjacent animation frame segments, and after the generating the current animation frame segment, further includes: reading a current time interval corresponding to a current animation frame segment from the segment data; and setting the playing interval time between the current animation frame segment and the next adjacent animation frame segment as the current time interval so as to update the target animation.
The time interval between adjacent animation frame segments refers to the playing time interval between the current animation frame segment and the next adjacent animation frame segment; specifically, it refers to the time interval between the ending animation frame of the current animation frame segment and the starting animation frame of the next adjacent animation frame segment. During the time interval, a still image may be played, and the still image may be a black image, a starting animation frame of a next adjacent animation frame segment, or an ending animation frame of a current animation frame segment. Illustratively, the current time interval corresponding to the current animation frame segment is 2s, and then the next adjacent animation frame segment of the current animation frame segment is played 2s after the current animation frame segment is played.
In this embodiment, the current time interval corresponding to the current animation frame segment is read from the segment data, and the playing interval time between the current animation frame segment and the next adjacent animation frame segment is set as the current time interval, so that when the animation is divided, each animation frame segment is played at intervals based on the set playing interval.
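As a final illustration, the playing interval between adjacent segments could be realized by appending held still frames after the current segment; the frame rate and the choice of the ending frame as the still image are assumptions of this sketch.

    def apply_segment_interval(segment_frames, interval_s, frame_rate=30, hold_frame=None):
        """Append a playing interval after the current segment, before the next adjacent one.
        The gap is filled with a still image (the ending frame unless another frame is given)."""
        if interval_s <= 0:
            return segment_frames
        still = hold_frame if hold_frame is not None else segment_frames[-1]
        return segment_frames + [still] * int(interval_s * frame_rate)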
According to the technical solution of this embodiment, the current animation frame segment identifier, together with the corresponding current start frame identifier and current end frame identifier, is read from the segment data in the preset reading order; the animation frame corresponding to the current start frame identifier is located in the target animation as the start animation frame, and the animation frame corresponding to the current end frame identifier is located as the end animation frame; and the current animation frame segment identifier is added to the animation frame segment formed by the start animation frame, the end animation frame, and all animation frames between them, so as to generate the current animation frame segment. Animation frame segments are thus generated directly from the segment data, without an animator having to split the animation manually, which solves the low accuracy of the existing animation segmentation approach.
EXAMPLE five
Fig. 5 is a schematic structural diagram of an animation segmentation apparatus according to a fifth embodiment of the present invention, which is applicable to a situation where a drawn animation is segmented, so as to update the animation, based on a configuration file generated by a user during the animation drawing process. The apparatus specifically includes: an acquisition module 510 and a segmentation module 520.
An obtaining module 510, configured to obtain a drawn target animation and a configuration file in an animation segment data obtaining method corresponding to the target animation, where the configuration file includes segment data of the target animation;
a segmentation module 520, configured to complete a segmentation operation of the target animation according to the segment data to update the target animation.
According to the technical solution of this embodiment, the acquisition module acquires the drawn target animation and the configuration file, generated by the animation segment data acquisition method, that corresponds to the target animation, and the segmentation module completes the segmentation operation on the target animation according to the segment data so as to update the target animation. Segmentation of the animation is thus performed on the basis of a predetermined configuration file and an animator no longer needs to segment the animation manually, which solves the low accuracy of the existing animation segmentation approach while also enabling fast segmentation.
Optionally, the segmentation module 520 includes a reading unit, a positioning unit, and a generating unit, wherein:
a reading unit, configured to read, according to a preset reading sequence, a current animation frame segment identifier, and a current start frame identifier and a current end frame identifier corresponding to the current animation frame segment identifier from the segment data;
the positioning unit is used for positioning an animation frame corresponding to the current starting frame identifier in the target animation to be used as a starting animation frame, and positioning an animation frame corresponding to the current ending frame identifier in the target animation to be used as an ending animation frame;
and the generating unit is used for adding the current animation frame segment identification to animation frame segments formed by the starting animation frame, the ending animation frame and all animation frames between the starting animation frame and the ending animation frame so as to generate the current animation frame segment.
Optionally, the special effect data includes a loop number of a loop frame segment, and the animation segmenting apparatus further includes a first loop module, configured to extract the loop number of the current animation frame segment from the segment data after the current animation frame segment is generated; and adding the cycle times to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is played circularly for the cycle times.
Optionally, the animation segmenting apparatus further includes a second loop module, configured to extract a playing time length of a current animation frame segment from the segment data; and adding the playing time length to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is circularly played for the playing time length.
Optionally, the first loop module is specifically configured to detect whether the segment data includes a loop identifier of a current animation frame segment; and if so, extracting the cycle number of the current animation frame segment from the segment data.
Optionally, the segment data further includes a time interval between adjacent animation frame segments, and the animation segmentation apparatus further includes an interval setting module, configured to, after the current animation frame segment is generated, read a current time interval corresponding to the current animation frame segment from the segment data; and setting the playing interval time between the current animation frame segment and the next adjacent animation frame segment as the current time interval so as to update the target animation.
The animation segmentation device provided by the embodiment of the invention can execute the animation segmentation method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that the units and modules included in the above apparatus are merely divided according to functional logic; other divisions are possible as long as the corresponding functions can be realized. In addition, the specific names of the functional units are only intended to distinguish them from one another and do not limit the protection scope of the embodiments of the invention.
EXAMPLE six
Fig. 6 is a schematic structural diagram of a computer device according to a sixth embodiment of the present invention. FIG. 6 illustrates a block diagram of an exemplary computer device 60 suitable for use in implementing embodiments of the present invention. The computer device 60 shown in fig. 6 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present invention.
As shown in fig. 6, computer device 60 is embodied in the form of a general purpose computing device. The components of computer device 60 may include, but are not limited to: one or more processors or processing units 601, a system memory 602, and a bus 603 that couples various system components including the system memory 602 and the processing unit 601.
Bus 603 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 60 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 60 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 602 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)604 and/or cache memory 605. The computer device 60 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 606 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to the bus 603 by one or more data media interfaces. Memory 602 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 608 having a set (at least one) of program modules 607 may be stored, for example, in memory 602, such program modules 607 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 607 generally perform the functions and/or methods of the described embodiments of the invention.
Computer device 60 may also communicate with one or more external devices 609 (e.g., keyboard, pointing device, display 610, etc.), with one or more devices that enable a user to interact with computer device 60, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 60 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 611. Also, computer device 60 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 612. As shown, network adapter 612 communicates with the other modules of computer device 60 via bus 603. It should be appreciated that although not shown in FIG. 6, other hardware and/or software modules may be used in conjunction with computer device 60, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 601 executes various functional applications and data processing by running a program stored in the system memory 602, for example, to implement the steps of an animation segmentation method provided by the embodiment of the present invention, the method including:
acquiring a drawn target animation and a configuration file in an animation segment data acquisition method corresponding to the target animation;
completing a segmentation operation of the target animation according to the segmentation data to update the target animation.
Of course, those skilled in the art will appreciate that the processor may also implement the solution of the animation segmentation method provided by any embodiment of the present invention.
EXAMPLE seven
An embodiment of the present invention also provides a storage medium containing computer-executable instructions for performing an animation segmentation method when executed by a computer processor. The method comprises the following steps:
acquiring a drawn target animation and a configuration file in an animation segment data acquisition method corresponding to the target animation;
completing a segmentation operation of the target animation according to the segmentation data to update the target animation.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An animation segment data acquisition method, executed by a processor of an animation rendering device, comprising:
acquiring segmentation data set by a user in an animation drawing process in real time, wherein the segmentation data comprises a start frame identifier, an end frame identifier, a playing time length and special effect data of each animation frame segment;
and according to the segment data export command, importing the segment data into a corresponding configuration file.
2. An animation segmentation method, performed by an animation engine, comprising:
acquiring a drawn target animation and the configuration file of claim 1 that corresponds to the target animation;
completing a segmentation operation of the target animation according to the segmentation data to update the target animation.
3. The method of claim 2, wherein completing the segmentation operation of the target animation according to the segment data to update the target animation comprises:
reading the current animation frame segment identification, and the current starting frame identification and the current ending frame identification corresponding to the current animation frame segment identification from the segment data according to a preset reading sequence;
positioning an animation frame corresponding to the current starting frame identifier in the target animation to be used as a starting animation frame, and positioning an animation frame corresponding to the current ending frame identifier in the target animation to be used as an ending animation frame;
and adding the current animation frame segment identification to animation frame segments formed by the starting animation frame, the ending animation frame and all animation frames between the starting animation frame and the ending animation frame so as to generate the current animation frame segment.
4. The method of claim 3, wherein the special effect data includes a number of cycles for a cycle frame segment, and further comprising, after the generating the current animation frame segment:
extracting the cycle number of the current animation frame segment from the segment data;
and adding the cycle times to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is played circularly for the cycle times.
5. The method of claim 3, further comprising, after said generating a current animation frame segment:
extracting the playing time length of the current animation frame segment from the segment data;
and adding the playing time length to the current animation frame segment to update the current animation frame segment, so that when the target animation is played, the updated current animation frame segment is circularly played for the playing time length.
6. The method of claim 4, wherein said extracting a number of cycles for a current animated frame segment from the segment data comprises:
detecting whether the segment data contains a loop identification of the current animation frame segment;
and if so, extracting the cycle number of the current animation frame segment from the segment data.
7. The method of claim 3, wherein the segment data further includes a time interval between adjacent animation frame segments, and further comprising, after the generating a current animation frame segment:
reading a current time interval corresponding to a current animation frame segment from the segment data;
and setting the playing interval time between the current animation frame segment and the next adjacent animation frame segment as the current time interval so as to update the target animation.
8. An animation segmentation apparatus, comprising:
an obtaining module, configured to obtain a drawn target animation and the configuration file of claim 1 that corresponds to the target animation, where the configuration file includes segment data of the target animation;
and the segmentation module is used for finishing the segmentation operation of the target animation according to the segmentation data so as to update the target animation.
9. A computer device, characterized in that the computer device comprises:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the animation segmentation method of any of claims 2-7.
10. A storage medium containing computer-executable instructions for performing the animation segmentation method as recited in any one of claims 2-7 when executed by a computer processor.
CN202011536058.XA 2020-12-23 2020-12-23 Animation segmentation data acquisition method, segmentation method, device, equipment and medium Pending CN112652039A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011536058.XA CN112652039A (en) 2020-12-23 2020-12-23 Animation segmentation data acquisition method, segmentation method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011536058.XA CN112652039A (en) 2020-12-23 2020-12-23 Animation segmentation data acquisition method, segmentation method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN112652039A 2021-04-13

Family

ID=75359460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011536058.XA Pending CN112652039A (en) 2020-12-23 2020-12-23 Animation segmentation data acquisition method, segmentation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112652039A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103873883A (en) * 2014-03-06 2014-06-18 小米科技有限责任公司 Video playing method and device and terminal equipment
CN104392633A (en) * 2014-11-12 2015-03-04 国家电网公司 Interpretation control method oriented to power system simulating training
CN109741427A (en) * 2018-12-14 2019-05-10 新华三大数据技术有限公司 Animation data processing method, device, electronic equipment and storage medium
CN110533751A (en) * 2019-08-30 2019-12-03 武汉真蓝三维科技有限公司 A kind of three-dimensional visualization cartoon making and playback method with interactive function

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114168878A (en) * 2021-11-23 2022-03-11 上海鸿米信息科技有限责任公司 Dynamic effect playing method, device, equipment, storage medium and program product


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210413