CN117152312A - Game scenario animation playing method, device, equipment and storage medium - Google Patents

Game scenario animation playing method, device, equipment and storage medium

Info

Publication number
CN117152312A
Authority
CN
China
Prior art keywords
track
data
scene
lens
game
Prior art date
Legal status
Pending
Application number
CN202311109391.6A
Other languages
Chinese (zh)
Inventor
陈永坤
Current Assignee
Guangzhou Sanqi Jichuang Network Technology Co., Ltd.
Original Assignee
Guangzhou Sanqi Jichuang Network Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangzhou Sanqi Jichuang Network Technology Co., Ltd.
Priority to CN202311109391.6A
Publication of CN117152312A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/04 - Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a method, a device, equipment and a storage medium for playing a game scenario animation, and relates to the technical field of computer games. The method comprises the following steps: performing deserialization processing on data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data; generating the scene track, the lens track, the audio track and the character track according to the track data; and playing the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track. This technical solution solves the problems in the prior art of the high cost and technical requirements of producing scenario animations, improves the development efficiency of game resources, and reduces the size of the scenario file in the game package.

Description

Game scenario animation playing method, device, equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a method, an apparatus, a device, and a storage medium for playing a game scenario animation.
Background
The scenario cutscene is an important part of a game: it advances the game's progress, showcases the game's artistry and attracts the players' attention, motivating them to keep playing and immersing them in the atmosphere the game creates.
In the prior art, a scenario cutscene in a game is a video file that is shot or rendered in advance; the video file is packaged into the game package as a game resource, and when a player triggers the corresponding game scenario, the game plays the cutscene from the video file. However, prefabricating video files for scenario cutscenes carries high cost and technical requirements, and the video files additionally increase the size of the game package.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for playing a game scenario animation, which solve the problems in the prior art of the high cost and technical requirements of producing scenario animations, improve the development efficiency of game resources and reduce the size of the scenario file in the game package.
In a first aspect, the present application provides a game scenario animation playing method, including:
performing deserialization processing on data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data;
generating the scene track, the lens track, the audio track and the character track according to the track data;
and playing the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
In a second aspect, the present application provides a game scenario animation playing apparatus, comprising:
a data acquisition module configured to perform deserialization processing on data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data;
a track generation module configured to generate the scene track, the lens track, the audio track and the character track from the track data;
and a scenario playing module configured to play the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
In a third aspect, the present application provides a game scenario animation playing apparatus, comprising:
one or more processors;
a memory storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the game scenario animation playing method according to the first aspect.
In a fourth aspect, the present application provides a storage medium containing computer executable instructions which, when executed by a computer processor, are used to perform the game scenario animation playing method of the first aspect.
In the application, track data of a scene track, a lens track, an audio track and a character track are obtained by performing deserialization processing on data blocks stored in a scenario file, the data blocks having been obtained by serializing the track data; the scene track, the lens track, the audio track and the character track are generated according to the track data; and the scenario animation of the game is played according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track. With this technical means, playback of the scenario animation is driven at runtime by the time axes of the functional tracks corresponding to the scene, the lens, the sound effects and the characters, and during development the developer only prepares the scene, lens, sound-effect and character data of the scenario animation instead of producing a video file of it. This solves the problems in the prior art of the high cost and technical requirements of producing scenario animations, improves the development efficiency of game resources, and, because the scenario file is stored in a serialized format, reduces the size of the scenario file in the game package.
Drawings
FIG. 1 is a flow chart of a method for playing a game scenario animation provided by an embodiment of the present application;
FIG. 2 is a flow chart of generating a scene track provided by an embodiment of the application;
FIG. 3 is a flowchart of generating a lens track according to an embodiment of the present application;
FIG. 4 is a flow chart of generating a character track provided by an embodiment of the present application;
FIG. 5 is a flow chart of generating interface tracks provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of a game scenario animation playing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a game scenario animation playing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the following detailed description of specific embodiments of the present application is given with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the matters related to the present application are shown in the accompanying drawings. Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The terms "first", "second" and the like in the description and in the claims are used to distinguish between similar elements and do not necessarily describe a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. Objects identified by "first", "second", etc. generally denote a class of objects and do not limit their number; for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
In a more common existing implementation, scenario cutscenes in a game are video files that are shot or rendered in advance; the video files are packaged into the game package as game resources, and when a player triggers the corresponding game scenario, the game plays the cutscene from the video file. However, prefabricating video files for scenario cutscenes carries high cost and technical requirements, and the video files additionally increase the size of the game package. Moreover, a video file is difficult to modify after the game is released, so the cost of optimizing a scenario cutscene is high.
To solve the above problems, this embodiment provides a game scenario animation playing method in which the scene, camera, audio and characters of a scenario animation are presented at runtime through functional tracks such as a scene track, a lens track, an audio track and a character track, so that the scenario animation is played without using a video file.
The game scenario animation playing method provided in this embodiment may be executed by a game scenario animation playing device, which may be implemented in software and/or hardware and may be composed of one physical entity or of two or more physical entities. For example, the game scenario animation playing device may be a game client or a computer running the game client.
The game scenario animation playing device is provided with at least one type of operating system, including but not limited to an Android system, a Linux system and a Windows system. Based on the operating system, the device may install at least one application program, which may be an application program carried by the operating system or one downloaded from a third-party device or server. In this embodiment, the game scenario animation playing device has at least one application program capable of executing the game scenario animation playing method.
For ease of understanding, this embodiment is described by taking a computer as an example of the entity that executes the game scenario animation playing method.
Fig. 1 shows a flowchart of a method for playing a game scenario animation according to an embodiment of the present application. Referring to fig. 1, the game scenario animation playing method specifically includes:
s110, performing deserialization processing on the data blocks stored in the scenario file to obtain track data of a scene track, a lens track, an audio track and a role track; the data block is based on a serialization process of the track data.
In this embodiment, the game scenario system is developed with the Unity game engine. To divide the requirements of each scenario function and to support combining, nesting and collaboratively developing different scenario functions, the framework of the game scenario system adopts a design of multiple functional tracks, and the operations bound to the functional tracks are executed as the time axis advances. The game scenario system supports grouping, nesting and folding of multiple functional tracks, which keeps the design logic clear and the editing panel tidy when developing complex cutscenes. The functional tracks in the game scenario system include a scene track, a lens track, an audio track and a character track. The scene track is used to provide the scenes in the scenario animation; the lens track is used to provide the camera movement in the scenario animation; the audio track is used to provide the sound effects in the scenario animation; and the character track is used to provide the characters in the scenario animation. When the scenes, camera movement, sound effects and characters are combined and played back, the playing effect of the scenario animation is presented.
The multiple functional tracks in the game scenario system adopt a modular scheme: the interface design of each functional track adheres to the same standard while preserving the independence of the track, which reduces code coupling and makes designing or adding new functional tracks simpler in the future. The game scenario system can directly call the interfaces provided by the other systems in the game, so once these interfaces are connected during development, each element of the scenario animation can be developed quickly. The consistency of the interfaces also allows the scenario animation to transition smoothly into the in-game scene, and can even be used to design gameplay. Whereas the various systems of a traditional game can only be debugged after the game has been run to a certain point, the game scenario system provided by this embodiment can preview and debug the design effect of the scenario animation through the game systems called via the consistent interfaces and by directly dragging the time axis of a functional track, avoiding a tedious debugging flow and accelerating development.
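By way of illustration only, the following sketch models nested functional tracks bound to a shared time axis. It is written in Python rather than the engine-side C# the embodiment implies, and every class and field name is an assumption introduced for this example, not part of the disclosure.
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Clip:
    """A piece of content bound to a span of the shared time axis (in frames)."""
    start_frame: int
    end_frame: int
    payload: dict                      # e.g. scene data, lens parameters, audio data

@dataclass
class Track:
    """One functional track; sub_tracks give the grouping/nesting described above."""
    track_id: str
    clips: List[Clip] = field(default_factory=list)
    sub_tracks: List["Track"] = field(default_factory=list)

    def clips_at(self, frame: int) -> List[Clip]:
        """Everything this track and its nested sub-tracks contribute at one frame."""
        hits = [c for c in self.clips if c.start_frame <= frame <= c.end_frame]
        for sub in self.sub_tracks:
            hits.extend(sub.clips_at(frame))
        return hits

scene_track = Track("scene_track_1", clips=[Clip(0, 99, {"scene": "A"})])
scene_track.sub_tracks.append(Track("env_fx", clips=[Clip(10, 20, {"fx": "lightning"})]))
print(len(scene_track.clips_at(15)))   # 2: the scene plus the nested lightning effect
```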
In this embodiment, a scenario file can be understood as a file generated by packaging the track data of each functional track during game development, and it is used to obtain the track data of each functional track while the game is running. The track data is data, authored by the developer according to the content of the scenario animation during development, from which the corresponding functional track can be restored.
Illustratively, the Unity game engine uses YAML as the data storage format when recording track data in scenario files; YAML is a storage format with high readability and strong interoperability with the scripting languages used in development. However, even with YAML, scenario files contain too many ID records and redundant, poorly readable data, which makes the scenario files large and in turn increases the size of the game package. At the start of development, taking only the actually required data as input and exploiting the fact that the multiple functional tracks coexist independently, the game scenario system filters and simplifies the file contents on top of YAML. Because the game scenario system can directly call the resource acquisition interfaces of the other systems in the game, the Unity game engine needs no game objects other than a camera in the Hierarchy editing panel; the large number of barely readable file IDs and GUID records is therefore removed from the new track data and replaced with simple, readable functional-track IDs and track-resource IDs. Simplifying the ID records reduces the coupling between data blocks, so even if merge conflicts arise when submitting with version management tools such as SVN, they can be resolved easily. Secondly, fields whose values equal the defaults of the data structure, such as the large number of records whose value is 0, are removed from the track data, which further reduces the file size of the scenario file and improves its readability and writability. The simplified track data is then serialized to generate data blocks in the YAML format, and the serialized data is stored in the corresponding data blocks strictly in editing order, i.e. in playing order, so that deserializing the data blocks yields track data from which the corresponding functional tracks can be restored. In this way, a more flexible way of reading and writing scenario files and a less redundant, more readable way of storing data are realized.
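The following is a minimal sketch of the simplification and serialization just described: fields equal to the data structure's defaults are dropped on write and restored on read, and records are written strictly in editing order. It uses PyYAML only as a stand-in for the engine-side YAML serializer, and the field names and defaults are illustrative assumptions.
```python
import yaml  # PyYAML, used here only as a stand-in for the engine-side YAML serializer

# Hypothetical per-field defaults; the real track data structures are not listed in the text.
DEFAULTS = {"offset": 0, "volume": 0, "loop": False}

def simplify(record: dict) -> dict:
    """Drop fields whose value equals the data structure's default (e.g. runs of 0)."""
    return {k: v for k, v in record.items() if DEFAULTS.get(k, object()) != v}

def serialize_tracks(tracks: list) -> str:
    """Write track records strictly in editing (= playing) order into one YAML data block."""
    return yaml.safe_dump([simplify(t) for t in tracks], sort_keys=False)

def deserialize_tracks(block: str) -> list:
    """Read a data block back into track records, restoring the dropped default values."""
    return [{**DEFAULTS, **t} for t in yaml.safe_load(block)]

block = serialize_tracks([
    {"track_id": "scene_track_1", "scene_id": "01", "offset": 0},
    {"track_id": "audio_track_1", "audio_id": "attack_sfx", "volume": 0, "loop": False},
])
print(block)                       # only non-default fields are written
print(deserialize_tracks(block))   # defaults come back on read
```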
In this embodiment, the data blocks may be stored in the scenario file in a binary format in addition to the YAML format. Illustratively, the YAML-format data blocks are packed and encrypted to generate binary-format data blocks, further compressing the scenario file and protecting the file contents from being easily cracked.
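The disclosure does not name a particular compression or encryption scheme, so the sketch below uses zlib compression plus a trivial XOR obfuscation purely as stand-ins to show the packing round trip; a production build would substitute a real cipher.
```python
import zlib

KEY = 0x5A  # illustrative single-byte key; a real build would use a proper cipher

def pack_block(yaml_text: str) -> bytes:
    """Compress the YAML data block, then lightly obfuscate it into a binary block."""
    compressed = zlib.compress(yaml_text.encode("utf-8"))
    return bytes(b ^ KEY for b in compressed)

def unpack_block(binary_block: bytes) -> str:
    """Reverse the obfuscation and the compression to recover the YAML data block."""
    compressed = bytes(b ^ KEY for b in binary_block)
    return zlib.decompress(compressed).decode("utf-8")

blob = pack_block("scene_track:\n- {track_id: scene_track_1, scene_id: '01'}\n")
assert unpack_block(blob).startswith("scene_track:")
```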
Illustratively, one scenario animation corresponds to one scenario file. When the player triggers a playing operation of the scenario animation, for example when the player fights a certain NPC and the fight between the player and the NPC is demonstrated through a scenario animation, the scenario file of that fight's scenario animation is obtained from the game scenario system, and the data blocks in the scenario file are deserialized to obtain the track data of the scene track, the lens track, the audio track and the character track of the fight's scenario animation.
S120, generating a scene track, a lens track, an audio track and a character track according to the track data.
In an embodiment, the track data of the scene track includes scene track IDs and scene IDs ordered based on the order of play. Fig. 2 is a flowchart illustrating generation of a scene track according to an embodiment of the present application. As shown in fig. 2, the step of generating a scene track specifically includes S1201-S1203:
S1201, a scene track is created based on the scene track ID.
S1202, calling a scene interface of the game system to acquire target scene data corresponding to the scene ID.
S1203, adding the target scene data on the time axis corresponding to the scene track according to the ordering of the scene IDs corresponding to the target scene data.
Illustratively, a scene track of the fight's scenario animation is created based on the scene track ID. The scene corresponding to one scene ID in the track data can be understood as the scene of one frame of the fight's scenario animation, so the frame length of the scene track can be determined from the number of scene IDs in the track data. In this embodiment, the game system can be understood as the systems in the game other than the game scenario system; the game scenario system shares resource interfaces with the game system, so it can call the scene interface of the game system to acquire the target scene data corresponding to a scene ID from the scene resources. Assuming the first N scene IDs in the track data are 01 and the last M scene IDs are 02, the scene interface is called to acquire scene A corresponding to 01 and scene B corresponding to 02 from the scene resources. Scene A is added on the time axis of the first N frames of the scene track and scene B on the time axis of the last M frames, generating the scene track of the fight's scenario animation. When the fight's scenario animation is played based on this scene track, the player and the NPC first fight in scene A and, as time passes, move to scene B to continue fighting, thereby realizing scene loading and switching.
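A minimal sketch of S1201-S1203 under the fight example above; get_scene is a hypothetical stand-in for the game system's scene interface, and consecutive identical scene IDs are collapsed into a single clip on the time axis.
```python
from dataclasses import dataclass

@dataclass
class SceneClip:
    start_frame: int
    end_frame: int
    scene_data: dict

def get_scene(scene_id: str) -> dict:
    """Hypothetical stand-in for the game system's scene interface."""
    return {"scene_id": scene_id, "name": {"01": "Scene A", "02": "Scene B"}[scene_id]}

def build_scene_track(track_id: str, ordered_scene_ids: list) -> dict:
    """S1201-S1203: create the track, fetch scene data per ID, place it in playing order.

    Consecutive identical scene IDs collapse into one clip, so N frames of scene 01
    followed by M frames of scene 02 become two clips, as in the fight example above.
    """
    clips = []
    for frame, scene_id in enumerate(ordered_scene_ids):
        if clips and clips[-1].scene_data["scene_id"] == scene_id:
            clips[-1].end_frame = frame                    # extend the running clip
        else:
            clips.append(SceneClip(frame, frame, get_scene(scene_id)))
    return {"track_id": track_id, "clips": clips}

scene_track = build_scene_track("scene_track_1", ["01"] * 3 + ["02"] * 2)
print(scene_track["clips"])   # two clips: scene A for frames 0-2, scene B for frames 3-4
```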
In this embodiment, the track data of the scene track further includes environmental special effect materials ordered by playing order. Illustratively, when the scene track is created, the environmental special effect sub-track nested in the scene track is created together with it, and the environmental special effect materials are added on the time axis corresponding to that sub-track according to their ordering. For example, if the environmental special effect material is a lightning-strike effect, lightning strikes appear in scene A and scene B when the fight's scenario animation is played based on the scene track, increasing the realism of the fight.
In an embodiment, the track data of the lens track includes lens track IDs and lens IDs ordered based on the playing order. Fig. 3 is a flowchart illustrating generation of a lens track according to an embodiment of the present application. As shown in fig. 3, the step of generating a lens track specifically includes S1204-S1206:
s1204, creating a lens track according to the lens track ID.
S1205, calling a lens control interface of the game system to obtain lens parameters corresponding to the lens ID.
S1206, adding the lens parameters on a time axis corresponding to the lens track according to the ordering of the lens IDs corresponding to the lens parameters.
Illustratively, a lens track of the fight's scenario animation is created based on the lens track ID. The lens parameters corresponding to one lens ID in the lens track can be understood as the shooting angle and image processing parameters of one frame of the fight's scenario animation, so the frame length of the lens track can be determined from the number of lens IDs in the track data. In this embodiment, several lenses with different shooting angles and image processing parameters are added in each scene; similarly, the game scenario system can call the lens control interface of the game system to acquire the shooting angle and image processing parameters corresponding to a lens ID from the lens resources, and these parameters are added on the time axis of the frame corresponding to that lens ID in the lens track. When the fight's scenario animation is played based on the lens track, the game scenario system can process the scene and characters of the same frame in the scene track according to the shooting angle and image processing parameters of the lens in the lens track, so as to present the scene and characters from various viewing angles; the scenario animation thus achieves film-and-television-level camera work and gives the player a more realistic and more engaging visual experience.
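A corresponding sketch for S1204-S1206; get_lens_params stands in for the lens control interface, and the yaw/pitch/fov/exposure fields are illustrative examples of a shooting angle and image processing parameters, not fields named in the disclosure.
```python
from dataclasses import dataclass

@dataclass
class LensClip:
    frame: int
    yaw: float        # shooting angle, illustrative fields only
    pitch: float
    fov: float
    exposure: float   # one example of an image processing parameter

def get_lens_params(lens_id: str) -> LensClip:
    """Hypothetical stand-in for the game system's lens control interface."""
    presets = {"wide": (45.0, -30.0, 70.0, 1.0), "close_up": (0.0, -10.0, 35.0, 1.2)}
    yaw, pitch, fov, exposure = presets[lens_id]
    return LensClip(0, yaw, pitch, fov, exposure)

def build_lens_track(track_id: str, ordered_lens_ids: list) -> dict:
    """S1204-S1206: create the lens track and add per-frame parameters in playing order."""
    clips = []
    for frame, lens_id in enumerate(ordered_lens_ids):
        clip = get_lens_params(lens_id)
        clip.frame = frame
        clips.append(clip)
    return {"track_id": track_id, "clips": clips}

lens_track = build_lens_track("lens_track_1", ["wide", "wide", "close_up"])
```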
In this embodiment, the track data of the lens track further includes lens special effect materials ordered by playing order. Illustratively, when the lens track is created, the lens special effect sub-track nested in the lens track is created together with it, and the lens special effect materials are added on the time axis corresponding to that sub-track according to their ordering.
In one embodiment, the track data of the audio track includes audio track IDs and audio materials ordered based on the playing order. After the audio track is created from the audio track ID, the audio materials are added on the time axis of the audio track according to their ordering. It should be noted that two or more audio materials may share the same playing position in the track data of the audio track, so that they are mixed together in the audio track. For example, an attack sound effect and a lightning-strike sound effect lie on the same point of the time axis in the audio track, so when the fight's scenario animation is played based on the audio track, the attack sound and the lightning-strike sound occur simultaneously, faithfully reproducing the fight. In addition, the audio track supports advanced functions such as 3D sound-field distance and sound-effect group control, creating rich, varied and impactful sound-field expressiveness.
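The sketch below illustrates two audio materials sharing the same span of the time axis and a toy linear distance falloff as a stand-in for the 3D sound-field support mentioned above; the clip names and the falloff curve are assumptions.
```python
from dataclasses import dataclass

@dataclass
class AudioClip:
    start_frame: int
    end_frame: int
    material: str        # e.g. "attack_sfx", "lightning_sfx"
    volume: float = 1.0

def clips_playing_at(clips: list, frame: int) -> list:
    """Two or more materials that share the same span are simply mixed at playback."""
    return [c for c in clips if c.start_frame <= frame <= c.end_frame]

def attenuate(volume: float, distance: float, max_distance: float = 30.0) -> float:
    """Toy linear 3D sound-field falloff; the engine's real curve is not specified here."""
    return volume * max(0.0, 1.0 - distance / max_distance)

audio_track = [
    AudioClip(10, 20, "attack_sfx"),
    AudioClip(10, 25, "lightning_sfx"),   # overlaps the attack sound on the time axis
]
print([c.material for c in clips_playing_at(audio_track, 15)])  # both, mixed together
print(attenuate(1.0, distance=12.0))                            # quieter with distance
```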
In one embodiment, the track data of the character track includes character IDs, character track IDs, and action materials ordered based on the play order. Fig. 4 is a flowchart illustrating generation of a character track according to an embodiment of the present application. As shown in fig. 4, the step of generating a character track specifically includes S1207-S1210:
S1207, creating a corresponding character track based on the character track ID, with an action sub-track and an outfit-change sub-track nested in the character track.
S1208, calling a character interface of the game system to acquire target character data and wearing data corresponding to the character ID.
S1209, adding the target character data on the time axis corresponding to the character track, and adding the wearing data on the time axis corresponding to the outfit-change sub-track.
S1210, adding the action materials on a time axis corresponding to the action sub-track according to the ordering of the action materials.
Illustratively, the characters of the fight's scenario animation include the player and an NPC; the player's character track is created based on the player's character track ID, and the NPC's character track is created based on the NPC's character track ID. When the player's character track is created, the action sub-track and the outfit-change sub-track nested in it are created together with it. The game scenario system calls the character interface of the game system to acquire, from the character resources, the player data and wearing data corresponding to the player ID; the player data includes the player model data, and the wearing data includes data of the player's current equipment, mounts, pets and the like. The player model data is added on the time axis of each frame of the player's character track, and the player's equipment, mount and pet data is added on the outfit-change sub-track. The action materials comprise limb action data, skill data and scene position data, where the scene position data is the character's position within the scene. According to the ordering of the player's action materials, the player's limb action data, skill data and scene position data are added to the player's action sub-track.
When the NPC's character track is created, the action sub-track and the outfit-change sub-track nested in it are created together with it. The game scenario system calls the character interface of the game system to acquire, from the character resources, the NPC data and wearing data corresponding to the NPC's ID; the NPC data includes the NPC model data, and the wearing data includes data of the NPC's current equipment, mounts, pets and the like. The NPC model data is added on the time axis of each frame of the NPC's character track, and the NPC's equipment, mount and pet data is added on the outfit-change sub-track. According to the ordering of the NPC's action materials, the NPC's limb action data, skill data and scene position data are added to the NPC's action sub-track.
When the fight's scenario animation is played based on the NPC's character track and the player's character track, the NPC, wearing the corresponding equipment, releases skills with the corresponding limb motions at one position in the scene, while the player, wearing the corresponding equipment, releases skills with the corresponding limb motions at another position; over time the NPC and the player change limb motions and release other skills, thereby demonstrating the fight process.
It should be noted that two or more action materials in the track data of the character track may share the same playing position, so that they are mixed together in the action sub-track. For example, a jump action material and a fencing action material partially overlap in the action sub-track, so the character begins the fencing action during the second half of the jump; different actions of the character thus transition naturally, avoiding an abrupt jump between animations.
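A sketch of a character track with its nested sub-tracks and of the overlap blending just described: during the overlap of two action clips, a crossfade weight is computed so the second action ramps in while the first ramps out. All structure and names here are illustrative assumptions.
```python
from dataclasses import dataclass

@dataclass
class ActionClip:
    start_frame: int
    end_frame: int
    material: str     # limb action, skill or scene position data

def blend_weights(first: ActionClip, second: ActionClip, frame: int):
    """Crossfade two overlapping actions (e.g. a jump fading into a fencing swing) so
    the character transitions without an abrupt jump between animations."""
    overlap_start, overlap_end = second.start_frame, first.end_frame
    if overlap_end <= overlap_start or not (overlap_start <= frame <= overlap_end):
        return (1.0, 0.0) if frame <= overlap_start else (0.0, 1.0)
    t = (frame - overlap_start) / (overlap_end - overlap_start)
    return 1.0 - t, t

character_track = {
    "track_id": "character_player",
    "model": "player_model",                                   # target character data
    "outfit_change_sub_track": ["equipment", "mount", "pet"],  # wearing data
    "action_sub_track": [ActionClip(0, 10, "jump"), ActionClip(6, 18, "fencing")],
}
jump, fencing = character_track["action_sub_track"]
print(blend_weights(jump, fencing, 8))   # mid-overlap: both actions contribute
```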
In this embodiment, the track data of the character track further includes skill special effect materials ordered by playing order. Correspondingly, when the character track is created, the skill special effect sub-track nested in it is created together with it, and the skill special effect materials are added on the time axis corresponding to that sub-track according to their ordering. For example, if the NPC's skill special effect material is a water-dragon effect, the water dragon appears when the NPC releases the corresponding skill while the fight's scenario animation is played based on the NPC's character track, enhancing the visual effect of the fight.
In one embodiment, in order to let the player actually participate in the scenario animation and become an element that drives the story, a human-machine interaction design, the so-called Quick Time Event (QTE), is often required. When the cutscene advances to certain specific moments, the QTE mechanism requires the player to make an appropriate response, which strengthens the player's sense of achievement after reaching the goal. The game scenario system provides an interface that allows a high degree of freedom in controlling the time axis, enabling this kind of non-linear narrative. For the human-machine interaction design, an interface track is added to the game scenario system so that the interface system used uniformly by the whole game can be reused, with the same presentation and operation logic. During playback of the scenario animation, interactive interfaces such as dialog boxes, prompt icons and operation buttons are added through the interface track, enhancing the player's interactive experience.
In this embodiment, the track data further includes track data of the interface track including interface track IDs and interface IDs ordered based on the play order. Fig. 5 is a flowchart illustrating the generation of an interface track according to an embodiment of the present application. As shown in fig. 5, the step of generating the interface track specifically includes S1211-S1213:
s1211, creating an interface track based on the interface track ID.
S1212, calling an interface of the game system to obtain target interface data corresponding to the interface ID.
S1213, adding the target interface data on a time axis corresponding to the interface track according to the sequence of the interface IDs corresponding to the target interface data.
Illustratively, an interface track of the fight's scenario animation is created based on the interface track ID. The interface corresponding to one interface ID in the track data can be understood as the interactive interface of one frame of the fight's scenario animation, so the frame length of the interface track can be determined from the number of interface IDs in the track data. It should be noted that during some periods of the scenario animation no interactive interface is displayed, and the interface ID of the frames in such a period is 0. The game scenario system calls the interface of the game system to acquire the target interface data corresponding to an interface ID from the interface resources, and the target interface data is added on the time axis of the frame corresponding to that interface ID in the interface track. When the fight's scenario animation is played based on the interface track, the game scenario system can display the interactive interface so that the player can input interactive operations through it.
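A minimal sketch of S1211-S1213; get_interface stands in for the unified interface system's lookup, and an interface ID of 0 yields no interactive interface for that frame, as described above.
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InterfaceClip:
    frame: int
    interface_data: Optional[dict]   # None when the interface ID is 0 (nothing shown)

def get_interface(interface_id: str) -> Optional[dict]:
    """Hypothetical stand-in for the unified game interface system's lookup call."""
    if interface_id == "0":
        return None                  # no interactive interface during this period
    return {"interface_id": interface_id, "widgets": ["dialog_box", "confirm_button"]}

def build_interface_track(track_id: str, ordered_interface_ids: list) -> dict:
    """S1211-S1213: create the interface track and add UI data in playing order."""
    clips = [InterfaceClip(f, get_interface(i)) for f, i in enumerate(ordered_interface_ids)]
    return {"track_id": track_id, "clips": clips}

ui_track = build_interface_track("interface_track_1", ["0", "0", "qte_prompt", "0"])
```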
Furthermore, the human-machine interaction design can define listeners and operation feedback for the player's input operations. Input operations generally include common interface interactions such as clicking, long-pressing and dragging; in some special application scenarios they also include operations involving hardware, such as shaking the device or turning on the camera to shoot, as well as data input that does not come from the player, such as the server issuing a specific protocol. The game scenario system can define different event listeners for the various interactive operations, monitor the interactive operations input by the player through these listeners, and control the rhythm and direction of the scenario animation accordingly.
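The event-listener idea can be sketched as follows: listeners are registered per interactive operation, and their handlers pause, resume or reposition the time axis, which is how a QTE can gate or branch the cutscene. The ScenarioTimeline class and the event names are assumptions made for illustration.
```python
from typing import Callable, Dict, List

class ScenarioTimeline:
    """Tiny stand-in for the time-axis controller exposed by the game scenario system."""
    def __init__(self) -> None:
        self.frame = 0
        self.paused = False
        self.listeners: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        """Register an event listener for one kind of interactive operation."""
        self.listeners.setdefault(event, []).append(handler)

    def emit(self, event: str, payload: dict) -> None:
        for handler in self.listeners.get(event, []):
            handler(payload)

timeline = ScenarioTimeline()
timeline.paused = True                                               # QTE: wait for input
timeline.on("click", lambda p: setattr(timeline, "paused", False))   # resume on success
timeline.on("timeout", lambda p: setattr(timeline, "frame", 120))    # or branch the story
timeline.emit("click", {"target": "qte_prompt"})
print(timeline.paused)   # False: the cutscene continues after the player's response
```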
S130, playing the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio materials in the audio track, and the time axis and character data in the character track.
For example, the scene data, lens parameters and character data of the same frame in the scene track, the lens track and the character track are rendered into an image frame of the scenario animation, so that the scene and characters under the corresponding lens are displayed in that frame. Based on the frame order of the scene track, the lens track and the character track, the image frames and the audio materials in the audio track are played frame by frame, thereby realizing playback of the scenario animation.
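A simplified playback loop consistent with S130: for each frame, the data contributed by every functional track at that frame is composed and handed to hypothetical rendering and audio-mixing calls, and the shared time axis then advances to the next frame.
```python
def render_frame(scene, characters, lens):
    """Stand-in for the engine's rendering of one image frame."""
    print(f"scene={scene}, characters={characters}, lens={lens}")

def mix_audio(clips):
    """Stand-in for the engine's audio mixer."""
    if clips:
        print(f"  mixing: {clips}")

def play_scenario(scene_track, lens_track, audio_track, character_tracks, frame_count):
    """Compose the per-frame data of every functional track, then advance the time axis."""
    for frame in range(frame_count):
        scene = scene_track.get(frame)                    # scene data for this frame
        lens = lens_track.get(frame)                      # shooting angle + image params
        characters = [t.get(frame) for t in character_tracks]
        render_frame(scene, characters, lens)             # scene + characters under this lens
        mix_audio(audio_track.get(frame, []))             # every clip overlapping this frame

play_scenario(
    scene_track={0: "Scene A", 1: "Scene A", 2: "Scene B"},
    lens_track={0: "wide", 1: "wide", 2: "close_up"},
    audio_track={1: ["attack_sfx", "lightning_sfx"]},
    character_tracks=[{0: "player@(1,0)", 1: "player@(2,0)", 2: "player@(2,1)"},
                      {0: "npc@(5,0)", 1: "npc@(5,0)", 2: "npc@(4,0)"}],
    frame_count=3,
)
```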
In this embodiment, when the scenario animation reaches a certain scene in the scene track, the game scenario system handles the corresponding scene through preloading, loading, removing and releasing operations. The preloading operation instantiates a specified game scene in advance but does not display it, allowing a natural transition between scenes; it is an asynchronous process and does not cause noticeable stuttering even when the scene is large. The loading operation displays a scene that has already been preloaded; if no preloading was performed beforehand, the scene is loaded synchronously, and if preloading has not yet finished, the advance of the time axis is suspended until it completes, after which the scene is displayed immediately. The removing operation hides a specified scene that is being displayed, allowing the same scene resource to be reused in different time periods. The releasing operation immediately destroys the instantiated scene and interrupts any preloading of the specified scene; to prevent memory leaks, the releasing operation is performed automatically after every functional track has finished playing.
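The four scene operations can be sketched as a small lifecycle manager; the synchronous/asynchronous distinction and the automatic release at the end of playback are only indicated in comments, since the engine-side mechanics are not specified here.
```python
class SceneLifecycle:
    """Sketch of the preloading / loading / removing / releasing operations above."""
    def __init__(self) -> None:
        self.preloaded, self.shown = set(), set()

    def preload(self, scene_id: str) -> None:
        # Asynchronous in the engine: instantiates the scene without displaying it.
        self.preloaded.add(scene_id)

    def load(self, scene_id: str) -> None:
        # If preloading has not finished, the time axis waits here before displaying.
        if scene_id not in self.preloaded:
            self.preload(scene_id)          # falls back to a synchronous load
        self.shown.add(scene_id)

    def remove(self, scene_id: str) -> None:
        # Hide but keep the instance so the same scene resource can be reused later.
        self.shown.discard(scene_id)

    def release(self, scene_id: str) -> None:
        # Destroy the instance; run automatically once every functional track ends.
        self.preloaded.discard(scene_id)
        self.shown.discard(scene_id)

scenes = SceneLifecycle()
scenes.preload("scene_B")     # transition into scene B without a visible hitch
scenes.load("scene_B")
scenes.remove("scene_A")      # scene A can be shown again later without reloading
```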
In summary, in the game scenario animation playing method provided by the embodiment of the present application, the track data of the scene track, the lens track, the audio track and the character track is obtained by deserializing the data blocks stored in the scenario file, the data blocks having been obtained by serializing the track data; the scene track, the lens track, the audio track and the character track are generated according to the track data; and the scenario animation of the game is played according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track. With this technical means, playback of the scenario animation is driven at runtime by the time axes of the functional tracks corresponding to the scene, the lens, the sound effects and the characters, and during development the developer only prepares the scene, lens, sound-effect and character data of the scenario animation instead of producing a video file of it. This solves the problems in the prior art of the high cost and technical requirements of producing scenario animations, improves the development efficiency of game resources, and, because the scenario file is stored in a serialized format, reduces the size of the scenario file in the game package.
On the basis of the above embodiments, fig. 6 is a schematic structural diagram of a game scenario animation playing device according to an embodiment of the present application. Referring to fig. 6, the game scenario animation playing device provided in this embodiment specifically includes: a data acquisition module 21, a track generation module 22 and a scenario play module 23.
a data acquisition module 21 configured to perform deserialization processing on the data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data;
a track generation module 22 configured to generate the scene track, the lens track, the audio track and the character track from the track data;
and a scenario playing module 23 configured to play the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
On the basis of the above embodiment, the data blocks are stored in the scenario file in the YAML format or the binary format.
On the basis of the above embodiment, the track data of the scene track includes scene track IDs and scene IDs ordered based on the playing order; accordingly, the track generation module 22 includes: a scene track creation unit configured to create a scene track based on the scene track ID; a scene data acquisition unit configured to call a scene interface of the game system to acquire target scene data corresponding to the scene ID; and a scene data adding unit configured to add the target scene data on the time axis corresponding to the scene track according to the ordering of the scene IDs corresponding to the target scene data.
On the basis of the above embodiment, the track data of the lens track includes lens track IDs and lens IDs ordered based on the playing order; accordingly, the track generation module 22 includes: a lens track creation unit configured to create a lens track from the lens track ID; a lens parameter acquisition unit configured to call a lens control interface of the game system to acquire lens parameters corresponding to the lens ID; and a lens parameter adding unit configured to add the lens parameters on the time axis corresponding to the lens track according to the ordering of the lens IDs corresponding to the lens parameters.
On the basis of the above embodiment, the track data of the character track includes character IDs, character track IDs and action materials ordered based on the playing order; accordingly, the track generation module 22 includes: a character track creation unit configured to create a corresponding character track based on the character track ID, with an action sub-track and an outfit-change sub-track nested in the character track; a character data acquisition unit configured to call a character interface of the game system to acquire target character data and wearing data corresponding to the character ID; a character data adding unit configured to add the target character data on the time axis corresponding to the character track and to add the wearing data on the time axis corresponding to the outfit-change sub-track; and an action material adding unit configured to add the action materials on the time axis corresponding to the action sub-track according to the ordering of the action materials.
On the basis of the above embodiment, the track data of the character track further includes skill special effect materials ordered based on the playing order, and a skill special effect sub-track is nested in the character track; correspondingly, a skill special effect material adding unit is configured to add the skill special effect materials on the time axis corresponding to the skill special effect sub-track according to the ordering of the skill special effect materials.
On the basis of the above embodiment, the track data further includes track data of an interface track, which includes interface track IDs and interface IDs ordered based on the playing order; correspondingly, the track generation module further includes: an interface track creation unit configured to create an interface track based on the interface track ID; an interface data acquisition unit configured to call an interface of the game system to acquire target interface data corresponding to the interface ID; and an interface data adding unit configured to add the target interface data on the time axis corresponding to the interface track according to the ordering of the interface IDs corresponding to the target interface data.
As described above, the game scenario animation playing device provided by the embodiment of the present application obtains the track data of the scene track, the lens track, the audio track and the character track by deserializing the data blocks stored in the scenario file, the data blocks having been obtained by serializing the track data; generates the scene track, the lens track, the audio track and the character track according to the track data; and plays the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track. With this technical means, playback of the scenario animation is driven at runtime by the time axes of the functional tracks corresponding to the scene, the lens, the sound effects and the characters, and during development the developer only prepares the scene, lens, sound-effect and character data of the scenario animation instead of producing a video file of it. This solves the problems in the prior art of the high cost and technical requirements of producing scenario animations, improves the development efficiency of game resources, and, because the scenario file is stored in a serialized format, reduces the size of the scenario file in the game package.
The game scenario animation playing device provided by the embodiment of the application can be used for executing the game scenario animation playing method provided by the embodiment, and has corresponding functions and beneficial effects.
Fig. 7 is a schematic structural diagram of a game scenario animation playing apparatus provided in an embodiment of the present application, and referring to fig. 7, the game scenario animation playing apparatus includes: a processor 31, a memory 32, a communication device 33, an input device 34 and an output device 35. The number of processors 31 in the game scenario animation playing apparatus may be one or more, and the number of memories 32 in the game scenario animation playing apparatus may be one or more. The processor 31, memory 32, communication means 33, input means 34 and output means 35 of the game scenario animation playing apparatus may be connected by a bus or other means.
The memory 32, as a computer-readable storage medium, may be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the game scenario animation playing method according to any embodiment of the present application (for example, the data acquisition module 21, the track generation module 22 and the scenario playing module 23 in the game scenario animation playing device). The memory 32 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and at least one application program required for functions, and the data storage area may store data created according to the use of the device, etc. In addition, the memory 32 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some examples, the memory may further include memory remotely located with respect to the processor, and the remote memory may be connected to the device through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The communication device 33 is used for data transmission.
The processor 31 executes various functional applications of the apparatus and data processing by executing software programs, instructions and modules stored in the memory 32, that is, implements the above-described game scenario animation playing method.
The input means 34 may be used to receive entered numeric or character information and to generate key signal inputs related to user settings and function control of the device. The output means 35 may comprise a display device such as a display screen.
The game scenario animation playing device provided by the embodiment can be used for executing the game scenario animation playing method provided by the embodiment, and has corresponding functions and beneficial effects.
The embodiments of the present application also provide a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform a game scenario animation playing method, the method comprising: performing deserialization processing on the data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data; generating the scene track, the lens track, the audio track and the character track according to the track data; and playing the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROMs, floppy disks or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g. a hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or in a second, different computer system connected to the first computer system through a network such as the Internet; the second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media residing in different locations (e.g. in different computer systems connected by a network). The storage medium may store program instructions (e.g. embodied as a computer program) executable by one or more processors.
Of course, the computer-executable instructions contained in the storage medium provided by the embodiments of the present application are not limited to the game scenario animation playing method operations described above, and may also perform related operations in the game scenario animation playing method provided by any embodiment of the present application.
The game scenario animation playing device, the storage medium and the game scenario animation playing equipment provided in the above embodiments may execute the game scenario animation playing method provided by any embodiment of the present application; for technical details not described in detail in the above embodiments, reference may be made to the game scenario animation playing method provided by any embodiment of the present application.
The foregoing description is only of the preferred embodiments of the application and the technical principles employed. The present application is not limited to the specific embodiments described herein, but is capable of numerous modifications, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit of the application, the scope of which is set forth in the following claims.

Claims (10)

1. A method for playing a game scenario animation, comprising:
performing deserialization processing on data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data;
generating the scene track, the lens track, the audio track and the character track according to the track data;
and playing the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
2. The game scenario animation playing method according to claim 1, wherein the data block is stored in the scenario file in a YAML format or a binary format.
3. The game scenario animation playing method according to claim 1, wherein the track data of the scene track includes scene track IDs and scene IDs ordered based on a playing order;
correspondingly, the generating of the scene track, the lens track, the audio track and the character track according to the track data comprises:
creating a scene track based on the scene track ID;
invoking a scene interface of a game system to acquire target scene data corresponding to the scene ID;
and adding the target scene data on a time axis corresponding to the scene track according to the sequence of the scene IDs corresponding to the target scene data.
4. The game scenario animation playing method according to claim 3, wherein the track data of the lens track includes lens track IDs and lens IDs ordered based on a playing order;
correspondingly, the generating of the scene track, the lens track, the audio track and the character track according to the track data comprises the following steps:
creating a lens track according to the lens track ID;
invoking a lens control interface of a game system to acquire lens parameters corresponding to the lens ID;
and adding the lens parameters on a time axis corresponding to the lens track according to the ordering of the lens IDs corresponding to the lens parameters.
5. A game scenario animation playing method according to claim 3, wherein the track data of the character track includes character IDs, character track IDs, and action materials ordered based on a playing order;
correspondingly, the generating of the scene track, the lens track, the audio track and the character track according to the track data comprises the following steps:
creating a corresponding character track based on the character track ID, with an action sub-track and an outfit-change sub-track nested in the character track;
calling a character interface of the game system to acquire target character data and wearing data corresponding to the character ID;
adding the target character data on a time axis corresponding to the character track, and adding the wearing data on a time axis corresponding to the outfit-change sub-track;
and adding the action materials on a time axis corresponding to the action sub-track according to the ordering of the action materials.
6. The game scenario animation playing method according to claim 5, wherein the track data of the character track further comprises skill special effect materials ordered based on the playing order, and a skill special effect sub-track is nested in the character track;
correspondingly, after the corresponding role track is created based on the role track ID, the method further comprises:
and adding the skill special effect materials on a time axis corresponding to the skill special effect sub-track according to the ordering of the skill special effect materials.
7. The game scenario animation playing method according to claim 1, wherein the track data further comprises track data of an interface track, the track data of the interface track comprising interface track IDs and interface IDs ordered based on a playing order; correspondingly, the method further comprises the steps of:
creating an interface track based on the interface track ID;
calling an interface of a game system to acquire target interface data corresponding to the interface ID;
and adding the target interface data on a time axis corresponding to the interface track according to the sequence of the interface IDs corresponding to the target interface data.
8. A game scenario animation playing apparatus, comprising:
a data acquisition module configured to perform deserialization processing on data blocks stored in a scenario file to obtain track data of a scene track, a lens track, an audio track and a character track, the data blocks having been obtained by serializing the track data;
a track generation module configured to generate the scene track, the lens track, the audio track and the character track from the track data;
and a scenario playing module configured to play the scenario animation of the game according to the time axis and scene data in the scene track, the time axis and lens parameters in the lens track, the time axis and audio data in the audio track, and the time axis and character data in the character track.
9. A game scenario animation playing apparatus, comprising:
one or more processors;
a memory storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the game scenario animation playing method of any one of claims 1-7.
10. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the game scenario animation playing method of any one of claims 1-7.
CN202311109391.6A 2023-08-30 2023-08-30 Game scenario animation playing method, device, equipment and storage medium Pending CN117152312A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311109391.6A CN117152312A (en) 2023-08-30 2023-08-30 Game scenario animation playing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311109391.6A CN117152312A (en) 2023-08-30 2023-08-30 Game scenario animation playing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117152312A 2023-12-01

Family

ID=88903899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311109391.6A Pending CN117152312A (en) 2023-08-30 2023-08-30 Game scenario animation playing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117152312A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117618921A (en) * 2023-12-13 2024-03-01 广州库洛科技有限公司 User-defined game development method and system based on big data
CN117618921B (en) * 2023-12-13 2024-05-03 广州库洛科技有限公司 User-defined game development method and system based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination