CN108965989B - Processing method and device for interactive application scene and storage medium - Google Patents


Info

Publication number
CN108965989B
CN108965989B (application CN201810770518.1A)
Authority
CN
China
Prior art keywords
scene
state
client
frame
execution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810770518.1A
Other languages
Chinese (zh)
Other versions
CN108965989A (en)
Inventor
林瑞柠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810770518.1A priority Critical patent/CN108965989B/en
Publication of CN108965989A publication Critical patent/CN108965989A/en
Application granted granted Critical
Publication of CN108965989B publication Critical patent/CN108965989B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334Recording operations
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70Game security or game management aspects
    • A63F13/79Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70Game security or game management aspects
    • A63F13/79Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/798Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55Details of game data or player data management
    • A63F2300/5546Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5553Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55Details of game data or player data management
    • A63F2300/5546Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/558Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history by assessing the players' skills or ranking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a processing method and a processing device for an interactive application scene and a storage medium, which are used for visual playback of an interaction process and enable a user to understand the whole interaction process in detail. A processing method of an interactive application scene comprises the following steps: the client sends interaction request information to the server, wherein the interaction request information comprises a control strategy for a simulated object executed in the interactive application scene, the simulated object being controlled by the client; the client receives an interaction result sent by the server according to the control strategy, wherein the interaction result comprises recorded data of the simulated object executing in the interactive application scene; the client extracts scene states generated in a plurality of logical frames and the corresponding object states from the recorded data; and the client generates a scene restoration video according to the scene states and object states generated in the plurality of logical frames and plays the scene restoration video, wherein the scene restoration video is used for playing back the execution process of the simulated object in the interactive application scene.

Description

Processing method and device for interactive application scene and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and an apparatus for processing an interactive application scenario, and a storage medium.
Background
A strategy game provides an environment in which players take the initiative to think through problems and handle complex situations. While playing, a player must ensure that the objects under their control reach the goals specified by the game, and must devise the best way to accomplish those goals within the limits the game allows.
In a strategy game, players are allowed to freely control, manage and use the people or things in the game, and achieve the goals the game sets through means and methods of their own devising while battling enemies. Owing to these characteristics, the large number of repeated units, game rules and models included in a strategy game occupies a large amount of system resources.
After a battle in a strategy game ends, the system sends a battle report mail to both sides, which contains the information of the battle; a player learns the details of the battle by reading it. The report mail displays the course and result of the battle in the form of text or lists, and can thus only reflect the outcome of the battle. In the prior art, because the battle result is presented to the user only through such report mails, the user cannot gain a deep understanding of the details of the battle, and the randomness and tactics involved in the battle are not well conveyed.
Disclosure of Invention
The embodiment of the invention provides a processing method and device for an interactive application scene and a storage medium, which are used for visual playback of an interaction process and enable a user to understand the whole interaction process in detail.
The embodiment of the invention provides the following technical scheme:
in one aspect, an embodiment of the present invention provides a method for processing an interactive application scenario, including:
the client sends interactive request information to the server, wherein the interactive request information comprises: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
the client receives an interaction result sent by the server according to the control strategy, wherein the interaction result comprises: recording data of the simulated object when executed in the interactive application scene;
the client extracts scene states generated in a plurality of logical frames and corresponding object states from the recorded data;
and the client generates a scene restoration video according to the scene state and the object state generated in the plurality of logical frames, and plays the scene restoration video, wherein the scene restoration video is used for playing back the execution process of the simulated object in the interactive application scene.
On the other hand, an embodiment of the present invention further provides a method for processing an interactive application scenario, including:
the method comprises the following steps that a server receives interactive request information sent by a client, wherein the interactive request information comprises: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
the server carries out interactive calculation according to the control strategy, records scene states of the interactive application scene generated in a plurality of logical frames, and records the object state corresponding to each logical frame when the simulated object is executed in the interactive application scene;
the server generates recorded data of the simulation object when the simulation object is executed in the interactive application scene according to the scene state generated in the plurality of logical frames and the corresponding object state;
the server sends an interaction result to the client, wherein the interaction result comprises: the recorded data.
In another aspect, an embodiment of the present invention further provides a client, including:
a sending module, configured to send interaction request information to a server, where the interaction request information includes: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
a receiving module, configured to receive an interaction result sent by the server according to the control policy, where the interaction result includes: recording data of the simulated object when executed in the interactive application scene;
the state extraction module is used for extracting scene states generated in a plurality of logical frames and corresponding object states from the recorded data;
and the video restoration module is used for generating a scene restoration video according to the scene state and the object state generated in the plurality of logical frames and playing the scene restoration video, wherein the scene restoration video is used for playing back the execution process of the simulation object in the interactive application scene.
On the other hand, an embodiment of the present invention further provides a server, including:
a receiving module, configured to receive interactive request information sent by a client, where the interactive request information includes: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
the state acquisition module is used for carrying out interactive calculation according to the control strategy, recording scene states of the interactive application scene generated in a plurality of logical frames, and recording the object state corresponding to each logical frame when the simulated object is executed in the interactive application scene;
the data generation module is used for generating recording data when the simulation object is executed in the interactive application scene according to the scene state generated in the plurality of logical frames and the corresponding object state;
a sending module, configured to send an interaction result to the client, where the interaction result includes: the recorded data.
In another aspect, an embodiment of the present invention provides a client, where the client includes: a processor, a memory; the memory is used for storing instructions; the processor is configured to execute the instructions in the memory to cause the client to perform the method of any one of the preceding aspects.
In another aspect, an embodiment of the present invention provides a server, where the server includes: a processor, a memory; the memory is used for storing instructions; the processor is configured to execute the instructions in the memory to cause the server to perform a method as in any one of the preceding aspects.
In another aspect, the present invention provides a computer-readable storage medium, which stores instructions that, when executed on a computer, cause the computer to perform the method of the above aspects.
In the embodiment of the invention, the client sends interaction request information to the server, where the interaction request information comprises a control strategy for a simulated object executed in an interactive application scene. The interaction result that the server sends to the client comprises recorded data, from which the client can extract the scene states and object states generated in a plurality of logical frames and thus generate a scene restoration video. When the client plays the scene restoration video, the user can follow the execution process of the simulated object in the interactive application scene through the video.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of a system application architecture of a processing method for an interactive application scenario according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a processing method for an interactive application scenario according to an embodiment of the present invention;
fig. 3 is a schematic flowchart illustrating a processing method of an interactive application scenario according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating an interaction flow between a client and a server according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an implementation scenario of a combat framework in a strategy game scenario according to an embodiment of the present invention;
fig. 6 is a schematic diagram of recording logical frames during a combat process according to an embodiment of the present invention;
FIG. 7 is a schematic view of a recovery scenario of a combat frame in a strategy game scenario according to an embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a process of parsing the report data by the client according to the embodiment of the present invention;
FIG. 9 is a schematic process diagram of a client restoring a policy game scenario according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a client according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a server according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a terminal to which the processing method for an interactive application scene according to the embodiment of the present invention is applied;
fig. 13 is a schematic structural diagram of a server to which the processing method for an interactive application scenario provided in the embodiment of the present invention is applied.
Detailed Description
The embodiment of the invention provides a processing method and device of an interactive application scene and a storage medium, which are used for visual playback of an interactive process and enable a user to know the whole interactive process in detail.
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments that can be derived by one skilled in the art from the embodiments given herein are intended to be within the scope of the invention.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Please refer to fig. 1, which illustrates the system architecture applied by the processing method of an interactive application scene provided in the embodiment of the present application. The system may include: a server 110 and a client 120, where the server 110 can provide video playback information to the client 120, the video playback information comprising recorded data of the simulated object executing in the interactive application scene. Data is transferred between the client 120 and the server 110 via a communication network. The client 120 may specifically run on a terminal as shown in fig. 1; for example, the client 120 may be a game client. The terminal may be a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, a desktop computer, or the like.
In the embodiment of the present invention, the video playback information sent by the server 110 to the client 120 includes the recorded data. The terminal 120 may acquire the video playback information from the server 110 through the communication network to obtain the recorded data, and the client 120 may extract the scene states and object states generated in a plurality of logical frames from the recorded data to generate a scene restoration video. When the client 120 plays the scene restoration video, the user can follow the execution process of the simulated object in the interactive application scene through the video.
The following description is made in detail from the perspective of the client and the server, respectively. The embodiment of the processing method of the interactive application scene is particularly applicable to scenes where the interaction process is reported to the user in real time. The interactive application scene in the embodiment of the invention may be a game scene or an interactive scene of an application program. For example, the processing method of the interactive application scene provided by the embodiment of the present invention may be applied to a scene built for a game role, and may also be applied to a scene built for a user object in a software application system. A simulated object is displayed in the interactive application scene; the simulated object may be a game character in a game scene, such as a hero or a soldier, and may for example be a person or thing controlled by a user in a strategy game, which is not limited herein.
First, a description is made from a client side, please refer to fig. 2, where the method for processing an interactive application scenario according to an embodiment of the present invention includes the following steps:
201. The client sends interaction request information to the server, wherein the interaction request information comprises: a control strategy for a simulated object when executed in the interactive application scene, the simulated object being controlled by the client to execute.
In the embodiment of the invention, the simulated object is controlled and executed by the client. For example, the interactive application scene may be a strategy game, the simulated object may be a character in the game (such as a hero or a soldier), and the user may issue a control strategy for the simulated object through the client. For example, the interactive application scene may be a pre-battle strategy type game scene: before a battle between two armies in the game, the player issues a control strategy through the client, for example setting the formation of the several heroes in one team and adjusting the number of soldiers under each hero.
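As a purely hypothetical illustration of such interaction request information (the field names, the `HeroSetup`/`ControlStrategy` structures, and the JSON layout are assumptions, not taken from the patent), a control strategy covering a formation and per-hero soldier counts might be serialized as follows:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class HeroSetup:
    hero_id: int
    position: int        # slot in the team's formation (assumed)
    soldier_count: int   # number of soldiers under this hero, adjustable by the player

@dataclass
class ControlStrategy:
    player_id: int
    formation: list      # ordered list of HeroSetup entries

def build_request(strategy: ControlStrategy) -> str:
    """Serialize the control strategy into interaction request information (hypothetical format)."""
    return json.dumps({"type": "interaction_request", "strategy": asdict(strategy)})

req = build_request(ControlStrategy(
    player_id=1,
    formation=[HeroSetup(hero_id=101, position=0, soldier_count=500),
               HeroSetup(hero_id=102, position=1, soldier_count=300)],
))
```

The client would send a payload of this shape to the server; the server only needs the strategy, not the client's rendering state, which keeps the request small.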
202. The client receives an interaction result sent by the server according to the control strategy, wherein the interaction result comprises: recording data when the simulated object executes in the interactive application scene.
In the embodiment of the invention, the simulated object is controlled by the client to execute. For example, the interactive application scene may be a strategy game and the simulated object a character in the game; when the simulated object executes in the interactive application scene, the server stores the recorded data of the simulated object executing in that scene, and then sends the interaction result to the client. For example, the server calculates the whole battle process according to the players' various control strategies (such as control data for heroes, soldier counts and skills), records the states generated along the way, such as the scene state and the object state, generates the battle result and a detailed report, and then sends the interaction result, which includes the recorded data, to the client.
In some embodiments of the present invention, the server may detect in real time whether the scene interaction is completed, and when it is, the server may generate the result data produced by the simulated object executing in the interactive application scene; for example, the result data may be a report mail in a strategy game. Taking a strategy game scene as an example, after a battle occurs the server sends a battle report mail to the clients of both sides. The mail contains information about the battle, typically including kills, the player's own losses, resources plundered, and the like. The player can learn the details of the battle by reading it.
In some embodiments of the present invention, the interaction result sent by the server includes not only the above-mentioned report mail: the server can also send the recorded data of the simulated object executing in the interactive application scene, so that the user can view a video playback.
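A minimal sketch of the server side described above: the interaction is computed one logical frame at a time, and a frame state record is kept only when the state actually changes, which is what keeps the recorded data small. The function names and the trivial example "battle" rule are illustrative assumptions, not the patent's implementation.

```python
def run_interaction(initial_state: dict, num_frames: int, step) -> dict:
    """Advance the scene one logical frame at a time; record a frame's state
    only when it differs from the previous frame, minimizing the recorded data."""
    recorded = {}                 # logical frame index -> recorded state for that frame
    state = dict(initial_state)
    prev = None
    for frame in range(num_frames):
        state = step(state, frame)
        if state != prev:         # only frames with a state update become logical frames
            recorded[frame] = dict(state)
            prev = dict(state)
    return {"result": state, "recorded_data": recorded}

# Illustrative step rule (an assumption): one side loses 100 soldiers every third frame.
def step(state, frame):
    if frame % 3 == 0 and state["soldiers_b"] > 0:
        state = {**state, "soldiers_b": state["soldiers_b"] - 100}
    return state

outcome = run_interaction({"soldiers_a": 500, "soldiers_b": 300}, 12, step)
```

With this rule only frames 0, 3 and 6 change the state, so only those three frames appear in `recorded_data`; the unchanged frames in between cost nothing to transmit.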
203. The client extracts scene states generated at a plurality of logical frames and corresponding object states from the recording data.
In the embodiment of the present invention, after receiving the interaction result, the client acquires the recorded data from it, and obtains the scene state and the object state from the state data generated in the logical frames and recorded in the recorded data. A logical frame is a frame with a state update in a state synchronization algorithm. The client can extract the scene state and object state generated in a logical frame from the recorded data, and when there are several logical frames with state updates, the scene states and object states generated in each of them can be extracted respectively. Here the scene state refers to state data of the interactive application scene, and the object state refers to state data of the simulated object.
In some embodiments of the present invention, the step 203 extracting, by the client, scene states generated in a plurality of logical frames and corresponding object states from the recorded data, includes:
the client analyzes the recorded data to obtain a frame state data list corresponding to each of the plurality of logical frames;
the client analyzes the frame state data list corresponding to each logical frame to obtain the scene state and the corresponding object state generated in the logical frame.
The recorded data stored by the server comprises a frame state data list for each of a plurality of logical frames, a logical frame being a frame with a state update in the state synchronization algorithm. The frame state data list recorded by the server for each logical frame is parsed to produce a scene state and an object state in a format usable by the client. In the face of unstable factors such as heterogeneous network environments and weak networks, keeping traffic low is essential to good compatibility. Low traffic is guaranteed here because the server records a frame state data list for a logical frame only when the state changes, so the data packets carrying the frame state data lists can be minimized, improving the transmission and processing efficiency of the recorded data.
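The parsing step can be sketched as follows, under the assumption (not stated in the patent) that the recorded data maps each logical frame index to a frame state data list of `(kind, state)` entries, with `kind` distinguishing scene-state from object-state entries:

```python
def extract_states(recorded_data: dict):
    """Parse each logical frame's frame state data list into the scene state
    and the object states generated in that frame (format is hypothetical)."""
    scene_states, object_states = {}, {}
    for frame, entries in sorted(recorded_data.items()):
        for kind, state in entries:
            if kind == "scene":
                scene_states[frame] = state
            elif kind == "object":
                object_states.setdefault(frame, []).append(state)
    return scene_states, object_states

recorded = {
    0: [("scene", "interaction_start"), ("object", {"id": 1, "phase": "approach"})],
    5: [("scene", "interaction_in_progress"), ("object", {"id": 1, "phase": "attack"})],
}
scenes, objects = extract_states(recorded)
```

Because only logical frames with state updates appear in the input, the output dictionaries are sparse; a playback step can carry the last known state forward for the frames in between.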
In some embodiments of the invention, the scene state comprises: an interaction starting stage, an interaction proceeding stage and an interaction ending stage.
Several simulated objects in the interactive application scene can execute under the control of the client, and according to its progress the scene passes through three stages: the interaction starting stage, the interaction proceeding stage and the interaction ending stage. In the interaction starting stage, the object names and formations of both sides are displayed first, while the simulated objects of both sides interact through text bubbles; interaction time is not yet counted. After this stage ends, timing starts and the formal interaction begins. The interaction proceeding stage refers to the formal interaction of the two sides' simulated objects; for example, in a strategy game scene it may include an enemy-searching stage and an attacking stage, together with a judgment of whether the interaction at a single target point is finished. The interaction ending stage refers to the completion of the interaction between the two sides' simulated objects; for example, in a strategy game scene it may include judging whether all the targets have been destroyed. Since the scene state in the embodiment of the invention covers the interaction starting stage, the interaction proceeding stage and the interaction ending stage, the state data generated in each logical frame can be recorded in each stage, providing the raw data for scene restoration.
In some embodiments of the invention, the object state includes: an approach phase, an idle standing phase, a moving phase, an attack phase, a skill release phase and a preparatory action phase.
The object state refers to state data of the simulated object. For simulated objects designed for different scenes, the object state may include a number of different phases, and combinations of them. The approach phase, idle standing phase, moving phase, attack phase, skill release phase and preparatory action phase describe different states of the simulated object in the interactive application scene. The approach phase is when the simulated object enters the interactive application scene. The idle standing phase is when the simulated object, having entered the scene, waits for the next control operation. The moving phase is when the simulated object moves under the control of the client (for example moving forward, back, left and right, or jumping). The attack phase is when the simulated object launches an attack on a locked target object under the control of the client; the specific attack mode can be determined by the skill settings of the simulated object. The skill release phase is when the simulated object releases a skill on the target object under the control of the client; the skill design of the simulated object is described in the following embodiments. The preparatory action phase is the waiting period before the simulated object performs its next action under the control of the client.
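The two state vocabularies above can be captured as enumerations. This is only a representational sketch: the member names simply mirror the stages listed in the text, and the string values are arbitrary labels.

```python
from enum import Enum

class SceneState(Enum):
    """Stages the interactive application scene passes through."""
    INTERACTION_START = "interaction starting stage"
    INTERACTION_IN_PROGRESS = "interaction proceeding stage"
    INTERACTION_END = "interaction ending stage"

class ObjectState(Enum):
    """Phases a simulated object can be in within the scene."""
    APPROACH = "approach phase"
    IDLE_STANDING = "idle standing phase"
    MOVING = "moving phase"
    ATTACK = "attack phase"
    SKILL_RELEASE = "skill release phase"
    PREPARATORY_ACTION = "preparatory action phase"
```

Encoding the states as enumerations rather than free-form strings lets the client validate recorded data while parsing: an unknown label fails fast instead of silently producing an undefined playback state.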
204. The client generates a scene restoration video according to the scene state and the object state generated in the plurality of logical frames and plays the scene restoration video, where the scene restoration video is used to play back the execution process of the simulation object in the interactive application scene.
In the embodiment of the present invention, after the client extracts the scene state and the object state from the aforementioned recording data, the client may restore a segment of video from the scene state and the object state generated in the plurality of logical frames through a state synchronization algorithm. The video records the change process of the scene state and the change process of the object state and may be called a scene restoration video; as can be seen from the way it is generated, the scene restoration video is used to play back the execution process of the simulation object in the interactive application scene.
In some embodiments of the present invention, step 204 the client generates a scene restoration video according to the scene state and the object state generated in the plurality of logical frames, including:
the client acquires basic scene information and basic object information;
the client creates an interactive application scene according to the scene basic information and the object basic information, and loads a simulation object in the interactive application scene;
the client generates an execution frame list and determines the occurrence time of each execution frame, wherein the execution frame list comprises a plurality of execution frames;
the client generates a state data list corresponding to each execution frame according to the scene state and the object state generated in the plurality of logical frames;
the client restores the execution content of the simulation object in the interactive application scene according to the state data list corresponding to each execution frame to obtain a video picture corresponding to each execution frame;
and the client combines the video pictures corresponding to each execution frame according to the occurrence time of the execution frames to obtain the scene restoration video.
The scene basic information refers to basic information of the interactive application scene; in different scene examples it may include different contents. Taking a strategy game scene as an example of the interactive application scene, the scene basic information may be the defending side's city wall information, trap information, and the like in the game scene. The object basic information refers to basic information of a simulated object; in different scene examples it may include different contents. Taking a strategy game scene as an example, the object basic information may be the hero information (level, profession) of the battling parties in the game scene, the formation of the several heroes, and the number of soldiers each hero carries. Using the scene basic information and the object basic information, the client can first create the interactive application scene and load the simulation objects in the interactive application scene.
After the scene creation is completed, the client generates an execution frame list and determines the occurrence time of each execution frame. The execution frame list comprises a plurality of execution frames and can be a list of frame data keyed to specific time points on the client, with each execution frame containing an action state data list for each hero.
After the client generates the execution frame list, the client generates a state data list corresponding to each execution frame according to the scene state and the object state generated in the plurality of logical frames, wherein the state data list comprises the scene state and the object state which need to be simulated in the corresponding execution frame. And then the client starts to demonstrate the interactive process, and restores the execution content of the simulation object in the interactive application scene according to the state data list corresponding to each execution frame to obtain the video picture corresponding to each execution frame. After the client obtains the video pictures corresponding to the plurality of logical frames, the video pictures corresponding to each execution frame are synthesized according to the occurrence time sequence of the execution frames, so that the scene restoration video can be generated. As known from the generation mode of the video, the scene restoration video is used for playing back the execution process of the simulation object in the interactive application scene.
Taking an interactive application scene as a strategy game scene as an example, a client starts a demonstration fighting process, reads a state data list from an execution frame list every frame, executes the action of each hero, and simulates the behavior of a soldier until the fighting is finished.
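The frame-combining steps above can be sketched as follows; `ExecutionFrame` and `render_frame` are hypothetical stand-ins for the client's actual frame structure and rendering, not the embodiment's real interfaces:

```python
from dataclasses import dataclass, field

@dataclass
class ExecutionFrame:
    occur_time: float                            # occurrence time of this frame
    states: list = field(default_factory=list)   # scene + object states to restore

def render_frame(frame):
    """Placeholder: restore the execution content described by the state list
    and return the video picture for this execution frame."""
    return {"t": frame.occur_time, "rendered": list(frame.states)}

def build_restoration_sequence(frames):
    """Combine the per-frame pictures in occurrence-time order, yielding the
    picture sequence of the scene restoration video."""
    return [render_frame(f) for f in sorted(frames, key=lambda f: f.occur_time)]
```

Reading one state data list per frame and rendering in time order is the whole restoration loop in miniature.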
Further, in some embodiments of the present invention, the obtaining, by the client, the scene basic information and the object basic information includes:
the client acquires scene basic information and object basic information from the interaction result; or,
and the client acquires the scene basic information and the object basic information from the local cache.
The server may carry the scene basic information and the object basic information in the interaction result, so that the client can parse the interaction result sent by the server to obtain them. Alternatively, the server sends the scene basic information and the object basic information to the client in advance, and the client stores them in the local cache upon receipt. The server then does not need to carry this information each time it sends an interaction result, and the client can obtain it by reading the designated storage space in the local cache, which improves the data transmission efficiency between the server and the client and keeps the transmitted data packet as small as possible.
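A minimal sketch of this two-way acquisition, with hypothetical dictionary keys standing in for the actual packet fields:

```python
def get_basic_info(interaction_result, local_cache):
    """Prefer the basic info carried in the interaction result; otherwise fall
    back to the locally cached copy the server sent in advance.
    The keys 'scene_info' / 'object_info' are illustrative assumptions."""
    if "scene_info" in interaction_result and "object_info" in interaction_result:
        return interaction_result["scene_info"], interaction_result["object_info"]
    return local_cache["scene_info"], local_cache["object_info"]
```

When the cache path is used, the interaction result can omit this information entirely, keeping the packet small.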
In some embodiments of the present invention, the client restores, in the interactive application scene, the execution content of the simulation object according to the state data list corresponding to each execution frame, including:
before restoring the execution content of the simulation object for the current execution frame, the client determines in advance the execution content of the simulation object in the next execution frame according to the state data list corresponding to the current execution frame;

and the client uses the predetermined execution content of the next execution frame when restoring the execution content of the simulation object.
Before restoring the execution content of the current execution frame, the client determines in advance the execution content of the simulation object in the next execution frame, so that the restoration can make use of that predetermined content and the simulation object can be controlled accurately. For example, when the moving state of a target hero is controlled, the background data contains only the starting position, the target hero identifier (ID) and the speed; the client therefore calculates the arrival position and the movement time in advance, and the target hero stops moving as soon as it reaches the specified position, which guarantees that the target hero does not drift during movement and moves accurately. For another example, to ensure that the damage caused by an attack or skill is processed at the right time, the damage value is extracted and an independent damage state is generated. Because the damage recorded in the battle report is delayed, with the delay recorded in units of seconds, and the client may stutter when presenting damage effects due to performance problems, the client directly calculates and generates the damage state.
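The movement pre-calculation described above can be sketched as follows, assuming 2-D positions and a positive speed (all names are hypothetical):

```python
import math

def precompute_move(start, target, speed):
    """Given only the starting position, the target position and the speed (as
    in the recorded data), derive the arrival position and the travel time
    ahead of playback, so the object stops exactly at the target and never
    drifts past it."""
    dx, dy = target[0] - start[0], target[1] - start[1]
    distance = math.hypot(dx, dy)
    travel_time = distance / speed
    return target, travel_time  # the arrival position is the target itself
```

With the arrival time known in advance, the playback loop can clamp the object to the target once `travel_time` elapses.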
In some embodiments of the present invention, the client restores, in the interactive application scene, the execution content of the simulation object according to the state data list corresponding to each execution frame, including:
the client restores the execution content of the simulation object in the interactive application scene at the occurrence time of each execution frame, and adjusts in real time, at the occurrence time of each execution frame, the direction and distance of the lens used for viewing the interactive application scene.
When the client restores the execution content of the simulation object, the direction and distance of the lens used for viewing the interactive application scene can be adjusted in real time, so that the lens module of the client can control the viewing angle and viewing distance of the interactive application scene according to the lens direction and distance adjusted in real time, improving the fidelity of the scene restoration. For example, after the client parses the battle report data packet, it obtains the skill identifier, the hero releasing the skill, the hero targeted by the skill, and the time point of the skill release, and sends the time point list of each state to the lens module. The client can preload the special effect related to the skill according to the skill identifier and release the skill's special effect at the specified time point according to the hero releasing the skill, the position of the target hero, and other information. The lens module pulls or zooms the lens at the specified time point according to the real-time adjusted lens direction and distance, performing operations such as lens following.
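A rough sketch of per-frame lens adjustment; the event types and zoom factor below are assumptions for illustration, not the embodiment's actual lens-module interface:

```python
def update_camera(camera, frame_events):
    """Adjust the viewing direction and distance at one execution frame:
    pull the lens in on a skill release and follow a moving object.
    'camera' is a dict with hypothetical 'distance' and 'look_at' fields."""
    for ev in frame_events:
        if ev["type"] == "skill_release":
            camera["distance"] = max(camera["distance"] * 0.8, 5.0)  # zoom in, clamped
            camera["look_at"] = ev["position"]
        elif ev["type"] == "move":
            camera["look_at"] = ev["position"]  # lens follows the mover
    return camera
```

Because the time point list of each state is known up front, such adjustments can be scheduled before playback begins rather than computed reactively.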
As can be seen from the above description of the embodiments of the present invention, the client sends interactive request information to the server, where the interactive request information includes a control strategy for executing the simulation object in the interactive application scene. The interaction result sent by the server to the client includes the recording data, and the client can extract the scene state and the object state generated in the plurality of logical frames from the recording data, so that a scene restoration video can be generated. When the client plays the scene restoration video, the user can learn, through it, the execution process of the simulation object in the interactive application scene.
As shown in fig. 3, the method for processing an interactive application scene according to an embodiment of the present invention includes the following steps:
301. The server receives interactive request information sent by a client, where the interactive request information includes: a control strategy for executing a simulation object in an interactive application scene, the simulation object being controlled by the client to execute.
In the embodiment of the invention, the simulation object is controlled and executed by the client, for example, the interactive application scene may be a strategy game, the simulation object may be a character (such as hero and soldier) in the game, and the user may issue a control strategy for the simulation object through the client. The client sends the control strategy of the simulation object to the server to request the server to carry out interactive control.
In some embodiments of the present invention, after the server receives the interactive request information sent by the client, the method provided in the embodiments of the present invention further includes:
and the server sends the scene basic information and the object basic information to the client.
The server sends the scene basic information and the object basic information to the client in advance, and the client stores them in the local cache upon receipt. The server then does not need to carry this information each time it sends an interaction result, and the client can obtain it by reading the designated storage space in the local cache, which improves the data transmission efficiency between the server and the client and keeps the transmitted data packet as small as possible.
302. The server carries out interactive calculation according to the control strategy, records scene states of the interactive application scene generated in the plurality of logic frames, and records object states corresponding to each logic frame when the simulation object is executed in the interactive application scene.
In the embodiment of the invention, after receiving the interactive request information sent by the client, the server can create an interactive application scene, perform interactive calculation on the simulation object according to the control strategy indicated by the client, record the scene state of the interactive application scene generated in a plurality of logical frames, and record the object state corresponding to each logical frame when the simulation object is executed in the interactive application scene. For example, the server calculates the whole fighting process according to various control strategies (such as hero, soldier number, skill and other control data) of the player, records the state, such as scene state and object state, and generates a fighting result and a detailed report.
Wherein the logical frame is a frame with a status update in the status synchronization algorithm. In the embodiment of the invention, the client can extract the scene state and the object state generated in the logical frame by recording data, and when a plurality of logical frames with updated states exist, the scene state and the object state generated in the plurality of logical frames can be respectively extracted. Scene state refers to state data of the interactive application scene and object state refers to state data of the simulated object.
303. The server generates recording data of the simulation object when the simulation object is executed in the interactive application scene according to the scene state generated in the plurality of logical frames and the corresponding object state.
In the embodiment of the invention, having recorded the scene states and the corresponding object states in a plurality of logical frames, the server can store, with the logical frame as the unit, the scene states and object states recorded while the simulation object is executed in the interactive application scene, thereby generating the recording data.
304. The server sends an interaction result to the client, wherein the interaction result comprises: and recording the data.
In the embodiment of the invention, the server stores the recorded data of the simulation object when the simulation object is executed in the interactive application scene. The server sends the interaction result to the client so that the client can receive the interaction result first. For example, the server calculates the whole fighting process according to various control strategies (such as hero, soldier number, skill and other control data) of the player, records the state, such as recording scene state and object state, generates a fighting result and a detailed report, and then sends the interaction result to the client, wherein the interaction result includes the recorded data.
In some embodiments of the present invention, recording a scene state of an interactive application scene generated in a plurality of logical frames, and recording an object state corresponding to each logical frame when a simulation object is executed in the interactive application scene, includes:
the server judges whether the current logical frame generates state updating, wherein the state updating comprises the following steps: the scene state changes, and/or the object state changes;
if the current logical frame generates state updating, the server records the scene state generated in the current logical frame and the corresponding object state;
if the current logical frame does not generate the state update, the server judges whether the next logical frame generates the state update or not.
When the server performs the interactive computation, only the logical frames with state updates are recorded. The whole interactive computation process consists of updating the logical frames; for each logical frame the server determines whether a state update occurred, and records the scene state and object state only for logical frames with state updates. Following the same judgment as for the current logical frame, the server can judge whether the next logical frame generates a state update. This improves the data transmission efficiency between the server and the client and keeps the transmitted data packet as small as possible.
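The recording rule above, keeping only logical frames with a state update, can be sketched as follows (field names are hypothetical):

```python
def record_battle(frames):
    """Walk the logical frames in order and keep only those in which the scene
    state or any object state changed; this is what keeps the recording data
    small. Returns a map from logical-frame number to its recorded states."""
    recorded = {}
    for idx, frame in enumerate(frames, start=1):
        if frame.get("scene_changed") or frame.get("object_changes"):
            recorded[idx] = {
                "scene": frame.get("scene_state"),
                "objects": frame.get("object_changes", []),
            }
    return recorded
```

Frames without updates are simply skipped, so a long battle with sparse events yields only a handful of entries.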
In some embodiments of the present invention, the server records the scene state generated in the current logical frame and the corresponding object state, including:
the server creates an interactive application scene according to the scene basic information and the object basic information, and loads a simulation object in the interactive application scene;
the server generates an execution frame list and a state data list corresponding to each execution frame, wherein the execution frame list comprises a plurality of execution frames;
the server extracts the scene state generated in the current execution frame and the corresponding object state from the state data list corresponding to each execution frame.
The scene basic information refers to basic information of the interactive application scene; in different scene examples it may include different contents. Taking a strategy game scene as an example of the interactive application scene, the scene basic information may be the defending side's city wall information, trap information, and the like in the game scene. The object basic information refers to basic information of a simulated object; in different scene examples it may include different contents. Taking a strategy game scene as an example, the object basic information may be the hero information (level, profession) of the battling parties in the game scene, the formation of the several heroes, and the number of soldiers each hero carries. Using the scene basic information and the object basic information, the server can first create the interactive application scene and load the simulation objects in the interactive application scene.
After the scene creation is completed, the server generates an execution frame list and determines the occurrence time of each execution frame. The execution frame list comprises a plurality of execution frames and can be a list of frame data keyed to specific time points on the client, with each execution frame containing an action state data list for each hero.
After the server generates the execution frame list, a state data list corresponding to each execution frame is generated, a scene state and a corresponding object state generated in the current execution frame are extracted from the state data list corresponding to each execution frame, and the state data list comprises the scene state and the object state which need to be simulated in the corresponding execution frame.
Taking an interactive application scene as a strategy game scene as an example, the server starts to demonstrate a fighting process, executes the action of each hero, simulates the behavior of a soldier until the fighting is finished, and records a state data list corresponding to each execution frame in real time.
As can be seen from the above description of the embodiments of the present invention, the server receives interactive request information sent by the client, where the interactive request information includes a control strategy for executing the simulation object in the interactive application scene. The interaction result sent by the server to the client includes the recording data, and the client can extract the scene state and the object state generated in the plurality of logical frames from the recording data, so that a scene restoration video can be generated. When the client plays the scene restoration video, the user can learn, through it, the execution process of the simulation object in the interactive application scene.
In order to better understand and implement the above-mentioned schemes of the embodiments of the present invention, the following description specifically illustrates corresponding application scenarios.
The embodiment of the present invention introduces the technical solution of SLG mobile-game battle playback, taking an interactive application scene as an example. It can keep the data packet small while truly restoring the tactics and numerical values of the troop formation through a visualized battle process, where the visualized battle process refers to the visual pictures played back after the battle ends.
SLG is a strategic numerical game belonging to the pre-battle strategy genre: before two armies fight in an SLG game scene, a player can set the formation of the several heroes in a team, adjust the number of soldiers each hero carries, and then fight. The embodiment of the present invention can deliver better combat presentation; by guaranteeing the numerical values and playing back a battle worth watching, the player can feel the intensity and fun of the combat process. In the embodiment of the present invention, the battle playback data packet sent by the server is very small. Facing unstable factors on mobile phones such as multiple operating systems, varied network environments and weak networks, the primary goal for good compatibility is low traffic, and the way to guarantee low traffic is to keep the battle playback data packet as small as possible. A small data packet benefits not only the player but also the server, because thousands of battles are generated every day and the total traffic of battle playback data packets is large.
Please refer to fig. 4, which is a schematic diagram illustrating an interaction flow between a client and a server according to an embodiment of the present invention. The server in the embodiment of the present invention may include: the system comprises a main server and a combat server, wherein the main server is used for processing simple business functions (such as processing login requests, task systems, prop systems and the like) and distributing tasks to the combat server, and the combat server is used for processing combat calculation and playback records. Next, the basic timing sequence process of the embodiment of the present invention mainly includes the following steps:
1. a player lays up a lineup and initiates an attack to another player.
2. The client receives the combat command and sends a combat request to the main server.
3. The main server forwards the request to the combat server.
4. The combat server calculates and records the whole combat process according to various data (hero, soldier number, skill and the like) of the player to generate a combat result and a detailed report.
5. The combat server returns the battle report data to the main server.
6. The main server forwards the battle report data back to the client. The client establishes a link only with the main server. The main server also compresses and encrypts the data packet so that the transmission is smaller and more secure.
7. The client parses the battle report data into combat playback data usable by the various clients. Because the main server compresses and encrypts the data packet to keep it small and confidential, the client must also decrypt and decompress the battle report data.
8. And the client side completely simulates and demonstrates the fighting process of the whole two-army battle according to the analyzed fighting playback data to generate a scene restoration video.
9. And presenting the picture to the user.
In the above process, the client serves as the front end of the scene interaction, the main server and the battle server serve as the background of the scene interaction, and then the background scheme design of the battle playback is exemplified first.
Fig. 5 is a schematic view of an implementation scenario of a combat framework in a strategy game scenario according to an embodiment of the present invention. The battle frame mainly comprises: the device comprises a combat module, a skill module and a playback recording module, wherein the combat module can be respectively connected with the skill module and the playback recording module.
The client can pull the recorded data from the server when needing to use the recorded data, for example, the client acquires the state recording data packet from the server, extracts the game state and the hero state generated in a plurality of logical frames, and then generates a scene restoration video.
In some embodiments of the invention, the game state comprises: attacking the city wall, the two-army battle, and the battle end. The object state comprises: an approach phase, an idle standing phase, a moving phase, an attack phase, a skill release phase and a preparatory action phase. In the actual design, the state is updated once every 10 frames (0.1 second); the server outputs a state record packet, with a plurality of state data stored in, for example, frame 1 and frame 10 respectively.
In the embodiment of the invention, the whole combat process of the game is calculated by the background, handled by the combat module and the skill module, so that a battle is computed completely. The whole calculation process is the updating of logical frames. Whereas in the prior art each frame is tied to real time, the logical frames in the embodiment of the invention are completed directly by the computation of a loop body, so that the server can compute an originally 60-second battle within 1 second (or less); each loop iteration is one logical frame. Each logical frame computes the movement, attack, skill, and other action states of the heroes of both parties. The playback recording module is called by the combat module and records only when a state changes. The playback recording module has the following functions:
1. Creating the basic data of both combat parties, such as the hero information (level, rank, profession) of the battling parties, the formation of the several heroes, and the number of soldiers each hero carries, as well as the defending side's city wall information and trap information.
2. Recording the game state, that is, the time points of the battle process; there are currently 3 of them. First the attacking side assaults the city wall; if the wall is breached, the battle enters the two-army combat stage (the heroes of both sides fight); finally the battle ends (the winner and loser are recorded).
3. Recording the hero state: during the computation by the combat module, the change time point of each action state of each hero is recorded. For example, in the 10th logical frame a hero initiates an attack action that causes 10 points of damage to the opposite side; the playback recording module records the corresponding action and inserts the action state into the state data list of the 10th logical frame.
To facilitate understanding of the recording process, fig. 6 is a schematic diagram of recording logical frames during a combat process according to an embodiment of the present invention. In this example each side has only one hero in the battle, and finally only 5 logical frames are recorded.
In the 1st frame, 3 states are recorded: attacking hero 1 enters the field and takes the designated formation position; defending hero 1 enters the field and stands at the designated position; and the two-army battle begins. It is possible for 10 heroes to start action states such as skills simultaneously in the same frame.
In the 2nd frame, 2 states are recorded: attacking hero 1 moves toward defending hero 1 at a speed of 10, and defending hero 1 moves toward attacking hero 1 at a speed of 10. During the movement no other states are recorded until the two heroes meet; the next data in the list is frame 8.
In the 8th frame, 4 states are recorded: since the two heroes meet, the movement action changes to a standing action, so attacking hero 1 stands and defending hero 1 stands. It is also recorded here that the heroes of both sides prepare their attack animations for the foreground; the preparatory state lasts 0.5 seconds.
In the 13th frame, 2 states are recorded: attacking hero 1 attacks defending hero 1, causing 10 points of damage that takes effect after 0.5 seconds (the damage takes effect with a delay so that the foreground has time to animate the attack action); then defending hero 1 attacks attacking hero 1, causing 20 points of damage that also takes effect after 0.5 seconds. Because the damage takes effect with a delay, the intervening time is not recorded.
In the 18th frame, the result is recorded: the defending side wins and the battle is over. In fact, attacking hero 1 has been killed by defending hero 1.
This battle uses the state-recording scheme and finally needs only 5 frames of data, a very small amount. In a normal battle in our game product, a 60-second 5v5 battle produces at most 3 KB of battle report data after packing. The battle report data consists of the above 5 logical frames, and besides the states, each logical frame also includes the attribute data (heroes, soldier counts, and so on) carried by both sides.
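The five recorded frames of this example could be encoded as a small mapping from logical-frame number to state list (the string encoding below is hypothetical); an 18-frame battle then needs only 5 entries:

```python
# Hypothetical encoding of the five recorded logical frames from the example.
# Frames without state changes (3-7, 9-12, 14-17) are simply absent.
battle_report = {
    1:  ["attacker_1_enter", "defender_1_enter", "battle_start"],
    2:  ["attacker_1_move(speed=10)", "defender_1_move(speed=10)"],
    8:  ["attacker_1_stand", "defender_1_stand",
         "attacker_1_prepare(0.5s)", "defender_1_prepare(0.5s)"],
    13: ["attacker_1_attack(damage=10, delay=0.5)",
         "defender_1_attack(damage=20, delay=0.5)"],
    18: ["battle_end(winner='defender')"],
}
```

Packing only these sparse entries, plus the attribute data of both sides, is what keeps the report around a few kilobytes.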
Fig. 7 is a schematic view of a restoration scene of a combat frame in a strategy game scene according to an embodiment of the present invention. Next, the foreground design of the scene interaction is exemplified; the client can view the replay of the battle process through the battle report in the mail. The playback module, the skill module and the lens module together fully demonstrate a battle process. The battle report data packet comprises the state record data packet. After the playback module parses the battle report data packet, it sends the skill Id, the hero releasing the skill, the hero targeted by the skill, and the time point of the skill release to the skill module, and also sends a time point list of each state to the lens module. The skill module preloads the skill-related special effect according to the skill Id, and releases the skill's special effect at the specified time point according to the hero releasing the skill, the position of the target hero, and other information. The lens module pulls or zooms the lens at the specified time point according to the time point list of the states, performs lens following, and so on.
The following describes the functions of the playback module in detail:
1. Parsing. Fig. 8 is a schematic diagram of the process by which the client parses the battle report data according to an embodiment of the present invention. The client first reads the battle report data, then parses the hero and soldier information, parses the city wall and trap information, and parses the state data list corresponding to each logical frame. It then generates the client's execution frame list, assigns an occurrence time to each frame, generates the state data lists, and inserts them into the execution frames. Next it preprocesses the move states in the state data list, precomputing the arrival position and moving time, and finally preprocesses the attack and skill states, processing the damage values and generating damage state data.
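The step of generating the execution frame list and assigning each frame an occurrence time might be sketched as follows; the one-second frame interval and the dictionary layout are assumptions for illustration, not the client's actual format.

```python
FRAME_INTERVAL = 1.0  # assumed seconds per logical frame

def build_execution_frames(recorded_frames, frame_interval=FRAME_INTERVAL):
    """Generate the client execution frame list, assigning each execution
    frame an occurrence time and inserting its state data list."""
    execution_frames = []
    for rec in recorded_frames:
        execution_frames.append({
            "time": rec["frame"] * frame_interval,  # occurrence time of the frame
            "states": list(rec["states"]),          # state data list to insert
        })
    execution_frames.sort(key=lambda f: f["time"])  # play back in time order
    return execution_frames
```

With this shape, the playback loop only needs to compare the current playback clock against each frame's precomputed occurrence time.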
For example, after reading the battle report data, the playback module parses the basic information of both sides, such as heroes and soldier counts, parses the recorded frame state data list, and generates an executable frame list and hero state data lists in the format specified by the client. The executable frame list is a list of frame data tied to specific client time points, and each executable frame includes the action state data list of the heroes. To ensure the correctness of the demonstration process, the parsing step also preprocesses the data. For a move state, for example, the background data only includes the start position, the target hero ID, and the speed; to avoid displacement drift and keep the movement accurate, the client computes the arrival position and moving time in advance, and because the arrival position is precomputed, the client stops moving as soon as it reaches the specified position. To ensure that the damage caused by attacks and skills is processed at the right time, the damage value is extracted and an independent damage state is generated. Because the damage recorded in the battle report is delayed (the delay is recorded in seconds), and a phone with performance problems may stutter so that the damage would otherwise play with a gap, the client directly computes and generates the damage state to make the timing deterministic.
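The two preprocessing passes described above (precomputing move arrivals, and splitting damage out of attack and skill states) could look like this minimal sketch; the state field names are assumed, not taken from the client's real schema.

```python
def preprocess_states(execution_frames):
    """Precompute arrival data for move states and generate independent,
    pre-scheduled damage states from attack/skill states."""
    damage_states = []
    for frame in execution_frames:
        for st in frame["states"]:
            if st["type"] == "move":
                # Background data only has start, target and speed; compute
                # the arrival position and moving time in advance so the
                # unit stops exactly at the specified position (no drift).
                dx = st["target"][0] - st["start"][0]
                dy = st["target"][1] - st["start"][1]
                st["arrive_pos"] = st["target"]
                st["move_time"] = (dx * dx + dy * dy) ** 0.5 / st["speed"]
            elif st["type"] in ("attack", "skill"):
                # Extract the delayed damage into its own state with an
                # absolute time, so stutter cannot shift when it lands.
                damage_states.append({
                    "type": "damage",
                    "dst": st["dst"],
                    "value": st["damage"],
                    "time": frame["time"] + st["delay"],
                })
    return damage_states
```

Generating absolute-time damage states up front is what lets a lagging phone skip render frames without ever applying damage at the wrong moment.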
2. Creation. After the data is prepared, the game switches to the playback scene. According to the basic information of both sides, the specified battle background resources are preloaded, the hero models are loaded, and a number of soldier models are loaded according to the soldier-carrying proportion. To guarantee the smoothness of the demonstration process, the skill IDs are read from the skill states and the skill resources are preloaded.
3. Execution. Fig. 9 is a schematic diagram of the process by which the client restores the strategy game scene according to an embodiment of the present invention. The client first reads a frame from the execution frame list as the current frame and checks whether its execution time point has been reached; if not, the client enters a waiting state. The execution time point is the time point at which the state frame starts to be executed; for example, the 3rd frame starts to be executed at the 5th second. If the execution time point has been reached, the client checks whether this is the last frame; if it is not, the client executes the current frame, obtains all the states in the execution frame, and checks whether each is the battle-end state. If a state is not the battle-end state, the client executes the battle performance corresponding to the state, executing hero actions and simulating soldier behavior. When the battle ends, the client shows the battle result and pops up the settlement interface.
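This execution flow can be sketched as a small driver function; the function name, the callback, and the external time source are assumptions, since the real client drives this from its own frame update.

```python
def run_playback(execution_frames, now, perform):
    """Execute every frame whose execution time point has been reached,
    dispatching each state to a battle-performance callback.  Returns
    True once the battle-end state has been executed."""
    finished = False
    for frame in execution_frames:
        if frame["time"] > now:
            break              # not yet at this frame's execution time: wait
        if frame.get("done"):
            continue           # already performed on an earlier tick
        frame["done"] = True
        for st in frame["states"]:
            if st["type"] == "end":
                finished = True   # show the result, pop up the settlement UI
            else:
                perform(st)       # hero action / simulated soldier behavior
    return finished
```

Calling this once per render tick with the current playback clock reproduces the "wait until the execution time point, then perform all states of the frame" loop of Fig. 9.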
For example, the client starts demonstrating the battle process: in every frame it reads a state data list from the frame state list, executes each hero's action, and simulates soldier behavior until the battle ends. The state data lists include the hero state data lists as well as other lists such as the stage states and end state. In the embodiment of the present invention, multiple loops need to be executed while the game is running to simulate soldier behavior, and after the actions have been executed, the data of the next frame is obtained from the execution frame list.
As can be seen from the foregoing illustration, the embodiment of the present invention faithfully restores the pre-battle tactics and the troop-formation values of an SLG game, making it convenient for the player to analyze or show off a battle. It also keeps the data packet small, reducing the traffic requested by the mobile phone, and improves the playability and experience of the SLG game.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
To facilitate a better implementation of the above-described aspects of embodiments of the present invention, the following also provides relevant means for implementing the above-described aspects.
Referring to fig. 10, a client 1000 according to an embodiment of the present invention may include: a sending module 1001, a receiving module 1002, a state extraction module 1003, and a video restoring module 1004, wherein,
a sending module 1001, configured to send interaction request information to a server, where the interaction request information includes: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
a receiving module 1002, configured to receive an interaction result sent by the server according to the control policy, where the interaction result includes: recording data of the simulated object when executed in the interactive application scene;
a state extraction module 1003 for extracting scene states generated in a plurality of logical frames and corresponding object states from the recording data;
and a video restoring module 1004, configured to generate a scene restoring video according to the scene state and the object state generated in the plurality of logical frames, and play the scene restoring video, where the scene restoring video is used to play back an execution process of the simulation object in the interactive application scene.
As can be seen from the above description of the embodiments of the present invention, the client sends interactive request information to the server, where the interactive request information includes a control policy for a simulation object when executed in an interactive application scene. The interaction result sent by the server to the client includes the recorded data, from which the client can extract the scene states and object states generated in a plurality of logical frames and thereby generate a scene restoration video. When the client plays the scene restoration video, the user can learn the execution process of the simulation object in the interactive application scene from it.
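Putting the four client modules together, the flow of Fig. 10 might be sketched like this; the transport interface and the recorded-data layout are assumptions for illustration, not the patent's concrete API.

```python
class ReplayClient:
    """Sketch of the client of Fig. 10: send the control policy, receive
    the recorded data, and extract per-logical-frame states."""

    def __init__(self, transport):
        self.transport = transport  # assumed to expose request(dict) -> dict

    def fetch_recording(self, control_policy):
        # sending module + receiving module: one request/response round trip
        result = self.transport.request({"policy": control_policy})
        return result["recording"]

    @staticmethod
    def extract_states(recording):
        # state extraction module: one (scene_state, object_states) pair per
        # recorded logical frame, ready for the video restoration module
        return [(f["scene_state"], f["object_states"])
                for f in recording["frames"]]


class FakeServer:
    """Stand-in transport used only to demonstrate the flow."""
    def request(self, message):
        return {"recording": {"frames": [
            {"scene_state": "start",
             "object_states": [{"hero": 1, "stage": "approach"}]},
        ]}}


client = ReplayClient(FakeServer())
pairs = client.extract_states(client.fetch_recording({"formation": "front"}))
```

The video restoration module would then consume `pairs` to build and play the scene restoration video, frame by frame.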
Referring to fig. 11, a server 1100 according to an embodiment of the present invention may include: a receiving module 1101, a state obtaining module 1102, a data generating module 1103, and a sending module 1104, wherein,
a receiving module 1101, configured to receive an interactive request message sent by a client, where the interactive request message includes: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
a state obtaining module 1102, configured to perform interactive computation according to the control policy, record scene states of the interactive application scene generated in multiple logical frames, and record an object state corresponding to each logical frame when the simulation object is executed in the interactive application scene;
a data generating module 1103, configured to generate recorded data of the simulation object when executed in the interactive application scene according to the scene states generated in the plurality of logical frames and corresponding object states;
a sending module 1104, configured to send an interaction result to the client, where the interaction result includes: the data is recorded.
As can be seen from the above description of the embodiments of the present invention, the server receives interactive request information sent by the client, where the interactive request information includes a control policy for a simulation object when executed in an interactive application scene. The interaction result the server sends to the client includes the recorded data, from which the client can extract the scene states and object states generated in a plurality of logical frames and thereby generate a scene restoration video. When the client plays the scene restoration video, the user can learn the execution process of the simulation object in the interactive application scene from it.
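On the server side, recording only the logical frames that contain a state update could be sketched as follows; the `simulate` callback and its return shape are hypothetical stand-ins for the server's interactive calculation.

```python
def record_battle(simulate, num_frames):
    """Run the interactive calculation frame by frame and keep only the
    logical frames in which a scene-state or object-state update occurred."""
    frames = []
    for i in range(num_frames):
        scene_state, object_states = simulate(i)   # assumed per-frame result
        if scene_state or object_states:           # state update this frame?
            frames.append({"frame": i,
                           "scene_state": scene_state,
                           "object_states": object_states})
    return {"frames": frames}                      # the recorded data


def demo_simulate(i):
    """Toy calculation: updates happen only at frames 13 and 18."""
    if i == 13:
        return None, [{"type": "attack", "damage": 10}]
    if i == 18:
        return "end", []
    return None, None


recorded = record_battle(demo_simulate, 20)
```

Frames with no update are simply skipped, which is why the recorded data stays small even for long battles.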
As shown in fig. 12, for convenience of description, only the parts related to the embodiment of the present invention are shown; for undisclosed technical details, please refer to the method part of the embodiments of the present invention. The terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, and the like. The following takes the terminal being a mobile phone as an example:
fig. 12 is a block diagram showing a partial structure of a cellular phone related to a terminal provided by an embodiment of the present invention. Referring to fig. 12, the cellular phone includes: radio Frequency (RF) circuitry 1212, memory 1220, input unit 1230, display unit 1240, sensors 1250, audio circuitry 1260, wireless fidelity (WiFi) module 1270, processor 1280, and power supply 1290. Those skilled in the art will appreciate that the handset configuration shown in fig. 12 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 12:
The RF circuit 1212 may be configured to receive and transmit signals during message transmission/reception or a call. In particular, it receives downlink information from a base station and delivers it to the processor 1280 for processing, and transmits uplink data to the base station. In general, the RF circuit 1212 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1212 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 1220 may be used to store software programs and modules, and the processor 1280 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1220. The memory 1220 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1220 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1230 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1230 may include a touch panel 1231 and other input devices 1232. The touch panel 1231, also referred to as a touch screen, can collect touch operations of a user (e.g., operations of the user on or near the touch panel 1231 using any suitable object or accessory such as a finger, a stylus, etc.) thereon or nearby, and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 1231 may include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1280, and can receive and execute commands sent by the processor 1280. In addition, the touch panel 1231 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1230 may include other input devices 1232 in addition to the touch panel 1231. In particular, other input devices 1232 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1240 may be used to display information input by the user or information provided to the user and various menus of the cellular phone. The Display unit 1240 may include a Display panel 1241, and optionally, the Display panel 1241 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, touch panel 1231 can overlay display panel 1241, and when touch panel 1231 detects a touch operation thereon or nearby, the touch panel 1231 can transmit the touch operation to processor 1280 to determine the type of the touch event, and then processor 1280 can provide a corresponding visual output on display panel 1241 according to the type of the touch event. Although in fig. 12, the touch panel 1231 and the display panel 1241 are implemented as two independent components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1231 and the display panel 1241 may be integrated to implement the input and output functions of the mobile phone.
The cell phone may also include at least one sensor 1250, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1241 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1241 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1260, speaker 1261, and microphone 1262 can provide an audio interface between the user and the mobile phone. The audio circuit 1260 can transmit the electrical signal converted from received audio data to the speaker 1261, where it is converted into a sound signal and output; on the other hand, the microphone 1262 converts a collected sound signal into an electrical signal, which the audio circuit 1260 receives and converts into audio data. The audio data is output to the processor 1280 for processing and is then sent, for example, to another mobile phone via the RF circuit 1212, or output to the memory 1220 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 1270, and provides wireless broadband internet access for the user. Although fig. 12 shows the WiFi module 1270, it is understood that it does not belong to the essential constitution of the handset, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1280 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1220 and calling data stored in the memory 1220, thereby performing overall monitoring of the mobile phone. Optionally, processor 1280 may include one or more processing units; preferably, the processor 1280 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 1280.
The handset also includes a power supply 1290 (e.g., a battery) for powering the various components, and preferably, the power supply may be logically connected to the processor 1280 via a power management system, so that the power management system may manage the charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment of the present invention, the processor 1280 included in the terminal further has a processing method flow for controlling and executing the above interactive application scenario executed by the terminal.
Fig. 13 is a schematic diagram of a server 1300 according to an embodiment of the present invention, which may include one or more Central Processing Units (CPUs) 1322 (e.g., one or more processors) and a memory 1332, and one or more storage media 1330 (e.g., one or more mass storage devices) storing applications 1342 or data 1344. Memory 1332 and storage medium 1330 may be, among other things, transitory or persistent storage. The program stored on the storage medium 1330 may include one or more modules (not shown), each of which may include a sequence of instructions operating on a server. Still further, the central processor 1322 may be arranged in communication with the storage medium 1330, executing a sequence of instruction operations in the storage medium 1330 on the server 1300.
The server 1300 may also include one or more power supplies 1326, one or more wired or wireless network interfaces 1350, one or more input/output interfaces 1358, and/or one or more operating systems 1341, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps of the processing method of the interactive application scenario executed by the server in the above embodiment may be based on the server structure shown in fig. 13.
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiment of the apparatus provided by the present invention, the connection relationship between the modules indicates that there is a communication connection between them, and may be specifically implemented as one or more communication buses or signal lines. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present invention may be implemented by software plus necessary general hardware, and may also be implemented by special hardware including special integrated circuits, special CPUs, special memories, special components and the like. Generally, functions performed by computer programs can be easily implemented by corresponding hardware, and specific hardware structures for implementing the same functions may be various, such as analog circuits, digital circuits, or dedicated circuits. However, the implementation of a software program is a more preferable embodiment for the present invention. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk of a computer, and includes instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
In summary, the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the above embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the above embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (15)

1. A processing method of an interactive application scene is characterized by comprising the following steps:
the client sends interactive request information to the server, wherein the interactive request information comprises: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
the client receives an interaction result sent by the server according to the control strategy, wherein the interaction result comprises: recording data of the simulation object when the simulation object is executed in the interactive application scene, wherein the recording data includes a frame state data list corresponding to a plurality of logical frames respectively, the logical frames are frames with state update in a state synchronization algorithm, and the state update includes: the scene state changes, and/or the object state changes; the object states include: an approach stage, an idle standing stage, a moving stage, an attack stage, a skill release stage and a preparatory action stage;
the client extracts scene states generated in a plurality of logical frames and corresponding object states from the recorded data;
the client generates a scene reduction video according to the scene state and the object state generated in the plurality of logical frames, and plays the scene reduction video, wherein the scene reduction video is used for playing back the execution process of the simulation object in the interactive application scene;
wherein the client generates a scene restoration video according to the scene state and the object state generated in the plurality of logical frames, and includes:
the client side obtains basic scene information and basic object information;
the client creates the interactive application scene according to the scene basic information and the object basic information, and loads the simulation object in the interactive application scene;
the client generates an execution frame list and determines the occurrence time of each execution frame, the execution frame list comprises a plurality of execution frames, the execution frame list is a list with frame data at a specific time point of the client, and each execution frame comprises an action state data list of a simulation object;
the client generates a state data list corresponding to each execution frame according to the scene state and the object state generated in the plurality of logical frames;
the client restores the execution content of the simulation object in the interactive application scene according to the state data list corresponding to each execution frame to obtain a video picture corresponding to each execution frame;
and the client combines the video pictures corresponding to each execution frame according to the occurrence time of the execution frames to obtain the scene restoration video.
2. The method of claim 1, wherein the client extracts scene states and corresponding object states generated in a plurality of logical frames from the recorded data, comprising:
the client analyzes the recorded data to obtain frame state data lists corresponding to the plurality of logical frames respectively;
and the client analyzes the frame state data list corresponding to each logical frame to obtain the scene state and the corresponding object state generated in the logical frame.
3. The method of claim 1, wherein the client obtains scene basic information and object basic information, and comprises:
the client acquires the scene basic information and the object basic information from the interaction result; or,
and the client acquires the scene basic information and the object basic information from a local cache.
4. The method according to claim 1, wherein the client restores, in the interactive application scene, the execution content of the simulation object according to the state data list corresponding to each execution frame, and includes:
the client determines the execution content of the simulation object in the next execution frame in advance according to the state data list corresponding to the current execution frame before the execution content of the simulation object is restored by the current execution frame;
and the client restores the execution content of the simulation object by using the predetermined execution content in the next execution frame.
5. The method according to claim 1, wherein the client restores, in the interactive application scene, the execution content of the simulation object according to the state data list corresponding to each execution frame, and includes:
and the client restores the execution content of the simulation object in the interactive application scene at the occurrence time of each execution frame, and adjusts the direction and the distance corresponding to the shot for watching the interactive application scene in real time at the occurrence time of each execution frame.
6. The method of any of claims 1 to 5, wherein the scene state comprises: the method comprises an interaction starting stage, an interaction proceeding stage and an interaction ending stage.
7. A processing method of an interactive application scene is characterized by comprising the following steps:
the method comprises the following steps that a server receives interactive request information sent by a client, wherein the interactive request information comprises: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
the server carries out interactive calculation according to the control strategy, records scene states of the interactive application scene generated in a plurality of logic frames, and records object states corresponding to each logic frame when the simulation object is executed in the interactive application scene;
the server generates recording data of the simulation object when the simulation object is executed in the interactive application scene according to the scene state generated in the plurality of logical frames and the corresponding object state, wherein the recording data includes a frame state data list corresponding to the plurality of logical frames respectively, the logical frames are frames with state update in a state synchronization algorithm, and the state update includes: the scene state changes, and/or the object state changes; the object states include: an approach stage, an idle standing stage, a moving stage, an attack stage, a skill release stage and a preparatory action stage;
the server sends an interaction result to the client, wherein the interaction result comprises: the data is recorded.
8. The method of claim 7, wherein the recording a scene state of the interactive application scene generated in a plurality of logical frames and recording an object state corresponding to each logical frame when the simulation object is executed in the interactive application scene comprises:
the server judges whether state updating is generated in the current logical frame;
if the current logical frame generates state updating, the server records a scene state generated in the current logical frame and a corresponding object state;
and if the current logical frame does not generate the state update, the server judges whether the state update is generated in the next logical frame or not.
9. The method of claim 8, wherein the server records the scene state and corresponding object state generated at the current logical frame, comprising:
the server creates the interactive application scene according to the scene basic information and the object basic information, and loads the simulation object in the interactive application scene;
the server generates an execution frame list and a state data list corresponding to each execution frame, wherein the execution frame list comprises a plurality of execution frames;
and the server extracts the scene state and the corresponding object state generated in the current logical frame from the state data list corresponding to each execution frame.
10. The method according to any one of claims 7 to 9, wherein after the server receives the interactive request information sent by the client, the method further comprises:
and the server sends the scene basic information and the object basic information to the client.
11. A client, comprising:
a sending module, configured to send interaction request information to a server, where the interaction request information includes: a control policy for a simulated object when executed in an interactive application scene, the simulated object being controlled by the client for execution;
a receiving module, configured to receive an interaction result sent by the server according to the control policy, where the interaction result includes: recorded data of the simulation object executed in the interactive application scene, the recorded data including a frame state data list corresponding to each of a plurality of logical frames, the logical frames being frames in which a state update occurs in a state synchronization algorithm, and the state update including: a change in the scene state and/or a change in the object state; the object states include: an approach stage, an idle standing stage, a moving stage, an attack stage, a skill release stage, and a preparatory action stage;
a state extraction module, configured to extract, from the recorded data, the scene states generated in a plurality of logical frames and the corresponding object states;
a video restoration module, configured to generate a scene restoration video according to the scene states and the object states generated in the plurality of logical frames, and to play the scene restoration video, where the scene restoration video is used for playing back the execution process of the simulation object in the interactive application scene;
wherein the video restoration module generating the scene restoration video according to the scene states and the object states generated in the plurality of logical frames comprises:
the client obtains the scene basic information and the object basic information;
the client creates the interactive application scene according to the scene basic information and the object basic information, and loads the simulation object in the interactive application scene;
the client generates an execution frame list and determines the occurrence time of each execution frame, where the execution frame list comprises a plurality of execution frames, the execution frame list is a list of the frame data held by the client at specific points in time, and each execution frame comprises an action state data list of the simulation object;
the client generates a state data list corresponding to each execution frame according to the scene state and the object state generated in the plurality of logical frames;
the client restores the execution content of the simulation object in the interactive application scene according to the state data list corresponding to each execution frame to obtain a video picture corresponding to each execution frame;
and the client combines the video pictures corresponding to each execution frame according to the occurrence time of the execution frames to obtain the scene restoration video.
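The client-side restoration flow of claim 11 — build an execution frame list with occurrence times, fill each execution frame from the recorded logical-frame states, render a video picture per frame, then combine the pictures in time order — can be sketched as below. The rendering step is stubbed out as a string, and all names (including the 33 ms frame interval) are illustrative assumptions rather than details from the claims.

```python
# Hypothetical sketch of claim 11's scene restoration: one execution frame
# per recorded state entry, each with an occurrence time; pictures are then
# combined in occurrence-time order into the scene restoration video.
def build_restoration(recorded, frame_interval_ms=33):
    # 1. execution frame list: occurrence time plus state data per frame
    exec_frames = [
        {"time": i * frame_interval_ms, "states": rec}
        for i, rec in enumerate(recorded)
    ]
    # 2. restore one "video picture" per execution frame (stubbed as text)
    pictures = [
        (f["time"], f"picture@{f['time']}ms:{f['states']['object_state']}")
        for f in exec_frames
    ]
    # 3. combine the pictures by occurrence time into the restoration video
    pictures.sort(key=lambda p: p[0])
    return [p[1] for p in pictures]
```

Since the recorded data carries only the frames with state updates, a real implementation would also interpolate or hold the last known state between updates; that detail is elided here.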
12. A server, comprising:
a receiving module, configured to receive interaction request information sent by a client, where the interaction request information includes: a control policy for a simulation object executed in an interactive application scene, the simulation object being controlled by the client for execution;
a state acquisition module, configured to perform interactive calculation according to the control policy, and to record scene states of the interactive application scene generated in a plurality of logical frames and the object state corresponding to each logical frame while the simulation object is executed in the interactive application scene;
a data generating module, configured to generate, according to the scene states generated in the plurality of logical frames and the corresponding object states, recorded data of the simulation object executed in the interactive application scene, where the recorded data includes a frame state data list corresponding to each of the plurality of logical frames, the logical frames are frames in which a state update occurs in a state synchronization algorithm, and the state update includes: a change in the scene state and/or a change in the object state; the object states include: an approach stage, an idle standing stage, a moving stage, an attack stage, a skill release stage, and a preparatory action stage;
a sending module, configured to send an interaction result to the client, where the interaction result includes: the recorded data.
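One possible serialized shape for the interaction result of claim 12 — one frame state entry per logical frame in which a state update occurred — is sketched below. The JSON field names are hypothetical; the claims do not prescribe a wire format.

```python
# Illustrative wire format for the "recorded data" payload of claim 12:
# each entry pairs a logical frame number with the scene state and object
# state recorded for that frame.
import json

def make_interaction_result(frame_states):
    """frame_states: list of (logical_frame, scene_state, object_state)."""
    return json.dumps({
        "recorded_data": [
            {"logical_frame": n, "scene_state": s, "object_state": o}
            for n, s, o in frame_states
        ]
    })
```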
13. A computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the method of any of claims 1 to 6, or 7 to 10.
14. A client, the client comprising: a processor and a memory;
the memory to store instructions;
the processor, configured to execute the instructions in the memory, to perform the method of any of claims 1 to 6.
15. A server, the server comprising: a processor and a memory;
the memory to store instructions;
the processor, configured to execute the instructions in the memory, to perform the method of any of claims 7 to 10.
CN201810770518.1A 2018-07-13 2018-07-13 Processing method and device for interactive application scene and storage medium Active CN108965989B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810770518.1A CN108965989B (en) 2018-07-13 2018-07-13 Processing method and device for interactive application scene and storage medium

Publications (2)

Publication Number Publication Date
CN108965989A (en) 2018-12-07
CN108965989B (en) 2020-05-05

Family

ID=64484127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810770518.1A Active CN108965989B (en) 2018-07-13 2018-07-13 Processing method and device for interactive application scene and storage medium

Country Status (1)

Country Link
CN (1) CN108965989B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110711384A (en) * 2019-10-24 2020-01-21 网易(杭州)网络有限公司 Game history operation display method, device and equipment
CN112347507B (en) * 2020-10-29 2022-11-08 北京市商汤科技开发有限公司 Online data processing method, electronic device and storage medium
CN112316423B (en) * 2020-11-27 2022-09-23 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying state change of virtual object
CN112807689A (en) * 2021-03-02 2021-05-18 网易(杭州)网络有限公司 Game video processing method and device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9950257B2 (en) * 2014-03-04 2018-04-24 Microsoft Technology Licensing, Llc Recording companion
CN104915542B (en) * 2015-05-08 2018-05-22 珠海金山网络游戏科技有限公司 A kind of method of network game video recording and playback based on data synchronization
CN105013174B (en) * 2015-07-28 2018-09-11 珠海金山网络游戏科技有限公司 A kind of game video recording playback method and system
CN105763825B (en) * 2016-04-12 2019-11-12 杭州电魂网络科技股份有限公司 A method of frame synchronization recording function is optimized in gaming
CN106611436A (en) * 2016-12-30 2017-05-03 腾讯科技(深圳)有限公司 Animation resource display processing method and device

Also Published As

Publication number Publication date
CN108965989A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
US11857878B2 (en) Method, apparatus, and terminal for transmitting prompt information in multiplayer online battle program
CN111773696B (en) Virtual object display method, related device and storage medium
CN108965989B (en) Processing method and device for interactive application scene and storage medium
CN109107161B (en) Game object control method, device, medium and equipment
KR102319206B1 (en) Information processing method and device and server
CN106693367B (en) Processing method for displaying data at client, server and client
CN110141859B (en) Virtual object control method, device, terminal and storage medium
CN113101652A (en) Information display method and device, computer equipment and storage medium
JP7250403B2 (en) VIRTUAL SCENE DISPLAY METHOD, DEVICE, TERMINAL AND COMPUTER PROGRAM
JP2022540277A (en) VIRTUAL OBJECT CONTROL METHOD, APPARATUS, TERMINAL AND COMPUTER PROGRAM
WO2023029836A1 (en) Virtual picture display method and apparatus, device, medium, and computer program product
CN110860087B (en) Virtual object control method, device and storage medium
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN112044072A (en) Interaction method of virtual objects and related device
CN112316423A (en) Method, device, equipment and medium for displaying state change of virtual object
CN109857363B (en) Sound effect playing method and related device
US20230033902A1 (en) Virtual object control method and apparatus, device, storage medium, and program product
CN109718552B (en) Life value control method based on simulation object and client
CN108924632A (en) A kind for the treatment of method and apparatus and storage medium of interactive application scene
CN113599825B (en) Method and related device for updating virtual resources in game
CN111589113B (en) Virtual mark display method, device, equipment and storage medium
CN114272608A (en) Control method, device, terminal, storage medium and program product of virtual role
CN114288659A (en) Interaction method, device, equipment, medium and program product based on virtual object
CN113018857A (en) Game operation data processing method, device, equipment and storage medium
CN110743167A (en) Method and device for realizing interactive function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant