CN110711380B - State processing method and related device - Google Patents


Info

Publication number
CN110711380B (application CN201911032443.8A)
Authority
CN
China
Prior art keywords
state
rendering
game object
game
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911032443.8A
Other languages
Chinese (zh)
Other versions
CN110711380A (en)
Inventor
莫锡昌
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority claimed from CN201911032443.8A
Publication of CN110711380A
Application granted
Publication of CN110711380B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/822: Strategy games; Role-playing games (under A63F13/80, special adaptations for executing a specific game genre or game mode)
    • A63F13/35: Details of game servers (under A63F13/30, interconnection arrangements between game servers and game devices)
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding (under A63F13/55, controlling game characters or game objects based on the game progress)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present application disclose a state processing method and apparatus. A server acquires interaction information of an interactive action; the interaction information reflects how the interactive action affects game objects. The server determines first-type game objects according to the interaction information, a first-type game object being a game object associated with the interactive action, on which the state corresponding to the interactive action therefore needs to be rendered. The server sends an object identifier and a state identifier corresponding to the interactive action to the terminal corresponding to each first-type game object in a single transmission. The terminal determines the first-type game objects from the object identifier, acquires the rendering information of the corresponding state from the state identifier, and performs state rendering on the first-type game objects accordingly. Because the server needs to send the object identifier and the state identifier to the terminal only once, synchronization between terminal-side state rendering and the interactive action is better guaranteed under network fluctuation, improving the fluency and real-time responsiveness of the game.

Description

State processing method and related device
Technical Field
The present application relates to the field of data processing, and in particular, to a state processing method and related apparatus.
Background
In the field of terminal gaming, game objects may interact with other objects by performing interactive actions that apply a state (a "buff") to themselves or to other objects. For example, game object a may attack game object b by releasing a skill, so that under the influence of the skill game object b enters different types of states such as being immobilized, being knocked into the air, or losing health points. A state in a game can be understood as the general term for an entity that carries an effect for a limited time; it can be applied through an interactive action and comes in many types, such as control states, numerical increases/decreases, and visual special effects, all of which can be realized by applying states.
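The patent does not specify a data model for a state, but the concept above can be sketched as a minimal record with a limited lifetime. All field names here are hypothetical, chosen only to illustrate the description.

```python
from dataclasses import dataclass


@dataclass
class Buff:
    """One state ("buff") applied to a game object for a limited time.

    Field names are illustrative; the patent does not define a schema.
    """
    buff_id: int      # numeric identifier, e.g. 1001 for a lock-on effect
    effect: str       # control state, numerical change, or visual effect
    duration_ms: int  # limited lifetime of the state, in milliseconds


# A crowd-control style state, as in the knock-up example above.
crowd_control = Buff(buff_id=3001, effect="knock-up", duration_ms=1500)
```

The point of the record is only that a state has an identity, an effect type, and a bounded duration, matching the definition in the paragraph above.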
Online games are a development trend of terminal games. In an online game, interaction between different players is synchronized through the network connection between terminals and a server, which includes the synchronization of various states between terminal and server.
The action of applying a state requires the terminal to render a corresponding effect so that the user can perceive it visually. Both the start of state rendering on a game object and the cancellation of that rendering can only happen when the server instructs the terminal to do so.
Because users demand high real-time responsiveness from games, network latency and fluctuation directly affect the game's response delay on the terminal. They can cause the states applied in the game to stutter on the terminal. For example, due to network latency between the server and the terminal, a state that should be rendered on a game object at a certain moment is rendered on the terminal only after a 3-second delay, so the interactive action and the state appear out of sync in the visual display, directly harming the user's experience of fluency and real-time responsiveness in the online game.
Disclosure of Invention
In order to solve the above technical problem, the present application provides a state processing method and a related apparatus, which ensure synchronization between interactive actions and states and improve the user's experience of fluency and real-time responsiveness in online games.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides a state processing method, where the method includes:
acquiring interaction information of an interactive action, wherein the interaction information is used for reflecting how the interactive action affects game objects;
determining, according to the interaction information, a first-type game object associated with the interactive action, wherein a state corresponding to the interactive action is to be rendered on the first-type game object;
sending an object identifier and a state identifier to a terminal corresponding to the first-type game object; wherein the object identifier is used for identifying the first-type game object, the state identifier corresponds to rendering information of the state corresponding to the interactive action, and the rendering information comprises a time node for starting state rendering and a time node for canceling state rendering.
In a second aspect, an embodiment of the present application provides a state processing method, where the method includes:
acquiring a state identifier and an object identifier sent by a server; wherein the object identifier is used for identifying a first-type game object associated with an interactive action;
determining, according to the state identifier, rendering information of a state corresponding to the interactive action, wherein the rendering information comprises a time node for starting state rendering and a time node for canceling state rendering;
and starting and canceling state rendering, according to the time nodes indicated by the rendering information, on the first-type game object determined according to the object identifier.
In a third aspect, an embodiment of the present application provides a state processing apparatus, where the apparatus includes a first obtaining unit, a first determining unit, and a sending unit:
the first acquiring unit is configured to acquire interaction information of an interactive action, the interaction information being used for reflecting how the interactive action affects game objects;
the first determining unit is configured to determine, according to the interaction information, a first-type game object associated with the interactive action, wherein a state corresponding to the interactive action is to be rendered on the first-type game object;
the sending unit is configured to send an object identifier and a state identifier to a terminal corresponding to the first-type game object; wherein the object identifier is used for identifying the first-type game object, the state identifier corresponds to rendering information of the state corresponding to the interactive action, and the rendering information comprises a time node for starting state rendering and a time node for canceling state rendering.
In a fourth aspect, an embodiment of the present application provides a state processing apparatus, where the apparatus includes an obtaining unit, a determining unit, and a rendering unit:
the acquiring unit is configured to acquire a state identifier and an object identifier sent by a server; wherein the object identifier is used for identifying a first-type game object associated with an interactive action;
the determining unit is configured to determine, according to the state identifier, rendering information of a state corresponding to the interactive action, wherein the rendering information comprises a time node for starting state rendering and a time node for canceling state rendering;
and the rendering unit is configured to start and cancel state rendering, according to the time nodes indicated by the rendering information, on the first-type game object determined according to the object identifier.
In a fifth aspect, an embodiment of the present application provides an implementation apparatus for state processing, where the apparatus includes a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the state processing method according to any one of the first aspect or the second aspect according to an instruction in the program code.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium for storing a program code, where the program code is configured to execute the state processing method according to any one of the first aspect or the second aspect.
In a seventh aspect, an embodiment of the present application provides a state processing system, where the system includes a server and a terminal:
the server is configured to acquire interaction information of an interactive action, the interaction information being used for reflecting how the interactive action affects game objects; determine, according to the interaction information, a first-type game object associated with the interactive action, wherein a state corresponding to the interactive action is to be rendered on the first-type game object; and send an object identifier and a state identifier to a terminal corresponding to the first-type game object; wherein the object identifier is used for identifying the first-type game object, the state identifier corresponds to rendering information of the state corresponding to the interactive action, and the rendering information comprises a time node for starting state rendering and a time node for canceling state rendering;
the terminal is configured to acquire the state identifier and the object identifier sent by the server; determine, according to the state identifier, the rendering information of the state corresponding to the interactive action; and start and cancel state rendering, according to the time nodes indicated by the rendering information, on the first-type game object determined according to the object identifier.
According to the above technical solutions, the server can determine, from the interaction information of an interactive action, the first-type game objects associated with that action, on which the state corresponding to the action needs to be rendered. The server sends an object identifier and a state identifier to the terminal corresponding to each determined first-type game object. The object identifier lets the terminal determine which game objects require state rendering, and the state identifier corresponds to the rendering information of the state corresponding to the interactive action, including the time node for starting state rendering and the time node for canceling it. Thus, for an implemented interactive action, a single transmission of the object identifier and the state identifier is enough to indicate to the terminal both the objects to be rendered and all the timing data, such as when each state's rendering starts and when it is canceled. This effectively reduces the number of data interactions between server and terminal and improves network utilization. Even when network fluctuation or similar conditions occur, at most the single transmission of the object identifier and the state identifier is affected, while the start and cancel times of the state rendering that the terminal performs based on those identifiers are barely affected. The method therefore better guarantees synchronization between terminal-side state rendering and the interactive action under network fluctuation, and improves the fluency and real-time responsiveness of the game.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of state processing in a conventional manner;
FIG. 2 is a schematic diagram of state processing in a conventional manner;
fig. 3 is a schematic view of an application scenario of a method for state processing according to an embodiment of the present application;
fig. 4 is a signaling interaction diagram of a state processing method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a state processing method according to an embodiment of the present application;
fig. 6 is a schematic diagram of a state processing method according to an embodiment of the present application;
FIG. 7 is a state ratio diagram according to an embodiment of the present application;
fig. 8 is a schematic flow ratio diagram provided in an embodiment of the present application;
fig. 9 is a signaling interaction diagram of a state processing method according to an embodiment of the present application;
fig. 10a is a structural diagram of a state processing apparatus according to an embodiment of the present application;
fig. 10b is a structural diagram of a state processing device according to an embodiment of the present application;
fig. 10c is a structural diagram of a state processing apparatus according to an embodiment of the present application;
fig. 11 is a structural diagram of a state processing apparatus according to an embodiment of the present application;
fig. 12 is a block diagram of an implementation apparatus for state processing according to an embodiment of the present application;
fig. 13 is a block diagram of a server according to an embodiment of the present application;
fig. 14 is a block diagram of a state processing system according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings.
In an online game, interaction between different players is synchronized through the network connection between terminals and the server, including, for example, the synchronization of various states between terminal and server. The action of applying a state requires the terminal to render a corresponding effect so that the user can perceive it visually.
Referring to fig. 1, fig. 1 is a schematic view of applying states in a conventional manner. Suppose the interactive action to be performed by a game object requires several buffs to be applied, namely buff1001, buff2001, buff3001 and buff4001, where buff1001 corresponds to playing a lock-on special effect, buff2001 to playing a hit-and-immobilize special effect, buff3001 to playing a knock-up special effect, and buff4001 to a landing and continuous health-loss special effect. "Putting on" a buff means the buff is rendered, i.e., the special effect corresponding to the buff is played; "taking off" a buff means rendering of the special effect ends, i.e., playing of the buff's special effect is stopped. As can be seen from the figure, if the interactive action is a released skill, then after the server confirms that the skill release has started, the start and the end of rendering of each state each require the server to send an instruction to the terminal, with the start and end instructions sent separately. For example, the server sends a "put on buff1001" instruction to the terminal, which starts playing the lock-on special effect (starts rendering buff1001); the server later sends a "take off buff1001" instruction to the terminal, which ends playing the lock-on special effect (ends rendering buff1001).
When the interactive action corresponds to only one state, the server needs to send that state's rendering-start instruction and rendering-end instruction to the terminal; when the interactive action corresponds to a combination of several states, the server needs to keep sending rendering-start and rendering-end instructions for the different states. Thus, whether the interactive action corresponds to one state or several, the server must send state rendering instructions to the terminal multiple times, so when the network experiences repeated latency (network latency) and fluctuation (network churn), state rendering may be affected multiple times, giving the user the poor experience of repeated stutters. Network latency refers to the time required to transmit a message or packet from one end of a network to the other; in an online game, it is the time from the terminal sending a request to the terminal receiving the reply returned by the server. Network fluctuation refers to large variation of network latency within a short time.
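The per-instruction cost described above can be made concrete with a small counting sketch (the counts follow directly from the text: two instructions per buff conventionally, versus the single object-identifier/state-identifier message of this application).

```python
def conventional_message_count(num_buffs: int) -> int:
    # Prior art: one "start rendering" and one "end rendering"
    # instruction per buff, each sent separately by the server.
    return 2 * num_buffs


def proposed_message_count(num_buffs: int) -> int:
    # Proposed scheme: a single message carrying the object
    # identifier and the state identifier covers all buffs.
    return 1


# The four buffs of Fig. 1 need eight separate server instructions
# conventionally, but only one message under the proposed scheme.
assert conventional_message_count(4) == 8
assert proposed_message_count(4) == 1
```

Each of the eight conventional messages is a separate opportunity for network latency or fluctuation to desynchronize rendering, which is the failure mode fig. 2 illustrates.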
As shown in fig. 2, during state rendering, multiple fluctuations and delays occur on the network (fluctuations and delays occur when the "put on" and "take off" instructions for buff1001 and buff3001 are sent), so two of the state rendering instructions sent by the server to the terminal arrive late. The terminal therefore starts rendering the lock-on special effect late and ends rendering the knock-up special effect late: the user sees the skill hit in the game, but the target is locked only after a while, and after being knocked into the air the character remains airborne even though the knock-up duration has ended, which greatly harms the user's experience. In the prior art, then, the server must send instructions to the terminal multiple times while a state is rendered, and when the network is poor and fluctuation and delay occur repeatedly, the state rendering process on the terminal will very likely be delayed and stutter repeatedly, directly harming the user's experience of fluency and real-time responsiveness in the online game.
Meanwhile, frequent interaction between the server and the terminal occupies more bandwidth, which increases the likelihood of network fluctuation and delay and makes state rendering stutter even more likely.
In order to solve the above technical problem, the present application provides a state processing method, which can be applied to the system architecture shown in fig. 3. The system architecture comprises a server 101 and at least one terminal 102.
The terminal 102 may be installed with a game client (Game Client), a program that corresponds to the game server and provides local services to the user; it is generally installed on an ordinary user machine and needs to cooperate with the server 101 to operate. The terminal 102 may be, for example, an intelligent terminal, a computer, a Personal Digital Assistant (PDA), or a tablet computer; the server 101 may be, for example, a network game server (Game Server) matching the game. A network game server is a software program corresponding to the network game client; it is installed in an Internet Data Center (IDC) and provides data forwarding and logic processing services for the network game client.
After an interactive action is generated, the server 101 may acquire its interaction information. The interactive action may be performed by a game character controlled by a player, or by a prop or a Non-Player Character (NPC) in the game. The interactive action can affect game objects; the interaction information reflects how the interactive action affects game objects, and information related to the game objects affected by the interactive action may be called interaction information, such as the skill used to produce the interactive action, the game object performing the interactive action, and whether the interactive action takes effect on a game object. A game object may be any object included in the game, such as the game character performing an interactive action, a game character hit by an interactive action, a prop in the game, or an NPC.
After acquiring the interaction information of the interactive action, the server 101 determines the first-type game objects associated with the interactive action according to the interaction information. A game object may be associated with an interactive action in the sense that the game object releases the interactive action or the interactive action takes effect on it; first-type game objects may therefore be game objects that release the interactive action, game objects hit by the interactive action, and so on.
Since first-type game objects are directly associated with the interactive action, the state corresponding to the interactive action may be rendered on them so that the user visually perceives the state applied to them through the interactive action.
For example, when the interactive action is game object a releasing a bomb, a state corresponding to releasing the bomb may be applied to game object a so that the user visually perceives the release. Here the interaction information may include what the implemented interactive action is and that game object a is the bomb's releaser, and the first-type game object may then be the game object that releases the bomb. After the server 101 acquires the interaction information, it determines the game object releasing the bomb according to the releaser recorded in the interaction information, and that first-type game object can then be rendered with the special throwing animation, lighting effects, and so on of releasing the bomb.
When the bomb explodes, other game objects may be damaged, and corresponding states may be applied to them so that the user visually perceives the damage the explosion causes. Here the interaction information may include what the implemented interactive action is, the bomb's releaser, the bomb's explosion range, the explosion time, the throwing location, and so on; the first-type game objects hit by the explosion are determined from the throwing location, explosion range, and explosion time, and may be rendered with states such as losing health, being blasted, or being knocked into the air.
After determining the first-type game objects, the server 101 may send an object identifier and a state identifier to the terminal corresponding to each first-type game object. The object identifier is generated from the first-type game objects and identifies them, so the terminal knows which game objects need state rendering. The state identifier is generated from the interactive action and corresponds to the rendering information of the state corresponding to the interactive action. The rendering information includes the time node for starting state rendering and the time node for canceling state rendering. For example, when the first-type game objects are game objects hit by the bomb, the rendering information includes the start and cancel time nodes for states such as losing health, being blasted, and being knocked into the air.
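The message the server sends can be sketched as follows. The patent only names the two fields (object identifier and state identifier); the dictionary layout and field names here are hypothetical.

```python
# Hypothetical server-side sketch: after determining the first-type
# game objects, the server sends each corresponding terminal a single
# message carrying the object identifier(s) and the state identifier.
def build_state_message(object_ids, state_id):
    """Bundle everything the terminal needs into one transmission."""
    return {"object_ids": list(object_ids), "state_id": state_id}


# Two game objects hit by the bomb share the same state identifier;
# the identifiers themselves are made-up example values.
msg = build_state_message(["player_7", "player_9"], state_id=42)
```

The essential property is that this is the only message needed per interactive action: all timing data is recovered on the terminal from `state_id`, not transmitted per state.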
By sending the object identifier and the state identifier to the terminal corresponding to each first-type game object, the state corresponding to the interactive action can be rendered on the first-type game objects. For example, after the bomb explodes, the terminal corresponding to each game object hit by the bomb can show the rendered scene in which all the hit game objects are knocked into the air. After receiving the state identifier, the terminal determines from it all the states corresponding to the interactive action and, for each state, the time node for starting rendering and the time node for canceling rendering, and then renders the series of states in order.
The server can send the object identifier and the state identifier to the terminal, and the state identifier lets the terminal determine all the states corresponding to the interactive action together with each state's rendering start and cancel time nodes. The server therefore needs to transmit to the terminal only once, i.e., a single interaction between server and terminal is enough for the terminal to complete the rendering of every state corresponding to the interactive action. Even when the network fluctuates and lags repeatedly, state rendering on the terminal can be affected at most once, which greatly reduces the impact of network fluctuation and delay and improves the user's experience of fluency and real-time responsiveness in the online game.
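The terminal side of this can be sketched as a local lookup: the state identifier maps to rendering information stored on the terminal, so the full timeline never travels over the network. The table contents and the (buff, start, cancel) layout are assumptions for illustration, loosely following the buff1001/buff3001 example from fig. 1.

```python
# Hypothetical client-side table mapping a state identifier to the
# rendering information for every state in the sequence. Times are
# milliseconds relative to receipt of the server's single message.
RENDER_TABLE = {
    42: [  # (buff_id, start_ms, cancel_ms)
        (1001, 0, 500),     # lock-on special effect
        (3001, 500, 2000),  # knock-up special effect
    ],
}


def schedule_rendering(state_id):
    """Expand one state identifier into chronologically ordered
    start/cancel rendering events for the local render loop."""
    events = []
    for buff_id, start_ms, cancel_ms in RENDER_TABLE[state_id]:
        events.append((start_ms, "start", buff_id))
        events.append((cancel_ms, "cancel", buff_id))
    return sorted(events)
```

Because the schedule is computed locally, later network fluctuation cannot delay the cancel of an already-started state, which is the stutter mode described for the prior art.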
Next, a state processing method provided by an embodiment of the present application will be described with reference to the drawings.
Referring to fig. 4, this figure is a flowchart of a state processing method provided in an embodiment of the present application, where the method includes the following steps:
s401: the server acquires the interaction information of the interaction action.
After an interactive action is generated, the server can acquire the corresponding interaction information for it. The releaser of the interactive action may be a game character controlled by a player, a game prop, or an NPC. For example, when the interactive action is a player controlling a game character to throw a bomb, the releaser is the player-controlled game character; when the interactive action is a trap in the game scene hitting a player-controlled game character, the releaser is the trap, i.e., a game prop; when the interactive action is a monster in the game attacking the player's game character, the releaser is the NPC.
The interaction information corresponding to the interactive action reflects how the interactive action affects game objects, which may include its effect on the releaser of the action or on a hit player. For example, when the interactive action is a player-controlled game character releasing a fireball, then to ensure the fireball produces the correct explosion effect within its blast range, the interaction information the server needs to acquire may include the fireball's explosion range, explosion time, and explosion position, from which the game objects affected by the explosion can be determined.
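The step of using the explosion position and range to find the affected objects can be sketched as a simple range test. A 2-D circular blast and the coordinate values are illustrative assumptions; the patent does not specify the geometry.

```python
import math


# Hypothetical sketch: the server uses the explosion position and
# range from the interaction information to decide which game
# objects are affected and thus belong to the first type.
def affected_objects(positions, blast_center, blast_radius):
    """Return names of objects within blast_radius of blast_center.

    positions maps object name -> (x, y) coordinates.
    """
    cx, cy = blast_center
    return [name for name, (x, y) in positions.items()
            if math.hypot(x - cx, y - cy) <= blast_radius]


objs = affected_objects({"a": (0, 0), "b": (3, 4), "c": (10, 10)},
                        blast_center=(0, 0), blast_radius=5)
assert objs == ["a", "b"]  # "c" is outside the blast range
```

The returned list is exactly the set of first-type game objects to which the server then sends the object identifier and state identifier.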
Since both the releaser and the target of an interactive action may be a player-controlled game character, a game prop, an NPC, etc., the game objects affected by the interactive action may likewise include player-controlled game characters, game props, NPCs, and so on. For example, when a skill released by a game character breaks a pot in the scene, the affected game object is a game prop; when the skill hits another player-controlled game character, the affected game object is a player-controlled game character; when the skill hits a monster on the map, the affected game object is an NPC.
In this embodiment, the manner in which the server acquires the interaction information of the interactive action differs depending on the game object that releases the interactive action.
If the releaser of the interactive action is a player-controlled game character, then, because the player controls the game character through a terminal, when the player triggers the release the terminal sends an interactive-action implementation request to the server. After receiving the request, the server judges whether the interactive action is allowed to be released and, if so, replies to the terminal with an instruction to implement the interactive action. As shown in fig. 5, the first arrow from top to bottom represents that, after the player triggers the skill release (the interactive action), the terminal sends an interactive-action implementation request to the server to confirm whether the skill can be released. The second arrow represents that, after receiving the request and performing a series of checks on the game object releasing the interactive action, the server confirms the skill can be released and replies to the terminal with an instruction to implement the interactive action.
In this case, the server may obtain the interaction information of the interactive action as follows: the server receives an interactive action implementation request sent by the terminal corresponding to the target game object releasing the interactive action, where the request may include a target object identifier of the target game object and an action identifier of the interactive action. On receiving the request, the server extracts the action identifier and the target object identifier from it, the action identifier identifying the interactive action implemented by the player. The server can therefore determine the interaction information of the interactive action according to the target object identifier and the action identifier carried in the implementation request.
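The request handling above can be sketched as follows; this is a minimal illustration in Python, and the message fields and the action table are assumptions, not a format defined by the patent.

```python
# Minimal sketch: the server extracts the target object identifier and the
# action identifier from an interactive action implementation request, then
# resolves the interaction information from a static action table.
def parse_implementation_request(request: dict) -> tuple:
    """Return (target_object_id, action_id) from a request message."""
    return request["target_object_id"], request["action_id"]

def resolve_interaction_info(target_object_id: int, action_id: int,
                             action_table: dict) -> dict:
    """Look up the interaction information for the requested action."""
    info = dict(action_table[action_id])    # copy the static action entry
    info["releaser_id"] = target_object_id  # attach the releasing game object
    return info

# Hypothetical action table: action 7 is a non-directional fireball skill.
ACTIONS = {7: {"name": "fireball", "directional": False}}
request = {"target_object_id": 1001, "action_id": 7}
obj_id, act_id = parse_implementation_request(request)
info = resolve_interaction_info(obj_id, act_id, ACTIONS)
```

The split into a parse step and a resolve step mirrors the text: the identifiers come from the terminal, while the interaction information itself is determined server-side.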
If the releaser of the interactive action is an NPC or a game prop, the server only needs to obtain the interaction information internally and does not need to interact with the terminal, because NPCs and game props are controlled directly by the server.
In addition, the way in which the server obtains the interaction information in different scenarios differs according to the type of the interaction. Interactions can be divided into two categories:
the first type of interaction is directional, i.e., the game object that is hit is already determined when the interaction is implemented. For an interaction of this type, after obtaining the interactive action implementation request the server can obtain the interaction information of both the releaser and the hit game object from the request, so the interaction information needs to be obtained only once during the complete state processing flow: it can be obtained from the implementation request both in the scenario where the interactive action is released by a player-controlled game character and a state is rendered on that character, and in the scenario where the interactive action is released by a player-controlled game character and a state is rendered on the game object hit by the interactive action;
the second type of interaction is non-directional, i.e., no definite game object is hit at the moment the interaction is implemented. For an interaction of this type, the server cannot determine the hit game object at implementation time, so it must obtain the interaction information of the hit game object after the interaction actually hits it. Thus, in the scenario where the interactive action is released by a player-controlled game character and a state is rendered on that character, the server may obtain the interaction information from the interactive action implementation request; in the scenario where the interactive action is released by a player-controlled game character and a state is rendered on the game object hit by the interactive action, the server obtains that interaction information internally after the hit occurs. For example, suppose the interactive action is a game character releasing a fireball. When the release of the fireball is requested (the interactive action implementation request is sent), the server obtains the interaction information of the game character releasing the fireball, to be used for rendering the state of that character; when the fireball explodes, the server obtains internally the interaction information of the game objects caught in the explosion, to be used for rendering the state of the game characters hit by the fireball.
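The difference between the two categories can be sketched as follows; the function and field names are illustrative assumptions. For a directional interaction the hit object travels in the implementation request, so one acquisition suffices; for a non-directional one the hit object is resolved only when the hit lands.

```python
# Illustrative sketch: when the interaction information of the hit game
# object is obtained, depending on whether the interaction is directional.
def gather_interaction_info(request, directional, hit_resolver):
    info = {"releaser": request["releaser"]}
    if directional:
        info["hit"] = request["hit_target"]  # known at release time
    else:
        info["hit"] = hit_resolver()         # resolved when the hit lands
    return info

# Directional skill: the hit target travels in the request itself.
d = gather_interaction_info(
    {"releaser": "A", "hit_target": "B"}, True, None)

# Non-directional skill (e.g. a fireball): target resolved at explosion time.
nd = gather_interaction_info(
    {"releaser": "A"}, False, lambda: "C")
```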
S402: the server determines a first type of game object associated with the interactive action according to the interactive information.
After obtaining the interaction information, the server may determine the first-type game objects associated with the interactive action according to that information. The first-type game objects may include the game object releasing the interactive action or the game objects hit by it; they may differ from one game scene to another, and the first-type game objects determined in different game scenes may include the cases shown in the table below, which gives examples for different interaction scenarios.
As the table shows, different interactive actions may correspond to different types of first-type game objects, but every first-type game object satisfies the precondition of being associated with the interactive action. Therefore, in order to reflect the effect of the interactive action on the first-type game objects, the states corresponding to the interactive action need to be rendered on them.
(Table of first-type game objects in different interaction scenarios; presented as an image in the original publication.)
S403: the server sends the object identifier and the state identifier to the terminal corresponding to the first-type game object.
After determining the first-type game objects, in order to render the states corresponding to the interactive action on them, the server sends the object identifier and the state identifier to the terminals corresponding to the first-type game objects. The object identifier, generated according to the first-type game objects, identifies them so that the terminal knows which game objects require state rendering. The state identifier corresponds to the rendering information of the states corresponding to the interactive action, and the rendering information includes the time node at which rendering of a state starts and the time node at which rendering is cancelled. From the received object identifier and state identifier, the terminal can determine which states need to be rendered on which game objects, together with the time nodes at which rendering starts and ends, thereby completing the rendering of the states corresponding to the interactive action.
In the first case, the state identifier directly contains the rendering information of all states corresponding to the interactive action, such as the rendering start and end time nodes of every state. In the second case, the state identifier contains only an identifier of the states corresponding to the interactive action, such as a number or symbol, which enables the terminal to look up the rendering information of those states in locally stored state rendering information and render accordingly.
A time node may be expressed as absolute time or relative time. Absolute time expresses the time information directly, as in "start rendering state 1 at 13:00:00", where "13:00:00" is an absolute time. Relative time does not express the time information directly but with respect to a reference point, as in "start rendering state 1 after 1 s", where "after 1 s" means 1 s after the terminal receives the state identifier sent by the server; the reference point is the moment the terminal receives the state identifier, so this is a relative time.
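The two time-node forms can be normalized on the terminal as in the following sketch; the field names (`kind`, `time`, `offset`) are illustrative assumptions.

```python
# Sketch of the two time-node forms described above: an absolute node
# carries a wall-clock timestamp, while a relative node is an offset from
# the moment the state identifier is received, so the terminal normalizes
# both to absolute timestamps before scheduling rendering.
def normalize_time_node(node: dict, received_at: float) -> float:
    """Convert a time node to an absolute timestamp in seconds."""
    if node["kind"] == "absolute":
        return node["time"]
    # Relative node: offset measured from receipt of the state identifier.
    return received_at + node["offset"]

received_at = 1000.0  # moment the terminal received the state identifier
abs_node = {"kind": "absolute", "time": 1003.5}
rel_node = {"kind": "relative", "offset": 1.0}  # "after 1 s"
```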
It is understood that the server may send the object identifier and the state identifier to the terminals corresponding to the first-type game objects by broadcasting. With broadcasting, when multiple terminals need the same object identifier and state identifier, they need to be sent only once rather than terminal by terminal, which improves sending efficiency.
S404: the terminal determines the rendering information of the states corresponding to the interactive action according to the state identifier.
After receiving the object identifier and the state identifier sent by the server, the terminal can determine rendering information of a state corresponding to the interactive action according to the state identifier.
It can be understood that, since the state identifier may take different forms, the way in which the terminal determines the rendering information corresponding to the interactive action from the state identifier may also differ. In this embodiment, this may involve the following two cases:
in the first case, when the state identifier itself contains the rendering information of all states corresponding to the interactive action, the terminal can obtain the rendering information directly from the state identifier after receiving it, i.e., find the states to be rendered and the rendering-related time nodes in the content of the state identifier;
in the second case, the rendering information of the states corresponding to the interactive action is stored on the terminal; for example, the terminal may keep a local state list storing the rendering information of all states corresponding to interactive actions. The state identifier then contains only identifiers, such as numbers or symbols, for that rendering information, and after receiving the state identifier the terminal searches the local state list with it to determine the rendering information of the states corresponding to the interactive action.
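The second case can be sketched as a dictionary lookup; the structure of the local state list and the offset fields are assumptions for illustration.

```python
# Sketch of the second case: the terminal keeps a local state list keyed
# by state identifier, and a compact identifier received from the server
# is resolved to the full rendering information stored locally.
LOCAL_STATE_LIST = {
    42: {"state": "knock_up", "start_offset": 0.0, "end_offset": 1.5},
    43: {"state": "glow",     "start_offset": 0.5, "end_offset": 2.0},
}

def resolve_rendering_info(state_ids):
    """Map received state identifiers to locally stored rendering info."""
    return [LOCAL_STATE_LIST[s] for s in state_ids]

infos = resolve_rendering_info([42, 43])
```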
The states corresponding to interactive actions are divided into logic states (Logic Buff) and performance states (Behavior Buff). A logic state carries functional logic and directly affects combat values; functional states such as damage reduction, health regeneration, and damage reflection fall into this category. A performance state carries no functional logic and does not affect combat values, only the combat presentation; effects such as special effects, body transformation, knock-up, and stun visuals fall into this category.
It can be understood that, since logic states directly affect combat values, if in the second case of determining the rendering information according to the state identifier the rendering information corresponding to a logic state were stored on the terminal, it could easily be modified maliciously by cheating plug-ins; modifying the rendering information of a logic state would seriously damage the balance of the game and interfere with its normal progress. Therefore, to prevent such modification, ensure game balance, and ensure normal operation of the game, in one possible implementation, when the rendering information corresponding to the interactive action is determined according to the state identifier, the states corresponding to the interactive action are restricted to those states, among the states to be rendered as a result of implementing the interactive action, that do not change the values of the game object.
For example, suppose the interactive action is game character A, controlled by player A, releasing an offensive skill at game character B, controlled by player B. The skill can apply a continuous-bleed logic state to game character B as well as performance states such as knock-up, a special light effect, and spinning. In this case, the state identifier sent by the server to the terminal may include identifiers of the rendering information corresponding to the knock-up, special-light-effect, and spinning performance states, while the rendering information of the continuous-bleed logic state is sent by the server directly to the terminal. This ensures that the rendering information of the continuous-bleed logic state, which could affect the normal operation of the game, is delivered by the server rather than obtained from information stored on the terminal, reducing the possibility of malicious modification by cheating plug-ins and preserving the balance and normal operation of the game.
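The split described above can be sketched server-side as follows; the payload shapes are assumptions. Performance states travel as compact identifiers to be resolved from the terminal's local list, while logic states, which affect combat values, always carry their full rendering information from the server.

```python
# Hedged sketch: build the payload for one interactive action, shipping
# logic states with full server-supplied rendering information and
# performance states as identifiers only.
def pack_state_identifier(states):
    """Server side: build the state payload for one interactive action."""
    payload = []
    for s in states:
        if s["kind"] == "logic":
            payload.append({"full_info": s["rendering_info"]})
        else:  # performance state: ship only the identifier
            payload.append({"id": s["id"]})
    return payload

states = [
    {"kind": "logic",
     "rendering_info": {"state": "bleed", "start": 0, "end": 5}},
    {"kind": "performance", "id": 42},  # e.g. the knock-up effect
]
payload = pack_state_identifier(states)
```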
Meanwhile, among game states, performance states usually greatly outnumber logic states; as shown in fig. 7, in one game the proportion of performance states is about three times that of logic states. Transmitting performance states as compact identifier-form state identifiers therefore saves a large amount of game bandwidth and makes the game run more smoothly.
S405: the terminal starts and cancels rendering of the states, on the first-type game objects determined according to the object identifier, at the time nodes indicated by the rendering information.
After receiving the object identifier sent by the server, the terminal can determine from it the first-type game objects on which state rendering is required, and start and cancel rendering of the states on those game objects at the time nodes indicated by the rendering information.
As shown in fig. 5, fig. 5 is a schematic diagram of the terminal starting and cancelling rendering of states at the time nodes indicated by the rendering information. It is understood that the buffs to be applied in fig. 5 are those described in the corresponding part of fig. 1, and buffs 1001 to 4001 are likewise rendered in sequence in fig. 5. As the figure shows, during state rendering the server only needs to send the object identifier and the state identifier to the terminal once, no matter how many states need to be rendered, for the entire rendering process to complete. Even if the network suffers delay and fluctuation, as shown in fig. 6, where the network fluctuates while the server sends the object identifier and state identifier to the terminal, at most the single transmission of the identifiers from server to terminal is affected, however many times the delay and fluctuation occur, so the terminal's state rendering is affected at most once. Once the terminal has received the object identifier and the state identifier, it can carry out the whole series of state renderings from that identifier information without the server sending any further messages; even if network delay and fluctuation occur during this process, the state rendering on the terminal is unaffected, because no information is being exchanged between terminal and server.
In fig. 5 and fig. 6, the different buffs are rendered sequentially, i.e., the next buff is rendered after the previous one finishes. It is understood that rendering of the next buff may also start before rendering of the previous buff has finished; the timing of buff rendering is determined solely by the state identifier corresponding to the interactive action and follows no fixed order.
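The one-message-then-local-playback behavior can be sketched as follows; the field names are illustrative assumptions. From the single received message the terminal derives every start and cancel event, including overlapping buffs, and plays them back in time order with no further server round-trips.

```python
# Illustrative sketch: expand the rendering information carried by one
# message into a time-ordered schedule of start/cancel rendering events.
def build_render_schedule(rendering_infos, received_at):
    """Expand rendering info into sorted (time, event, state) tuples."""
    events = []
    for info in rendering_infos:
        events.append((received_at + info["start_offset"], "start", info["state"]))
        events.append((received_at + info["end_offset"], "cancel", info["state"]))
    return sorted(events)

infos = [
    {"state": "buff1", "start_offset": 0.0, "end_offset": 2.0},
    {"state": "buff2", "start_offset": 1.0, "end_offset": 3.0},  # overlaps buff1
]
schedule = build_render_schedule(infos, received_at=100.0)
```

Note that buff2 starts before buff1 ends, matching the overlapping-rendering case described above.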
It is understood that in an online game there may be other game objects observing the first-type game objects, with the first-type game objects appearing within those observers' visual display range. Therefore, when the states of the first-type game objects are rendered, the players corresponding to those observing game objects should also see the state rendering through their own terminals. For example, when a player controls game character C to throw a bomb at game character D, state rendering must be performed on game character D to show the effect of the bomb; game characters E and F, whose visual display ranges cover the scene from bomb throw to explosion, must have the same state rendering performed on their terminals. Since the interactive action is reflected on the first-type game objects, whether a game object can display the first-type game objects within its visual display range determines whether it is a second-type game object; that is, the second-type game objects are determined according to the position information of the first-type game objects, by checking whether the positions of the first-type game objects fall within a game object's visual display range. After obtaining the state identifier and the object identifier, the server may send them to the terminals corresponding to both the first-type and the second-type game objects, so that the second-type game objects can see the state rendering of the interactive action on the first-type game objects.
After the terminal of a second-type game object receives the object identifier and the state identifier, it performs the state rendering corresponding to the interactive action on the first-type game objects according to the rendering steps described above.
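Determining the second-type game objects can be sketched as a visibility check; the circular, distance-based field of view below is an assumption, since the patent only requires that the first-type object fall within the observer's visual display range.

```python
# Minimal sketch: return observers whose visual display range contains the
# position of a first-type game object (circular view range assumed).
def second_type_objects(first_pos, observers, view_radius):
    """Return ids of observers whose visual field contains first_pos."""
    def in_view(obs):
        dx = obs["pos"][0] - first_pos[0]
        dy = obs["pos"][1] - first_pos[1]
        return dx * dx + dy * dy <= view_radius * view_radius
    return [obs["id"] for obs in observers if in_view(obs)]

observers = [
    {"id": "E", "pos": (3, 4)},    # distance 5: inside a radius of 10
    {"id": "F", "pos": (6, 8)},    # distance 10: on the boundary
    {"id": "G", "pos": (30, 40)},  # distance 50: out of range
]
nearby = second_type_objects((0, 0), observers, view_radius=10)
```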
According to the technical scheme above, the server can determine the first-type game objects associated with an interactive action from the interaction information of that action, and the states corresponding to the interactive action need to be rendered on those objects. The server sends an object identifier and a state identifier to the terminals corresponding to the determined first-type game objects: the object identifier lets the terminal determine which game objects need state rendering, while the state identifier corresponds to the rendering information of the states corresponding to the interactive action, which includes the time nodes for starting and cancelling rendering. Thus, for an implemented interactive action, a single transmission of the object identifier and state identifier tells the terminal both which objects to render states on and all the data for starting and cancelling those renderings. This effectively reduces the number of data interactions between server and terminal and improves network utilization efficiency; even if network fluctuation occurs, at most the one transmission of the object identifier and state identifier is affected, and the start and cancel times of the renderings derived from those identifiers are affected little. The synchronization between terminal-side state rendering and the interactive action is therefore better preserved under network fluctuation, improving the smoothness and real-time feel of the game.
Meanwhile, in Massively Multiplayer Online Role-Playing Games (MMORPG), players demand ever-higher quality from interactive features, and the presentation of skills becomes ever more spectacular; more spectacular skill presentation corresponds to more complex state components, so within the game's total traffic the bandwidth required for state synchronization is enormous. As shown in fig. 8, in one online game the combat traffic comprises state synchronization traffic, skill broadcast traffic, movement broadcast traffic, attribute synchronization traffic, and other traffic, with state synchronization accounting for 10% of the game's total combat traffic. In addition, players' demand for large-scale multiplayer combat experiences is growing rapidly. Traditional games are limited by the hardware of the game server (Game Server) and game client (Game Client), so the number of players fighting on the same screen is kept on a small scale, such as 50 or 100 players per screen. However, as player expectations rise and hardware improves, scenarios with 500 or even 1000 players fighting on the same screen are now often required, so that players can truly experience the appeal of large-scale battles.
In such cases, because the number of players is huge, the states to be reflected on them are numerous in both number and variety. In the traditional way of applying states, the server must send the rendering start and end time nodes of every state corresponding to the interactive action to the terminal of every player, so the volume of state data transmitted between server and client is large. Where a player's local bandwidth is limited, transmitting this volume of state data takes longer, easily causing network delay and fluctuation. At the same time, the large volume of state data occupies a great deal of bandwidth, limiting the transmission of other data and easily causing mass player disconnections. Moreover, the huge bandwidth consumption imposes high bandwidth costs on operators, greatly increasing the operating cost of the game and reducing the product's profit margin.
With the technical scheme of the second case of determining the rendering information corresponding to the interactive action according to the state identifier, the server can send only a state identifier that lets the terminal find the rendering information in its local state list, instead of sending all the state information. This greatly reduces the amount of data the server must send to the terminal, effectively reduces the bandwidth occupied by state data transmission, lowers the probability of network delay and fluctuation, reduces the likelihood of player disconnection, cuts the bandwidth cost borne by the operator, and improves the product's profit margin.
Next, the state processing method provided in the embodiments of the present application is described with reference to a practical application scenario. In this scenario, the game is a multiplayer shooting game, the releaser of the interactive action is a player-controlled gunner, the interactive action is the gunner throwing a grenade, and the game objects hit by the interactive action are the gunners controlled by other players in the game. The flow of the state processing method is shown in fig. 9 and comprises the following steps:
s901: the releaser terminal sends an interactive action enforcement request to throw a grenade to the server.
When the player wants to throw a grenade, the player presses the corresponding key to trigger the throw operation. In response to this trigger operation, the releaser terminal sends an interactive action implementation request to the server to throw a grenade.
S902: the server acquires interactive information of the throwing grenade.
The interaction information includes the thrower's remaining number of grenades, whether the thrower's game character is alive, the game character information for the grenade throw, and so on.
S903: the server determines a state corresponding to the grenade throwing and determines a game role of the grenade throwing according to the interactive information.
The server determines the thrower's game character according to the interaction information, and determines the corresponding states according to the grenade-throwing interaction.
S904: the server replies to the releaser terminal that the grenade throw may begin, and sends the object identifier of the thrower's game character and the state identifier corresponding to throwing the grenade.
The server judges whether the grenade throw can proceed according to the interaction information obtained, such as the thrower's remaining number of grenades and the thrower's game character; after judging that the throw can proceed, the server replies to the releaser terminal with a start-throwing message and sends the object identifier of the thrower's game character and the state identifier corresponding to throwing the grenade.
S905: the releaser terminal determines the rendering information of the states corresponding to throwing the grenade according to the state identifier.
The states corresponding to throwing the grenade may include the game character glowing red and a special sound effect.
S906: the releaser terminal starts and cancels rendering of the states, on the thrower's game character determined according to the object identifier, at the time nodes indicated by the rendering information.
S907: the server obtains the interactive information when the grenade hits.
After the game character throws the grenade, the grenade explodes after a period of time. At the moment of explosion, the server obtains the interaction information, which includes the explosion location, the explosion range, and so on, used to determine the game objects hit by the grenade.
S908: the server determines the hit status and determines the game character hit by the grenade based on the interaction information.
The server determines the hit game characters according to the interaction information, such as the grenade's throw time, throw location, and explosion range; the hit states may include being knocked into the air, being knocked down, and so on.
S909: the server sends the object identifiers of the hit game characters and the state identifiers corresponding to being hit to the hit players' terminals.
S910: the hit player's terminal determines the rendering information of the states corresponding to being hit according to the state identifier.
S911: the hit player's terminal starts and cancels rendering of the states, on the hit game character determined according to the object identifier, at the time nodes indicated by the rendering information.
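The two server-side acquisitions in this scenario (at throw time, S902-S904, and at explosion time, S907-S909) can be sketched end to end as follows; all identifiers, state numbers, and the blast-radius model are illustrative assumptions, not from the patent.

```python
# Hedged end-to-end sketch of the grenade scenario (S901-S911).
def server_handle_throw(request, state_table):
    """S902-S904: validate the throw and return (object id, state ids)."""
    if request["grenades_left"] <= 0 or not request["alive"]:
        return None  # throw not allowed
    return request["thrower_id"], state_table["throw"]

def server_handle_explosion(explosion, roster, state_table):
    """S907-S909: find characters in the blast radius, return hit states."""
    cx, cy = explosion["pos"]
    r2 = explosion["radius"] ** 2
    hit = [pid for pid, (x, y) in roster.items()
           if (x - cx) ** 2 + (y - cy) ** 2 <= r2]
    return hit, state_table["hit"]

STATES = {"throw": [101], "hit": [201, 202]}  # e.g. red glow; knock-up/down
ok = server_handle_throw(
    {"thrower_id": "gunner1", "grenades_left": 2, "alive": True}, STATES)
hits = server_handle_explosion(
    {"pos": (0, 0), "radius": 5}, {"gunner2": (3, 4), "gunner3": (9, 9)}, STATES)
```

Each returned (object identifier, state identifier) pair corresponds to one message from the server to the relevant terminals, matching the two sends in the flow above.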
Based on the state processing method provided by the foregoing embodiments, this embodiment provides a state processing apparatus. Referring to fig. 10a, the apparatus includes a first obtaining unit 1001, a first determining unit 1002, and a sending unit 1003:
a first obtaining unit 1001, configured to obtain interaction information of an interaction, where the interaction information is used to reflect an influence situation of the interaction on a game object;
a first determining unit 1002, configured to determine, according to the interaction information, the first-type game objects associated with the interactive action, where the states corresponding to the interactive action are to be rendered on the first-type game objects;
a sending unit 1003, configured to send an object identifier and a state identifier to a terminal corresponding to a first type game object; the object identification is used for identifying a first type game object, the state identification corresponds to rendering information of a state corresponding to the interaction action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state.
In one implementation, referring to fig. 10b, the state processing implementation apparatus further includes a second determining unit 1004:
a second determining unit 1004, configured to determine second-type game objects according to the position information of the first-type game objects, where the first-type game objects are displayed within the visual display range of the second-type game objects.
In this implementation manner, the sending unit 1003 is specifically configured to send the object identifier and the state identifier to the terminal corresponding to the first class game object and the terminal corresponding to the second class game object.
In one implementation, the state corresponding to the interactive action is a state in which the value of the game object is not changed in the state to be rendered by implementing the interactive action.
In one implementation, referring to fig. 10c, the state processing implementation apparatus further includes a second obtaining unit 1005:
a second obtaining unit 1005, configured to obtain an interactive action implementation request sent by a terminal corresponding to a target game object; the interactive action implementation request includes a target object identification of the target game object and an action identification of the interactive action.
In this implementation, obtaining the interaction information of the interactive action specifically comprises determining the interaction information of the interactive action according to the target object identifier and the action identifier.
In one implementation, the sending unit 1003 sends the object identifier and the state identifier to the terminal corresponding to the first type game object by broadcasting.
Based on a state processing method provided by the foregoing embodiment, the present embodiment provides a state processing apparatus 1100, referring to fig. 11, the apparatus including an acquisition unit 1101, a determination unit 1102, and a rendering unit 1103:
an obtaining unit 1101, configured to obtain a state identifier and an object identifier sent by a server; the object identification is used for identifying a first type game object associated with the interactive action;
a determining unit 1102, configured to determine rendering information of a state corresponding to the interactive action according to the state identifier, where the rendering information includes a time node for starting a rendering state and a time node for canceling the rendering state;
a rendering unit 1103, configured to start and cancel a rendering state according to a time node indicated by the rendering information on the first type game object determined according to the object identifier.
In one implementation, the determining unit 1102 is configured to:
and searching a local state list through the state identifier to determine rendering information of the state corresponding to the interactive action.
The embodiments of the present application further provide an implementation apparatus for state processing, described below with reference to the accompanying drawings. Referring to fig. 12, an implementation apparatus 1200 for state processing is provided in the embodiments of the present application. The apparatus 1200 may also be a terminal device, which may be any intelligent terminal including a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sales (POS) terminal, or a vehicle-mounted computer; the following takes a mobile phone as an example:
fig. 12 is a block diagram illustrating a partial structure of a mobile phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 12, the cellular phone includes: radio Frequency (RF) circuit 1210, memory 1220, input unit 1230, display unit 1240, sensor 1250, audio circuit 1260, wireless fidelity (WiFi) module 1270, processor 1280, and power supply 1290. Those skilled in the art will appreciate that the handset configuration shown in fig. 12 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 12:
the RF circuit 1210 may be configured to receive and transmit signals during message transmission or a call; in particular, it receives downlink information from a base station, forwards it to the processor 1280 for processing, and transmits uplink data to the base station.
The memory 1220 may be used to store software programs and modules, and the processor 1280 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1220. The memory 1220 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook), and the like. Further, the memory 1220 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 1230 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1230 may include a touch panel 1231 and other input devices 1232. The touch panel 1231, also referred to as a touch screen, can collect touch operations of a user (e.g., operations of the user on or near the touch panel 1231 using any suitable object or accessory such as a finger, a stylus, etc.) thereon or nearby, and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 1231 may include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1280, and can receive and execute commands sent by the processor 1280. In addition, the touch panel 1231 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1230 may include other input devices 1232 in addition to the touch panel 1231. In particular, other input devices 1232 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1240 may be used to display information input by or provided to the user and the various menus of the mobile phone. The display unit 1240 may include a display panel 1241, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch panel 1231 may cover the display panel 1241; when the touch panel 1231 detects a touch operation on or near it, it transmits the operation to the processor 1280 to determine the type of the touch event, and the processor 1280 then provides a corresponding visual output on the display panel 1241 according to the type of the touch event.
The cell phone may also include at least one sensor 1250, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1241 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1241 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 1260, the speaker 1261, and the microphone 1262 can provide an audio interface between the user and the mobile phone. The audio circuit 1260 can transmit the electrical signal converted from received audio data to the speaker 1261, which converts it into a sound signal for output; on the other hand, the microphone 1262 converts a collected sound signal into an electrical signal, which the audio circuit 1260 receives and converts into audio data; after the audio data is processed by the processor 1280, it is transmitted through the RF circuit 1210 to, for example, another mobile phone, or output to the memory 1220 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1270, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 12 shows the WiFi module 1270, it is understood that it is not an essential component of the mobile phone and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 1280 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1220 and calling data stored in the memory 1220, thereby performing overall monitoring of the mobile phone. Optionally, processor 1280 may include one or more processing units; preferably, the processor 1280 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 1280.
The handset also includes a power supply 1290 (e.g., a battery) for powering the various components, and preferably, the power supply may be logically connected to the processor 1280 via a power management system, so that the power management system may manage the charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 1280 included in the terminal device further has the following functions:
acquiring a state identifier and an object identifier sent by a server; the object identification is used for identifying a first type of game object associated with the interactive action;
determining rendering information of a state corresponding to the interactive action according to the state identifier, wherein the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state;
starting and canceling the rendering state on the first type game object determined according to the object identifier, at the time nodes indicated by the rendering information.
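One way the terminal could service the two time nodes is with an ordinary time-ordered event queue: a start event and a cancel event are scheduled from the rendering information, then dispatched as game time advances. The sketch below is one possible shape under assumed names; the patent does not prescribe this mechanism:

```python
import heapq

def schedule_state(events, now_ms, object_id, rendering_info):
    """Queue one event that starts the rendering state on the identified
    game object and one that cancels it, at the two time nodes."""
    heapq.heappush(events, (now_ms + rendering_info["start_ms"], "start", object_id))
    heapq.heappush(events, (now_ms + rendering_info["cancel_ms"], "cancel", object_id))

def drain_due(events, current_ms):
    """Pop and return every event whose time node has been reached."""
    due = []
    while events and events[0][0] <= current_ms:
        due.append(heapq.heappop(events))
    return due
```

Driving `drain_due` once per frame would start the effect at the first time node and cancel it at the second without any per-frame state bookkeeping elsewhere.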
Referring to fig. 13, fig. 13 is a block diagram of a server 1300 provided in the embodiment of the present application. The server 1300 may vary considerably in configuration or performance, and may include one or more Central Processing Units (CPUs) 1322 (e.g., one or more processors), a memory 1332, and one or more storage media 1330 (e.g., one or more mass storage devices) storing an application program 1342 or data 1344. The memory 1332 and the storage medium 1330 may be transitory or persistent storage. The program stored on the storage medium 1330 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Still further, the central processing unit 1322 may be arranged to communicate with the storage medium 1330 and execute, on the server 1300, the series of instruction operations in the storage medium 1330.
The server 1300 may also include one or more power supplies 1326, one or more wired or wireless network interfaces 1350, one or more input-output interfaces 1358, and/or one or more operating systems 1341, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
In this embodiment, the server 1300 further has the following functions:
acquiring interactive information of the interactive action, wherein the interactive information is used for reflecting the influence condition of the interactive action on the game object;
determining a first type game object associated with the interactive action according to the interaction information, wherein the first type game object is a game object whose state corresponding to the interactive action is to be rendered;
sending an object identifier and a state identifier to a terminal corresponding to the first type game object; the object identifier is used for identifying the first type game object, the state identifier corresponds to rendering information of a state corresponding to the interactive action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 13.
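The server-side steps above can be sketched compactly. The data shapes here are assumptions for illustration (the patent does not specify them): the interaction information lists the game objects the interactive action affects, and the server builds an (object identifier, state identifier) notification for each first type game object, i.e. each object whose state must be rendered:

```python
def determine_first_type_objects(interaction_info):
    """Select the first type game objects: objects the interactive action
    affects whose corresponding state needs to be rendered."""
    return [obj for obj in interaction_info["affected"] if obj["must_render"]]

def build_notifications(interaction_info):
    """Pair each first type game object's identifier with the state
    identifier that the terminals will resolve locally."""
    state_id = interaction_info["state_id"]
    return [(obj["object_id"], state_id)
            for obj in determine_first_type_objects(interaction_info)]
```

Each resulting pair is what gets sent to the corresponding terminal; the rendering information itself never crosses the network.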
The embodiment of the present application further provides a computer-readable storage medium, configured to store a program code, where the program code is configured to execute any one implementation of the state processing implementation method described in the foregoing embodiments.
As shown in fig. 14, fig. 14 is a structural diagram of a state processing system 1400 provided in the embodiment of the present application, and includes a server 1401 and a terminal 1402:
a server 1401, configured to obtain interaction information of the interaction action, where the interaction information is used to reflect an influence situation of the interaction action on the game object; determining a first type game object associated with the interactive action according to the interactive information, wherein the first type game object is in a state corresponding to the interactive action to be rendered; sending the object identifier and the state identifier to the terminal 1402 corresponding to the first type game object; the object identification is used for identifying a first type game object, the state identification corresponds to rendering information of a state corresponding to the interactive action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state;
a terminal 1402, configured to obtain the state identifier and the object identifier sent by the server 1401, the object identifier being used for identifying the first type game object associated with the interactive action; determine rendering information of a state corresponding to the interactive action according to the state identifier, where the rendering information includes a time node for starting a rendering state and a time node for canceling the rendering state; and start and cancel the rendering state on the first type game object determined according to the object identifier, at the time nodes indicated by the rendering information.
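The whole server/terminal exchange can be traced end to end in a few lines. Every structure below is assumed purely for illustration: the server sends only the two identifiers, and the terminal resolves the state identifier against its own table and keeps the rendering state active between the start and cancel time nodes:

```python
# Hypothetical terminal-local table keyed by state identifier.
STATE_TABLE = {7: {"start_ms": 0, "cancel_ms": 1000, "effect": "frozen"}}

def server_message(object_id, state_id):
    """What the server transmits: identifiers only, no rendering data."""
    return {"object_id": object_id, "state_id": state_id}

def start_rendering(msg, active_states):
    """Resolve the state identifier locally and start the rendering state."""
    info = STATE_TABLE[msg["state_id"]]
    active_states[msg["object_id"]] = info["effect"]  # start the rendering state
    return info["cancel_ms"]                          # time node at which to cancel

def cancel_rendering(msg, active_states):
    active_states.pop(msg["object_id"], None)         # cancel the rendering state
```

This mirrors the division of labor in the system: the server decides who is affected, while each terminal owns the timing and visuals of the state.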
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium may be at least one of the following media: various media that can store program codes, such as read-only memory (ROM), RAM, magnetic disk, or optical disk.
It should be noted that, in the present specification, all the embodiments are described in a progressive manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described embodiments of the apparatus and system are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A method of state processing, the method comprising:
acquiring interaction information of an interaction action, wherein the interaction information is used for reflecting the influence condition of the interaction action on a game object;
determining a first type of game object associated with the interactive action according to the interactive information, wherein the first type of game object is a game object whose state corresponding to the interactive action is to be rendered;
sending an object identifier and a state identifier to a terminal corresponding to the first type game object; the object identifier is used for identifying the first type game object, the state identifier corresponds to rendering information of a state corresponding to the interactive action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state.
2. The method of claim 1, further comprising:
determining a second type of game object according to the position information of the first type of game object, wherein the first type of game object is displayed in the visual field display range of the second type of game object;
the sending of the object identifier and the state identifier to the terminal corresponding to the first type game object includes:
and sending the object identification and the state identification to the terminal corresponding to the first class game object and the terminal corresponding to the second class game object.
3. The method according to claim 1, wherein the state corresponding to the interactive action is a state, among the states required to be rendered when the interactive action is implemented, in which the attribute values of the game object are not changed.
4. The method of claim 1, wherein prior to said obtaining interaction information for an interaction, the method further comprises:
acquiring an interactive action implementation request sent by a terminal corresponding to a target game object; the interactive action implementation request comprises a target object identification of the target game object and an action identification of the interactive action;
the acquiring of the interaction information of the interaction action comprises:
and determining the interaction information of the interaction action according to the target object identification and the action identification.
5. The method of claim 1, wherein the sending the object identifier and the state identifier to the terminal corresponding to the first type game object comprises:
and sending the object identification and the state identification to the terminal corresponding to the first type game object in a broadcasting mode.
6. A state processing apparatus, comprising a first acquisition unit, a first determination unit, and a transmission unit:
the first acquisition unit is used for acquiring interaction information of an interaction action, and the interaction information is used for reflecting the influence condition of the interaction action on a game object;
the first determining unit is configured to determine, according to the interaction information, a first type of game object associated with the interaction action, where the first type of game object is a game object in a state corresponding to the interaction action to be rendered;
the sending unit is used for sending the object identifier and the state identifier to the terminal corresponding to the first type game object; the object identifier is used for identifying the first type game object, the state identifier corresponds to rendering information of a state corresponding to the interactive action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state.
7. The apparatus according to claim 6, wherein the apparatus further comprises a second determining unit:
the second determining unit is used for determining a second type of game object according to the position information of the first type of game object, and the first type of game object is displayed in the visual field display range of the second type of game object;
the sending unit is configured to send the object identifier and the state identifier to a terminal corresponding to the first type game object and a terminal corresponding to the second type game object.
8. The apparatus according to claim 6, wherein the state corresponding to the interactive action is a state, among the states required to be rendered when the interactive action is implemented, in which the attribute values of the game object are not changed.
9. The apparatus according to claim 6, wherein the sending unit is specifically configured to send the object identifier and the status identifier to the terminal corresponding to the first type game object in a broadcast manner.
10. A method of state processing, the method comprising:
acquiring a state identifier and an object identifier sent by a server; the object identification is used for identifying a first type of game object associated with the interactive action;
determining rendering information of a state corresponding to the interactive action according to the state identifier, wherein the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state;
starting and canceling the rendering state on the first type game object determined according to the object identifier, at the time nodes indicated by the rendering information.
11. The method according to claim 10, wherein the determining rendering information of the state corresponding to the interaction according to the state identifier includes:
and searching a local state list through the state identifier to determine rendering information of the state corresponding to the interactive action.
12. A state processing apparatus, characterized in that the apparatus comprises an acquisition unit, a determination unit, a rendering unit:
the acquiring unit is used for acquiring the state identifier and the object identifier sent by the server; the object identification is used for identifying a first type of game object associated with the interactive action;
the determining unit is configured to determine rendering information of a state corresponding to the interactive action according to the state identifier, where the rendering information includes a time node for starting a rendering state and a time node for canceling the rendering state;
and the rendering unit is used for starting and canceling a rendering state according to the time node indicated by the rendering information on the first type game object determined according to the object identification.
13. An implementing device for state processing, the device comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the state processing method of any one of claims 1-5 or 10-11 according to instructions in the program code.
14. A computer-readable storage medium, characterized in that the computer-readable storage medium is configured to store a program code for executing the state processing method of any of claims 1-5 or claims 10-11.
15. A state processing system, characterized in that the system comprises a server and a terminal:
the server is used for acquiring interaction information of the interactive action, wherein the interaction information is used for reflecting the influence condition of the interactive action on the game object; determining a first type of game object associated with the interactive action according to the interaction information, wherein the first type of game object is a game object whose state corresponding to the interactive action is to be rendered; and sending an object identifier and a state identifier to a terminal corresponding to the first type game object; the object identifier is used for identifying the first type game object, the state identifier corresponds to rendering information of a state corresponding to the interactive action, and the rendering information comprises a time node for starting a rendering state and a time node for canceling the rendering state;
the terminal is used for acquiring the state identifier and the object identifier sent by the server, and starting and canceling the rendering state on the first type game object determined according to the object identifier, at the time nodes indicated by the rendering information.
CN201911032443.8A 2019-10-28 2019-10-28 State processing method and related device Active CN110711380B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911032443.8A CN110711380B (en) 2019-10-28 2019-10-28 State processing method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911032443.8A CN110711380B (en) 2019-10-28 2019-10-28 State processing method and related device

Publications (2)

Publication Number Publication Date
CN110711380A CN110711380A (en) 2020-01-21
CN110711380B (en) 2020-07-24

Family

ID=69214401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911032443.8A Active CN110711380B (en) 2019-10-28 2019-10-28 State processing method and related device

Country Status (1)

Country Link
CN (1) CN110711380B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111672124B (en) * 2020-06-06 2022-02-22 腾讯科技(深圳)有限公司 Control method, device, equipment and medium of virtual environment
CN111800481B (en) * 2020-06-18 2022-11-22 广州图凌云科技有限公司 Internet-based network interaction method and device and server
CN112057849A (en) * 2020-09-15 2020-12-11 网易(杭州)网络有限公司 Game scene rendering method and device and electronic equipment
CN112221148B (en) * 2020-10-15 2024-03-22 网易(杭州)网络有限公司 Game skill release state synchronization method, server and readable storage medium
CN113181645A (en) * 2021-05-28 2021-07-30 腾讯科技(成都)有限公司 Special effect display method and device, electronic equipment and storage medium
CN116778079B (en) * 2023-05-26 2024-05-17 上海兴岩信息科技有限公司 Three-dimensional visual production management method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105214309A (en) * 2015-10-10 2016-01-06 腾讯科技(深圳)有限公司 A kind of information processing method, terminal and computer-readable storage medium
CN107656674A (en) * 2017-09-26 2018-02-02 网易(杭州)网络有限公司 Information interacting method, device, electronic equipment and storage medium
CN108245885A (en) * 2017-12-29 2018-07-06 网易(杭州)网络有限公司 Information processing method, device, mobile terminal and storage medium
CN109589605A (en) * 2018-12-14 2019-04-09 网易(杭州)网络有限公司 The display control method and device of game

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160023112A1 (en) * 2014-07-24 2016-01-28 King.Com Limited System for updating attributes


Also Published As

Publication number Publication date
CN110711380A (en) 2020-01-21

Similar Documents

Publication Publication Date Title
CN110711380B (en) State processing method and related device
CN111773696B (en) Virtual object display method, related device and storage medium
CN111182355B (en) Interaction method, special effect display method and related device
KR102319206B1 (en) Information processing method and device and server
US11673048B2 (en) Contextually aware communications system in video games
EP2811712A1 (en) Information processing apparatus, information processing system, information processing program and information processing method
WO2022247129A1 (en) Method and apparatus for generating special effect in virtual environment, and device and storage medium
CN113617028B (en) Control method, related device, equipment and storage medium for virtual prop
WO2022083451A1 (en) Skill selection method and apparatus for virtual object, and device, medium and program product
CN114911558A (en) Cloud game starting method, device and system, computer equipment and storage medium
CN114367111A (en) Game skill control method, related device, equipment and storage medium
CN114159789A (en) Game interaction method and device, computer equipment and storage medium
CN111803961B (en) Virtual article recommendation method and related device
CN112044072A (en) Interaction method of virtual objects and related device
US20220379208A1 (en) Method and apparatus for generating special effect in virtual environment, device, and storage medium
CN110743167A (en) Method and device for realizing interactive function
CN113599825A (en) Method and related device for updating virtual resources in game match
CN113797544A (en) Attack control method and device for virtual object, computer equipment and storage medium
CN113018857A (en) Game operation data processing method, device, equipment and storage medium
WO2024098984A1 (en) Virtual-prop control method and apparatus, and device and storage medium
KR102423817B1 (en) Method for game service and computing device for executing the method
JP2020014715A (en) Video game processing program and video game processing system
CN112843682B (en) Data synchronization method, device, equipment and storage medium
CN117919717A (en) Control method, related device, equipment and storage medium for virtual prop
CN117643723A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40021895

Country of ref document: HK