CN108786108B - Target object control method, device, storage medium and equipment

Info

Publication number
CN108786108B
Authority
CN
China
Prior art keywords
target object
movement
moving
track
control
Prior art date
Legal status
Active
Application number
CN201810595628.9A
Other languages
Chinese (zh)
Other versions
CN108786108A (en)
Inventor
陈瑭羲
阳小波
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810595628.9A priority Critical patent/CN108786108B/en
Publication of CN108786108A publication Critical patent/CN108786108A/en
Application granted granted Critical
Publication of CN108786108B publication Critical patent/CN108786108B/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/355 Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/53 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
    • A63F2300/538 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for performing operations on behalf of the game client, e.g. rendering
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/646 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car for calculating the trajectory of an object

Abstract

The invention discloses a target object control method, apparatus, storage medium and device, belonging to the field of internet technology. The method comprises the following steps: after a game starts, receiving a first movement track of a target object issued by a server, wherein the first movement track is obtained by the server, after the target object collides with a first control object of the attacking party, according to the movement data of the target object after the collision; receiving a movement time parameter that is sent by the server and matches the first movement track; and controlling the target object to move on the display interface based on the first movement track and the movement time parameter, wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round. Because the movement track is issued to the terminal in advance, the display effect of the terminal is ensured even under poor network conditions, and the problems of an insufficiently smooth game interface and a poor display effect caused by disordered control of the target object do not occur.

Description

Target object control method, device, storage medium and equipment
Technical Field
The present invention relates to the field of internet technologies, and in particular, to a method, an apparatus, a storage medium, and a device for controlling a target object.
Background
Network games are a common form of entertainment: users can enjoy a game through the game interface displayed by a terminal without leaving home. Network games are varied; for example, the ball-hitting game is currently an online game widely enjoyed by users. In a form of the ball-hitting game similar to table tennis, each user participating in the game is provided with a ball board for hitting the ball. The ball is also referred to herein as the target object, and the ball board is also referred to as the control object. During the game, if the defending party cannot intercept, through its ball board, a ball hit by the attacking party that initiated the attack, the attacking party wins the game.
Taking a man-machine game mode in which a single user participates as an example, the related art controls the target object as follows: after the game starts, if user A is the attacking party, then after user A hits the ball with the ball board, the server obtains the moving track of the ball in real time and sends it to user A's terminal in real time; after receiving the moving track, the terminal controls the ball to move on the game interface according to that track so as to complete the rendering of the game interface.
With the above control method, under poor network conditions the moving track acquired by the server may not reach the terminal in real time. For the terminal, failing to receive the moving track issued by the server in real time may lead to disordered control of the target object, so that the game interface is not smooth enough and the display effect is poor.
Disclosure of Invention
The embodiment of the invention provides a target object control method, a target object control device, a storage medium and equipment, and solves the problems of insufficient smoothness of a game interface and poor display effect in the related art. The technical scheme is as follows:
in one aspect, a target object control method is provided, which is applied to a terminal, and the method includes:
after a game starts, receiving a first movement track of a target object issued by a server, wherein the first movement track is obtained by the server, after the target object collides with a first control object of the attacking party, according to the movement data of the target object after the collision;
receiving a movement time parameter that is sent by the server and matches the first movement track;
controlling the target object to move on a display interface based on the first movement track and the movement time parameter;
wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round.
In another aspect, a target object control method is provided, which is applied to a server, and includes:
after a game starts, when a target object collides with a first control object of an attacking party, acquiring the movement data of the target object after the collision;
acquiring a first movement track of the target object according to the movement data, wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round;
and sending the first movement track and the movement time parameter matched with the first movement track to a terminal, so that the terminal controls the target object to move on a display interface based on the first movement track and the movement time parameter.
In another aspect, there is provided a target object control apparatus applied to a terminal, the apparatus including:
the first receiving module is used for receiving, after a game starts, a first movement track of a target object issued by a server, wherein the first movement track is obtained by the server, after the target object collides with a first control object of the attacking party, according to the movement data of the target object after the collision;
the second receiving module is used for receiving a moving time parameter which is sent by the server and matched with the first moving track;
the control module is used for controlling the target object to move on a display interface based on the first movement track and the movement time parameter;
wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round.
In another aspect, there is provided a target object control apparatus applied to a server, the apparatus including:
the first obtaining module is used for obtaining, after a game starts, the movement data of a target object after a collision when the target object collides with a first control object of an attacking party;
a second obtaining module, configured to obtain a first movement trajectory of the target object according to the movement data, where the first movement trajectory includes positions through which the target object passes after a collision until the end of the current round;
and the sending module is used for sending the first moving track and the moving time parameter matched with the first moving track to a terminal so that the terminal controls the target object to move on a display interface based on the first moving track and the moving time parameter.
In another aspect, a storage medium is provided, where at least one instruction is stored, and the at least one instruction is loaded and executed by a processor to implement the target object control method of the terminal or the target object control method executed by the server.
In another aspect, an apparatus for controlling a target object is provided, where the apparatus includes a processor and a memory, and the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement a target object control method of the terminal or a target object control method executed by the server.
At the moment the target object is hit by the attacking party through the control object, the server can obtain the corresponding movement track based on the movement data of the target object when it is hit and issue the movement track to the terminal, without the server having to synchronize the current track information of the target object to the terminal in real time; and the obtained movement track comprises all the positions the target object passes through from being hit to the end of the round, so the terminal can control the target object based on the track information already received even under weak network conditions.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of an implementation environment related to a target object control method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a target object control method according to an embodiment of the present invention;
Fig. 3 is a first schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 4 is a second schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 5 is a third schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 6 is a fourth schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 7 is a fifth schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 8 is a flowchart of a target object control method according to an embodiment of the present invention;
Fig. 9 is a flowchart of a target object control method according to an embodiment of the present invention;
Fig. 10 is a sixth schematic diagram of a movement track of a target object according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of a target object control apparatus according to an embodiment of the present invention;
Fig. 12 is a schematic structural diagram of a target object control apparatus according to an embodiment of the present invention;
Fig. 13 is a schematic structural diagram of an apparatus for controlling a target object according to an embodiment of the present invention;
Fig. 14 is a schematic structural diagram of an apparatus for controlling a target object according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Before explaining the embodiments of the present invention in detail, some terms related to the embodiments of the present invention are explained.
Player: referred to as a user in the embodiments of the present invention.
A player may also be called a gamer, a term used in the gaming industry for a game participant. Broadly speaking, a player is any user who plays a game, that is, a person participating in a game of any form.
In addition, players can be divided into an attacking party and a defending party. For example, in a ball-hitting game, the attacking party attacks by serving the ball and the defending party defends by catching it. It should be noted that a player may switch back and forth between the attacking and defending roles during different periods of the game.
In short, players are the experiencers, users, evaluators and consumers of a game. Different players enjoy different types of games according to their characters and preferences.
Target object: in the embodiments of the present invention, the object used by the attacking party to initiate an attack. For example, the ball in a ball-hitting game is a target object.
The target object may be in various representations, such as a cartoon character image, a cartoon animal image, a solid geometric image, and the like, which are not specifically limited in the embodiment of the present invention.
Control object: in the embodiment of the invention, the object used by the defending party to intercept the target object and used by the attacking party to hit the target object. For example, in a ball-hitting game, the ball board is a control object.
The representation form of the manipulation object is various, and for example, the manipulation object may be any form of object having an ability to hit or intercept a target object, such as a solid geometric figure, which is also not specifically limited by the embodiment of the present invention. In the embodiment of the present invention, the control object of the attacking party may be referred to as a first control object, and the control object of the defending party may be referred to as a second control object.
It should be noted that the embodiment of the present invention is applicable to man-machine mode games, double-player mode games, and the like. In a double-player mode game both the attacking party and the defending party are players; in that case the attacking party may also be called the attacking user and the defending party the defending user. In a man-machine mode game, the attacking party and the defending party are the player and the machine.
Round: in the embodiment of the invention, one attack initiated by the attacking party plus one defence against it by the defending party constitutes a round.
Taking a ball-hitting game as an example, one serve by the attacking party and one defence of that serve by the defending party constitute a round. If the defending party intercepts the ball in the round, the attacking and defending roles are swapped in the next round, that is, the attacking party of the current round must intercept the rebounded ball in the next round; if the defending party does not intercept the ball in the round, the roles are not swapped in the next round, and the attacking party of the current round serves again.
Frame set: in the embodiment of the invention, the server issues the movement track of the target object to the terminals participating in the game in the form of frame sets; that is, the movement track information of the target object is carried in a frame set. One frame set includes a plurality of frames. A minimal sketch of what such a frame set might carry is given below.
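The patent does not define a concrete data format for a frame set; the following TypeScript sketch only illustrates the kind of information described above (trajectory positions plus per-segment times), and all field names are assumptions.

```typescript
// Illustrative only: the patent does not specify field names or a wire format.
interface TrackPoint {
  x: number;             // position on the display interface
  y: number;
  segmentTimeMs: number; // movement time from the previous point to this one
}

interface FrameSet {
  roundId: number;       // the round this trajectory belongs to
  points: TrackPoint[];  // every position the target object passes until the round ends
}
```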
Props: in games, refers to virtual items that provide a player with convenience. Taking a ball hitting game as an example, a player can receive items set in the game during the moving process of the ball, and the items can assist the player in playing the game.
The following describes an implementation environment related to a target object control method provided by an embodiment of the present invention.
Fig. 1 illustrates an implementation environment related to a target object control method provided by an embodiment of the present invention. Referring to fig. 1, the implementation environment includes: a terminal 101 and a server 102.
The terminal 101 is responsible for display logic, configured to control the target object to move on the display interface based on the received movement trajectory, that is, the terminal 101 is responsible for rendering the display interface. In addition, the terminal is also responsible for synchronizing the actions triggered by the player to the server.
It should be noted that the type of the terminal 101 includes, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, and the like, and this is not particularly limited in the embodiment of the present invention.
The server 102 is responsible for calculating logic, and is configured to obtain a movement trajectory of a target object controlled by a player, and send the obtained movement trajectory to the terminal 101.
In the related art, for example, when a player plays an HTML (HyperText Markup Language) game online, TCP (Transmission Control Protocol) is typically used for communication, and the server transmits the current moving track of the ball to the terminal in real time. In particular, under weak network conditions, the moving track acquired by the server in real time often cannot be transmitted to the terminal in real time, which visually causes phenomena such as the ball lagging or flash-moving.
For example, suppose the ball needs to pass through position points 1-2-3-4 in sequence within 3 seconds, but after receiving in real time the track information for moving to position point 1, the terminal fails to receive the track information for 1-2 and 2-3 in real time. In this case the ball lags: it does not move to its normal position within the normal time but stays at position point 1. If the terminal later receives the track information for 1-2, 2-3 and 3-4 all at once, the ball may visually jump from position point 1 to position point 4 in an instant; that is, a movement that should normally take 3 seconds may be completed in only 0.5 seconds, and the ball appears to flash-move.
In summary, because the related art places high requirements on the network, under weak network conditions the terminal may suffer disordered control of the ball, so that the display interface is not smooth enough and the display effect is poor.
To solve the problems caused by a weak network, the embodiment of the invention preloads the movement track of the target object. Preloading means that the server calculates the movement track of the target object in advance and returns it to the terminal. Because the target object flies out from the attacking party's side at the moment after the game starts, its subsequent movement track can be obtained from the movement data of the target object at the moment it flies out. The movement data includes, but is not limited to, the movement speed and movement direction of the target object.
Stated differently, at the moment the target object flies out from the attacking party's side, the server immediately obtains the movement track of the target object and sends it to the client, and the client performs interface rendering directly after receiving it; the server does not need to transmit the current path of the target object to the terminal in real time.
In another embodiment, the movement track of the target object before it reaches the control object on the defending party's side is fixed; when the control object on the defending party's side affects the movement track of the target object, the server obtains the movement track of the target object again and synchronizes the newly obtained track to the terminal, so that the terminal can render the interface according to the new track.
Under weak network conditions, the target object control method provided by the embodiment of the invention can effectively eliminate the lagging and flash-moving of the target object; from the product side, the display interface shows no obvious delay or stutter and is smoother. Moreover, because frequent communication between the terminal and the server is reduced, the load on the server is greatly reduced.
In addition, the preloading approach also effectively solves the problem of a prop that has logically been picked up not being shown as picked up on the display interface.
Take a ball whose movement track is a-b-c-d as an example, and assume a prop is placed on the segment b-c but the terminal does not receive the track information for b-c in real time. If the target object subsequently jumps directly from position point a to position point d, then, because the logical movement track of the target object is a-b-c-d, the prop on segment b-c is logically regarded as picked up once the target object reaches position point d; however, the terminal never displays a picture of the player picking up the prop, and the prop simply disappears.
The target object control method provided by the embodiment of the present invention is explained in detail by specific embodiments below.
Fig. 2 is a flowchart of a target object control method provided in an embodiment of the present invention, where an interaction subject is a terminal and a server. Referring to fig. 2, a method flow provided by the embodiment of the present invention includes:
201. After the game starts, when the target object collides with the first control object of the attacking party, the server acquires the movement data of the target object after the collision.
In embodiments of the present invention, the game modes include, but are not limited to, a single-player mode and a double-player mode. The single-player mode is a man-machine battle, and the double-player mode is a battle between two players.
Further, the start of the game usually requires a certain trigger condition; for example, the terminal of a player starts the game in response to a game start instruction after receiving it. The game start instruction is triggered in ways that include, but are not limited to, the following: after the players participating in the game are determined, the game starts automatically after M seconds; or, after the players participating in the game are determined, the start is triggered manually by any one of the participating players. This is not specifically limited in the embodiment of the present invention.
The target object colliding with the first control object of the attacking party means the target object sets off from the starting point position on the attacking party's side. As shown in figs. 3 to 7, after the game starts, the attacking party sets the target object moving from the starting point position A, that is, the attacking party hits the target object at position point A. In the embodiment of the present invention, the server calculates the subsequent movement track of the target object based on the movement data of the target object as it sets off from the starting point position. The movement track of the target object usually varies with its movement data at the moment it is hit by the attacking party; generally, the movement data includes the movement speed and movement direction of the target object.
202. And the server acquires a first movement track of the target object according to the movement data, wherein the first movement track comprises each position passed by the target object from the collision to the end of the current round.
Stated differently, the first movement track includes the positions of the target object from the starting point position to the end of the current round.
In the embodiment of the present invention, the server calculates the movement track formed by the target object during its movement according to the movement speed and movement direction of the target object as it sets off from the starting point position; the track covers every position the target object passes through from the starting point position to the end of the current round. A simplified sketch of such a computation is given below.
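As an illustration of the kind of precomputation described above, the following TypeScript sketch marches the target object from its starting point using its initial speed and direction and reflects it off the left and right boundaries until it reaches the defending party's edge. The field geometry, coordinate convention and all names are assumptions, not the patent's actual implementation.

```typescript
// Hypothetical server-side sketch: simulate the ball from the serve until it reaches
// the defender's edge, recording the bounce points (B, C, ... in the figures).
interface Vec2 { x: number; y: number; }

function computeTrajectory(
  start: Vec2,
  velocity: Vec2,     // movement speed and direction at the moment of the hit (vel.y assumed > 0)
  fieldWidth: number, // distance between the left and right boundaries
  defenderY: number,  // y coordinate of the defender's control object (assumed > start.y)
  step = 1 / 60,      // simulation step in seconds
): Vec2[] {
  const points: Vec2[] = [{ ...start }];
  const pos = { ...start };
  const vel = { ...velocity };
  while (pos.y < defenderY) {
    pos.x += vel.x * step;
    pos.y += vel.y * step;
    if (pos.x <= 0 || pos.x >= fieldWidth) {
      // boundary collision: clamp to the wall and reflect the x component
      pos.x = Math.max(0, Math.min(fieldWidth, pos.x));
      vel.x = -vel.x;
      points.push({ ...pos });
    }
  }
  points.push({ ...pos }); // last position of the round (paddle line or fall point)
  return points;
}
```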
Generally speaking, the collision between the target object and the defending party's control object can be classified into the following cases:
case one, the target object normally collides with the front of the first manipulation object and then changes direction to bounce back.
In this case, the corresponding defender normally intercepts the target object through the first control object, and the target object rebounds to the attacking party. Wherein the movement trajectory in this case shows that the target object has a frontal collision with the first manipulation object and bounces back toward the attack direction after N1 boundary collisions.
For example, referring to fig. 3, since the movement data of the target object is known when the target object is sent from the position point a, the target object starts from the position point a, passes through the position point B, reaches the position point C located on the first manipulation object, and then bounces back to the trajectory of the position point D, the server is pre-acquirable, that is, the movement trajectory a-B-C-D of the target object is determined. In fig. 3, the value of N1 is 1, i.e. the target object impacts on the first manipulated object after 1 boundary collision at the location point B.
In which the target object has a collision with the boundaries on the left and right sides in fig. 3, which is referred to herein as a boundary collision.
Case two: the target object hits the first side face of the defending party's control object and then falls.
This case corresponds to the target object not being intercepted by the defending party, i.e. the defence fails. The movement track in this case shows that, after N1 boundary collisions, the target object collides with the left side face of the defending party's control object, changes its movement direction and falls. As shown in fig. 4, the left side face of the defending party's control object is referred to herein as the first side face and the right side face as the second side face. The server can determine whether the target object passes the left side of the defending party's control object; if so, the target object is judged not to have been intercepted by the defending party and to fall after changing its movement direction, and the server can likewise obtain this process in advance.
Case three: the target object hits a vertex of the defending party's control object and then falls directly.
This case also corresponds to the target object not being intercepted by the defending party, i.e. the defence fails. The movement track in this case shows that, after N1 boundary collisions, the target object collides with one vertex of the defending party's control object and falls without changing its movement direction.
For example, referring to fig. 5, after being served from position point A, the target object passes through position point B after one boundary collision and finally falls directly to position point D past the vertex of the defending party's control object, i.e. position point C.
Case four: the target object hits the second side face of the defending party's control object and then falls.
This case also corresponds to the target object not being intercepted by the defending party, i.e. the defence fails. The movement track in this case shows that, after N2 boundary collisions, the target object collides with the second side face of the defending party's control object, changes its movement direction and falls.
For example, referring to fig. 6, after being served from position point A, the target object passes through position points B and C after two boundary collisions, collides with the right side face of the defending party's control object (position point D), and finally falls to position point E; that is, the movement track of the target object is A-B-C-D-E.
Case five: the target object hits the first side face of the defending party's control object and then falls.
This case also corresponds to the target object not being intercepted by the defending party, i.e. the defence fails. The movement track in this case shows that, after N3 boundary collisions, the target object collides with the first side face of the defending party's control object, changes its movement direction and falls.
For example, referring to fig. 7, after being served from position point A, the target object passes through position points B, C and D after three boundary collisions, collides with the left side face of the defending party's control object (position point E), and finally falls to position point F; that is, the movement track of the target object is A-B-C-D-E-F.
Here N1 < N2 < N3, and N1, N2 and N3 are positive integers. A small sketch classifying these outcomes is given below.
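The five cases above differ only in where the precomputed track meets the line of the defending party's control object. A tiny, purely illustrative TypeScript helper that classifies the outcome from that end position could look as follows; the paddle coordinates, the tolerance and the outcome labels are assumptions.

```typescript
// Illustrative classification of the cases above from where the track ends.
type Outcome = 'bounced_back' | 'fell_left' | 'fell_on_vertex' | 'fell_right';

function classifyEnd(endX: number, paddleLeft: number, paddleRight: number, eps = 1e-3): Outcome {
  if (Math.abs(endX - paddleLeft) < eps || Math.abs(endX - paddleRight) < eps) {
    return 'fell_on_vertex';                        // case three: hits a vertex and falls
  }
  if (endX > paddleLeft && endX < paddleRight) {
    return 'bounced_back';                          // case one: frontal hit, defence succeeds
  }
  return endX < paddleLeft ? 'fell_left' : 'fell_right'; // cases two/five and case four
}
```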
203. And the server sends the first movement track and the movement time parameter matched with the first movement track to the terminal.
The terminal referred to here is specifically a terminal of a player participating in the game.
In the embodiment of the present invention, when the server issues the track information of the target object to the terminal, the server may select to issue the track information to the terminal in a frame set manner, which is not specifically limited in the embodiment of the present invention.
In another embodiment, when sending the track information to the terminal, the server also carries the corresponding movement time parameter, so that the terminal controls the target object to move according to that parameter and the interface display effect is ensured. Assuming the movement track of the target object is A-B-C-D-E, the movement time parameter may include T_AB, T_BC, T_CD and T_DE; this is not specifically limited in the embodiment of the present invention. A sketch of how such per-segment times could be derived is given below.
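The patent does not state how the movement time parameter is computed; one natural possibility, shown here only as an assumption, is to derive each segment time from the segment length and the ball's speed.

```typescript
// Hypothetical derivation of T_AB, T_BC, ... from the precomputed waypoints.
function segmentTimes(points: { x: number; y: number }[], speed: number): number[] {
  const times: number[] = [];
  for (let i = 1; i < points.length; i++) {
    const dx = points[i].x - points[i - 1].x;
    const dy = points[i].y - points[i - 1].y;
    times.push(Math.hypot(dx, dy) / speed); // time = segment length / movement speed
  }
  return times;
}
```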
204. And the terminal controls the target object to move on the display interface based on the first movement track and the movement time parameter matched with the first movement track.
At the moment the target object sets off from the starting point position, the server obtains the movement track of the target object based on its movement data at that moment and synchronizes the track to the terminal. The terminal therefore obtains in advance the movement track of the target object after it is hit and can render the interface directly from the received track, without the server synchronizing the current track information of the target object to the terminal in real time.
Specifically, after receiving the movement time parameter that is sent by the server and matches the movement track of the target object, the terminal can, taking its own performance into account, obtain a first moving time from the received movement time parameter and the local frame rate; the terminal then controls the target object to move on the display interface, within that first moving time, based on the movement track sent by the server. A client-side sketch of this adjustment is given below.
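The patent only says that the first moving time is obtained from the movement time parameter and the local frame rate; the exact formula is not given. The following client-side sketch assumes a simple stretch by the ratio of a reference frame rate to the measured local frame rate, plus linear interpolation between two track points.

```typescript
// Assumed adjustment: stretch the server's segment time when the local frame rate is low.
function firstMovingTime(serverSegmentMs: number, localFps: number, referenceFps = 60): number {
  return serverSegmentMs * (referenceFps / Math.max(localFps, 1));
}

// Linear interpolation of the target object between two track points over that time.
function positionAt(
  from: { x: number; y: number },
  to: { x: number; y: number },
  elapsedMs: number,
  totalMs: number,
): { x: number; y: number } {
  const t = Math.min(elapsedMs / totalMs, 1);
  return { x: from.x + (to.x - from.x) * t, y: from.y + (to.y - from.y) * t };
}
```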
In summary, at the moment the target object is hit by the attacking party, the server can obtain the corresponding movement track based on the movement data of the target object at the starting point and issue it to the terminal; the server does not need to synchronize the current track information of the target object to the terminal in real time. This reduces the number of communications between server and terminal, reduces the load on the server, and ensures the interface display effect of the terminal under weak network conditions.
In another embodiment, after the target object is hit by the attacking party and reaches the second control object of the defending party, if the second control object has moved and the target object finally collides with it, the server needs to recalculate the movement track of the target object and synchronize it to the terminal, so that the terminal controls the target object to move on the display interface according to the latest track. Referring to fig. 8, this is described in detail below:
801. If the server detects that the second control object of the defending party has moved and the target object collides with the second control object, the server obtains a second movement track of the target object according to the movement data of the second control object and the movement data of the target object at the moment of the collision.
Whether the second control object of the defending party moves, and whether the target object collides with it, can be detected by the terminal, and the corresponding event is synchronized to the server once it is detected.
In the embodiment of the invention, at the moment the target object collides with the second control object, the server immediately obtains the vector x-x' of the second control object's movement, where x denotes the initial position of the second control object and x' denotes its position after moving. Since, in the ball-hitting game, the second control object is set to move only laterally, the y-axis movement speed of the target object is unchanged and its x-axis movement speed is recalculated.
Specifically, the server calculates the vector of the second control object's movement in the x direction and from it obtains the x-axis movement speed of the second control object; it then adds that speed to the x-axis movement speed of the target object at the moment of the collision to obtain the latest x-axis movement speed of the target object; finally, the server obtains the latest movement track of the target object, i.e. the second movement track, based on the latest movement data. A sketch of this speed update is given below.
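The speed update described above can be sketched as follows. The paddle is assumed to move only along the x axis, so the ball's y-axis speed is kept and the paddle's x-axis speed is added to the ball's; the time window used to turn the movement vector x-x' into a speed is an assumption, and the direction flip of the rebound itself is not shown.

```typescript
// Sketch of the recalculation: y speed unchanged, x speeds added.
function reboundVelocity(
  ballVel: { x: number; y: number }, // ball speed at the moment of the collision
  paddleStartX: number,              // x  in the movement vector x -> x'
  paddleEndX: number,                // x'
  moveDurationS: number,             // time over which the paddle moved (assumed known)
): { x: number; y: number } {
  const paddleVx = (paddleEndX - paddleStartX) / moveDurationS;
  return { x: ballVel.x + paddleVx, y: ballVel.y };
}
```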
802. And the server sends the second movement track to the terminal.
This step is the same as step 203, and will not be described herein again.
803. And after the target object collides with the second control object, the terminal controls the target object to move on the display interface based on the second movement track.
This step is similar to step 204, and will not be described herein again.
In another embodiment, after the target object is hit by the attacking party and reaches the second control object of the defending party, if the second control object has not moved and the target object finally collides with it, then after the target object rebounds the server also recalculates the movement track of the target object and synchronizes it to the terminal, so that the terminal controls the target object to move on the display interface according to the latest track. In this case the steps executed are the same as steps 201 to 204 above, except that the direction of movement is reversed and the attacking and defending identities of the players are exchanged. Referring to fig. 9, this is described in detail below:
901. If the server detects that the second control object of the defending party has not moved and the target object collides with the second control object, the server obtains a third movement track of the target object according to the movement data of the target object after the collision.
902. And the server sends the third moving track to the terminal.
903. And after the target object collides with the second control object, the terminal controls the target object to move on the display interface based on the third movement track.
In summary, if the defending party fails to intercept the target object hit by the attacking party, the attacking party remains the attacking party in the next round; if the defending party intercepts the target object, the attacking and defending identities are exchanged, and the track acquisition and interface rendering process is the same as in the previous round.
In another embodiment, after a frame set delivered by the server reaches the terminal, the terminal first checks the current state of the target object. If the target object has just finished running the track indicated by the current frame set, the timing is ideal, and the target object is controlled to run the track indicated by the next frame set directly. If, because of network delay or terminal performance, the track indicated by the current frame set has not been finished when the server delivers the next frame set, then directly pulling the target object to the first frame of the next frame set would make the target object flash-move, and a prop would disappear even though the player clearly has not picked it up. For this defect the embodiment of the present invention also provides a corresponding solution, as follows:
After receiving the next frame set sent by the server, the terminal judges whether the target object has moved to the end position indicated by the current frame set. If the target object has not reached that end position, the terminal determines the first position and the second position indicated by the next frame set, i.e. the first two positions the target object passes through in the next frame set; the terminal then obtains the second moving time for the target object to move from the first position to the second position, obtains the target distance between the current position and the first position, and, within the second moving time, controls the target object to move from the current position through the first position to the second position based on the target distance.
Stated differently, the processing adopted in the implementation of the present invention is to obtain the time from the first position point to the second position point of the next frame set, calculate the distance from the target object's current position to the first position point, and control the target object to move quickly, within that same time, from the current position to the first position point and on to the second position point. Visually, the prop then neither disappears abruptly nor does the target object flash-move.
For example, assuming the movement track of the target object in the next frame set is D-E-F, and the terminal has not finished running the current frame set A-B-C when it receives the next frame set, then within the time T_DE the terminal controls the target object to accelerate from its current position through position point D to position point E.
In another example, the correct movement track of the target object is shown by the bold black part in fig. 10; owing to network delay or local stutter the track of the target object deviates, as shown by the non-bold black part. In the implementation of the present invention, the terminal obtains the distance difference between position points B and C and then, starting from position point C and within the time T_BD, accelerates the target object according to that distance difference so that it moves to position point B and then on to position point D. A sketch of this catch-up is given below.
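The catch-up described above can be sketched as follows: when the next frame set arrives before the current one has finished, the terminal moves the target object from wherever it actually is, through the first point of the new set, to its second point, all within the segment time the server allotted for that first segment. The names mirror the illustrative FrameSet above and are assumptions.

```typescript
// Sketch of the catch-up: fit the detour current -> first -> second into one segment time,
// so the ball visibly accelerates instead of teleporting and no prop is skipped on screen.
function planCatchUp(
  current: { x: number; y: number }, // where the ball actually is (e.g. point C in fig. 10)
  first: { x: number; y: number },   // first point of the next frame set (e.g. B)
  second: { x: number; y: number },  // second point of the next frame set (e.g. D)
  segmentMs: number,                 // time the server allotted from first to second (T_BD)
): { speed: number; waypoints: { x: number; y: number }[] } {
  const dist = (a: { x: number; y: number }, b: { x: number; y: number }) =>
    Math.hypot(b.x - a.x, b.y - a.y);
  const total = dist(current, first) + dist(first, second);
  return { speed: total / (segmentMs / 1000), waypoints: [first, second] };
}
```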
According to the method provided by the embodiment of the invention, at the moment the target object is hit by the attacking party through the control object, the server can obtain the corresponding movement track based on the movement data of the target object when it is hit and issue that track to the terminal; the server does not need to synchronize the current track information of the target object to the terminal in real time, and the obtained movement track includes the positions the target object passes through from being hit to the end of the round. The target object can therefore be controlled based on the already obtained track information even under weak network conditions, which ensures the display effect of the terminal and avoids the problems of an insufficiently smooth game interface and a poor display effect caused by disordered control of the target object; and because frequent communication between the terminal and the server is reduced, the load on the server is greatly reduced.
Stated differently, under weak network conditions the lagging and flash-moving of the target object are effectively eliminated; from the product side, the display interface of the terminal shows no obvious delay or stutter and is smoother. Because frequent communication between the terminal and the server is reduced, the load on the server is greatly reduced. In addition, the problem of a prop disappearing although the player clearly has not picked it up is solved, giving a better effect.
Fig. 11 is a schematic structural diagram of a target object control apparatus according to an embodiment of the present invention. Referring to fig. 11, the apparatus includes:
the first receiving module 1101 is configured to receive, after a game starts, a first movement track of a target object issued by a server, wherein the first movement track is obtained by the server, after the target object collides with a first control object of the attacking party, according to the movement data of the target object after the collision;
a second receiving module 1102, configured to receive a moving time parameter that is sent by the server and matches the first moving trajectory;
the control module 1103 is configured to control the target object to move on a display interface based on the first movement track and the movement time parameter;
wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round.
According to the device provided by the embodiment of the invention, at the moment the target object is hit by the attacking party through the control object, the server can obtain the corresponding movement track based on the movement data of the target object when it is hit and issue that track to the terminal; the server does not need to synchronize the current track information of the target object to the terminal in real time, and the obtained movement track includes the positions the target object passes through from being hit to the end of the round. The target object can therefore be controlled based on the already obtained track information even under weak network conditions, which ensures the display effect of the terminal and avoids the problems of an insufficiently smooth game interface and a poor display effect caused by disordered control of the target object; and because frequent communication between the terminal and the server is reduced, the load on the server is greatly reduced.
In another embodiment, the control module is further configured to obtain a first moving time according to the moving time parameter and a local frame rate, and to control the target object to move on the display interface based on the first movement track within the first moving time.
In another embodiment, the first receiving module is further configured to receive at least one frame set sent by the server, where the at least one frame set includes the first moving trajectory of the target object.
In another embodiment, the control module is further configured to determine, after receiving a next frame set sent by the server, whether the target object has moved to the end position indicated by the current frame set; if the target object has not moved to the end position, to determine a first position and a second position indicated by the next frame set, wherein the first position and the second position are the first two positions passed by the target object in the next frame set; to acquire a second moving time of the target object from the first position to the second position; and to control the target object to move from the current position to the second position via the first position within the second moving time.
In another embodiment, the first receiving module is further configured to receive a second movement track issued by the server, where the second movement track is obtained by the server, in a case where the second control object of the defending party moves and the target object collides with the second control object, according to the movement data of the second control object and the movement data of the target object at the moment of the collision;
and the control module is further used for controlling the target object to move on the display interface based on the second movement track after the target object collides with the second control object.
In another embodiment, the first receiving module is further configured to receive a third movement track issued by the server, where the third movement track is obtained by the server, in a case where the second control object of the defending party does not move and the target object collides with the second control object, according to the movement data of the target object after the collision;
and the control module is further used for controlling the target object to move on the display interface based on the third movement track after the target object collides with the second control object.
In another embodiment, the first movement track shows that the target object has a frontal collision with the second control object after N1 boundary collisions and bounces back towards the attacking party; or,
the first movement track shows that the target object collides with the first side face of the second control object after N1 boundary collisions and falls after changing its movement direction; or,
the first movement track shows that the target object collides with one vertex of the second control object after N1 boundary collisions and falls without changing its movement direction; or,
the first movement track shows that the target object collides with the second side face of the second control object after N2 boundary collisions and falls after changing its movement direction; or,
the first movement track shows that the target object collides with the first side face of the second control object after N3 boundary collisions and falls after changing its movement direction;
wherein N1 < N2 < N3, and N1, N2 and N3 are positive integers.
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
Fig. 12 is a schematic structural diagram of a target object control apparatus according to an embodiment of the present invention. Referring to fig. 12, the apparatus includes:
the first obtaining module is used for obtaining, after a game starts, the movement data of a target object after a collision when the target object collides with a first control object of an attacking party;
a second obtaining module, configured to obtain a first movement trajectory of the target object according to the movement data, where the first movement trajectory includes positions through which the target object passes after a collision until the end of the current round;
and the sending module is used for sending the first moving track and the moving time parameter matched with the first moving track to a terminal so that the terminal controls the target object to move on a display interface based on the first moving track and the moving time parameter.
According to the device provided by the embodiment of the invention, at the moment the target object is hit by the attacking party through the control object, the server can obtain the corresponding movement track based on the movement data of the target object when it is hit and issue that track to the terminal; the server does not need to synchronize the current track information of the target object to the terminal in real time, and the obtained movement track includes the positions the target object passes through from being hit to the end of the round. The target object can therefore be controlled based on the already obtained track information even under weak network conditions, which ensures the display effect of the terminal and avoids the problems of an insufficiently smooth game interface and a poor display effect caused by disordered control of the target object; and because frequent communication between the terminal and the server is reduced, the load on the server is greatly reduced.
In another embodiment, the movement data includes a movement speed and a movement direction of the target object; and the second obtaining module is further used for acquiring the first movement track according to the movement speed and the movement direction of the target object.
In another embodiment, the apparatus further comprises:
the second obtaining module is further configured to, if the second control object of the defending party moves and the target object collides with the second control object, obtain a second movement track of the target object according to the movement data of the second control object and the movement data of the target object at the moment of the collision;
the sending module is further configured to send the second moving trajectory to the terminal, so that the terminal controls the target object to move on the display interface based on the second moving trajectory after the target object collides with the second control object.
In another embodiment, the apparatus further comprises:
the second obtaining module is further configured to, if the second control object of the defending party does not move and the target object collides with the second control object, obtain a third movement track of the target object according to the movement data of the target object after the collision;
the sending module is further configured to send the third moving trajectory to the terminal, so that the terminal controls the target object to move on the display interface based on the third moving trajectory after the target object collides with the second control object.
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
It should be noted that: in the target object control device provided in the above embodiment, when controlling the target object, only the division of the above functional modules is exemplified, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the target object control apparatus provided in the above embodiments and the target object control method embodiment belong to the same concept, and specific implementation processes thereof are described in the method embodiments, and are not described herein again.
Fig. 13 is a block diagram illustrating the structure of an apparatus 1300 for controlling a target object according to an exemplary embodiment of the present invention. The apparatus 1300 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer or a desktop computer. The apparatus 1300 may also be referred to by other names such as user equipment, portable terminal, laptop terminal or desktop terminal.
In general, the apparatus 1300 includes: a processor 1301 and a memory 1302.
The processor 1301 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor: the main processor processes data in the awake state and is also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1301 may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1302 is used to store at least one instruction, which is executed by the processor 1301 to implement the target object control method provided by the method embodiments herein.
In some embodiments, the apparatus 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1304 may communicate with other devices via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1305 is a touch display screen, the display screen 1305 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the display screen 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1305, providing the front panel of the device 1300; in other embodiments, there may be at least two display screens 1305, respectively disposed on different surfaces of the device 1300 or in a folded design; in still other embodiments, the display screen 1305 may be a flexible display screen disposed on a curved or folded surface of the device 1300. The display screen 1305 may even be arranged as a non-rectangular irregular figure, that is, an irregularly shaped screen. The display screen 1305 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1306 is used to capture images or video. Optionally, the camera assembly 1306 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the apparatus, and the rear camera is disposed on the rear surface of the apparatus. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 1306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1301 for processing or to the radio frequency circuit 1304 for voice communication. There may be multiple microphones, placed at different locations of the device 1300, for stereo capture or noise reduction. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuit 1304 into sound waves. The speaker may be a traditional membrane speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1307 may also include a headphone jack.
The positioning component 1308 is used to locate the current geographic location of the device 1300 for navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1309 is used to supply power to the various components in the device 1300. The power supply 1309 may be an alternating current power supply, a direct current power supply, a disposable battery, or a rechargeable battery. When the power supply 1309 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the device 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the apparatus 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
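Purely as an illustration of the orientation switch mentioned above, and not as the actual logic of the device 1300, one common heuristic compares the gravity components reported on the x and y axes; the function name below is hypothetical.

    def choose_orientation(gx, gy):
        """Pick landscape or portrait from the gravity components collected by the acceleration sensor."""
        return "landscape" if abs(gx) > abs(gy) else "portrait"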
The gyro sensor 1312 may detect the body direction and the rotation angle of the device 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the device 1300 by the user. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1313 may be disposed on a side frame of the device 1300 and/or beneath the touch display screen 1305. When the pressure sensor 1313 is disposed on a side frame of the device 1300, it can detect the user's grip on the device 1300, and the processor 1301 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1313. When the pressure sensor 1313 is disposed in a lower layer of the touch display screen 1305, the processor 1301 controls the operability controls on the UI according to the pressure applied by the user to the touch display screen 1305. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user, and the processor 1301 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 1314, or the fingerprint sensor 1314 identifies the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the device 1300. When a physical key or vendor Logo is provided on the device 1300, the fingerprint sensor 1314 may be integrated with the physical key or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
The proximity sensor 1316, also known as a distance sensor, is typically disposed on the front panel of the device 1300. The proximity sensor 1316 is used to collect the distance between the user and the front of the device 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front of the device 1300 gradually decreases, the processor 1301 controls the touch display screen 1305 to switch from the bright screen state to the dark screen state; when the proximity sensor 1316 detects that the distance between the user and the front of the device 1300 gradually increases, the processor 1301 controls the touch display screen 1305 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the structure shown in Fig. 13 does not limit the device 1300, which may include more or fewer components than shown, combine some components, or use a different arrangement of components.
Fig. 14 is a schematic structural diagram of an apparatus for controlling a target object according to an embodiment of the present invention. The apparatus 1400 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 1401 and one or more memories 1402, where the memory 1402 stores at least one instruction that is loaded and executed by the processor 1401 to implement the target object control method provided by the foregoing method embodiments. Of course, the apparatus may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may further include other components for implementing the functions of the apparatus, which are not described here.
In an exemplary embodiment, there is also provided a computer-readable storage medium, such as a memory including instructions that can be executed by a processor in a terminal to perform the target object control method in the above-described embodiments. For example, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (15)

1. A target object control method is applied to a terminal, and the method comprises the following steps:
after a game round starts, receiving a first movement track of a target object issued by a server, wherein the first movement track is obtained by the server according to movement data of the target object after a collision when the target object collides with a first control object of an attacking party, and the movement data comprises a movement speed and a movement direction of the target object;
receiving a movement time parameter that is sent by the server and matches the first movement track;
controlling the target object to move on a display interface based on the first movement track and the movement time parameter;
wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round.
2. The method of claim 1, wherein the controlling the target object to move on the display interface based on the first movement trajectory and the movement time parameter comprises:
acquiring a first movement time according to the movement time parameter and a local frame rate;
and controlling the target object to move on the display interface based on the first movement track within the first movement time.
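Purely as an illustrative reading of claims 1 and 2, the Python sketch below assumes that the movement time parameter counts logic frames on the server, so that the terminal converts it into seconds with its own local frame rate and spends an equal share of that time between consecutive track points; none of the names in the sketch come from the claims.

    def first_movement_time(move_time_param, local_frame_rate):
        """Convert the server's movement time parameter (assumed to be in logic frames) to seconds."""
        return move_time_param / float(local_frame_rate)

    def replay_track(track, move_time_param, local_frame_rate):
        """Yield (position, delay) pairs that move the target object along the first
        movement track within the first movement time on this terminal."""
        total_time = first_movement_time(move_time_param, local_frame_rate)
        dt = total_time / max(len(track) - 1, 1)   # time between consecutive track points
        for point in track:
            yield point, dt                        # the game loop draws the point, then waits dt

    # Example: a 90-frame parameter on a 30 fps terminal gives a 3-second replay.
    for (x, y), delay in replay_track([(0, 0), (1, 1), (2, 2)], 90, 30):
        pass  # draw (x, y) on the display interface, then wait `delay` seconds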
3. The method of claim 1, wherein receiving the first movement track of the target object sent by the server comprises:
and receiving at least one frame set sent by the server, wherein the at least one frame set comprises the first movement track of the target object.
4. The method according to claim 3, wherein when controlling the target object to move on the display interface based on the first movement track, the method comprises:
after receiving the next frame set sent by the server, judging whether the target object has moved to the end position indicated by the current frame set;
if the target object has not moved to the end position, determining a first position and a second position indicated by the next frame set, wherein the first position and the second position are the first two positions passed by the target object in the next frame set;
acquiring a second moving time of the target object from the first position to the second position;
and controlling the target object to move from the current position to the second position via the first position within the second movement time.
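Claims 3 and 4 describe the first movement track arriving in frame sets, with a catch-up rule when the next frame set arrives before the current one has been fully played out. The Python sketch below shows one way such a rule could look; the FrameSet structure, the return value, and the schedule_move callback are assumptions made for illustration only.

    from dataclasses import dataclass
    from typing import List, Tuple

    Point = Tuple[float, float]

    @dataclass
    class FrameSet:
        points: List[Point]   # positions the target object passes through in this frame set

    def on_next_frame_set(current_pos, current_set, next_set, second_move_time, schedule_move):
        """Handle the arrival of the next frame set while the current one may still be playing."""
        if current_pos != current_set.points[-1]:
            # The target object has not yet reached the end position indicated by the current
            # frame set, so catch up: move from the current position through the first two
            # positions of the next frame set within the second movement time.
            first_pos, second_pos = next_set.points[0], next_set.points[1]
            schedule_move([current_pos, first_pos, second_pos], second_move_time)
            return next_set.points[2:]   # the remaining positions are played out afterwards
        return next_set.points           # already in sync; play the next frame set normally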
5. The method according to any one of claims 1 to 4, further comprising:
receiving a second movement track issued by the server, wherein when a second control object of a defender moves and the target object collides with the second control object, the second movement track is obtained by the server according to movement data of the second control object and movement data of the target object at the time of the collision;
and after the target object collides with the second control object, controlling the target object to move on the display interface based on the second movement track.
6. The method according to any one of claims 1 to 4, further comprising:
receiving a third movement track issued by the server, wherein when a second control object of a defender does not move and the target object collides with the second control object, the third movement track is obtained by the server according to movement data of the target object after the collision;
after the target object collides with the second control object, controlling the target object to move on the display interface based on the third movement track.
7. The method according to any one of claims 1 to 4,
the first movement track shows that the target object collides head-on with a second control object and bounces back to the attacking party after N1 boundary collisions; or
the first movement track shows that the target object collides with a first side of the second control object after N1 boundary collisions and falls after changing its movement direction; or
the first movement track shows that the target object collides with a vertex of the second control object after N1 boundary collisions and falls without changing its movement direction; or
the first movement track shows that the target object collides with a second side of the second control object after N2 boundary collisions and falls after changing its movement direction; or
the first movement track shows that the target object collides with the first side of the second control object after N3 boundary collisions and falls after changing its movement direction;
wherein N1 < N2 < N3, and N1, N2 and N3 are positive integers.
8. A target object control method is applied to a server, and the method comprises the following steps:
after a game round starts, when a target object collides with a first control object of an attacking party, acquiring movement data of the target object after the collision, wherein the movement data comprises the movement speed and the movement direction of the target object;
acquiring a first movement track of the target object according to the movement data, wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round;
and sending the first movement track and the movement time parameter matched with the first movement track to a terminal, so that the terminal controls the target object to move on a display interface based on the first movement track and the movement time parameter.
9. The method of claim 8, wherein the obtaining a first movement trajectory of the target object according to the movement data comprises:
and acquiring the first movement track according to the movement speed and the movement direction of the target object.
10. The method according to claim 8 or 9, characterized in that the method further comprises:
if a second control object of the defender moves and the target object collides with the second control object, acquiring a second movement track of the target object according to the movement data of the second control object and the movement data of the target object when the target object collides with the second control object;
and sending the second movement track to the terminal, so that the terminal controls the target object to move on the display interface based on the second movement track after the target object collides with the second control object.
11. The method according to claim 8 or 9, characterized in that the method further comprises:
if a second control object of the defender does not move and the target object collides with the second control object, acquiring a third movement track of the target object according to movement data of the target object after the target object collides with the second control object;
and sending the third movement track to the terminal, so that the terminal controls the target object to move on the display interface based on the third movement track after the target object collides with the second control object.
12. A target object control apparatus, applied to a terminal, the apparatus comprising:
the first receiving module is used for receiving, after a game round starts, a first movement track of a target object issued by a server, wherein the first movement track is acquired by the server according to movement data of the target object after a collision when the target object collides with a first control object of an attacking party, and the movement data comprises the movement speed and the movement direction of the target object;
the second receiving module is used for receiving a movement time parameter that is sent by the server and matches the first movement track;
the control module is used for controlling the target object to move on a display interface based on the first movement track and the movement time parameter;
wherein the first movement track comprises the positions passed by the target object from the collision to the end of the current round.
13. A target object control apparatus, applied to a server, the apparatus comprising:
the first obtaining module is used for obtaining, after a game round starts, the movement data of a target object after a collision when the target object collides with a first control object of an attacking party, wherein the movement data comprises the movement speed and the movement direction of the target object;
a second obtaining module, configured to obtain a first movement trajectory of the target object according to the movement data, where the first movement trajectory includes positions through which the target object passes after a collision until the end of the current round;
and the sending module is used for sending the first movement track and the movement time parameter matching the first movement track to a terminal, so that the terminal controls the target object to move on a display interface based on the first movement track and the movement time parameter.
14. A storage medium having stored therein at least one instruction which is loaded and executed by a processor to implement the target object control method of any one of claims 1 to 7 or the target object control method of any one of claims 8 to 11.
15. An apparatus for controlling a target object, the apparatus comprising a processor and a memory, the memory having stored therein at least one instruction, the at least one instruction being loaded and executed by the processor to implement the target object control method of any one of claims 1 to 7 or the target object control method of any one of claims 8 to 11.
CN201810595628.9A 2018-06-11 2018-06-11 Target object control method, device, storage medium and equipment Active CN108786108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810595628.9A CN108786108B (en) 2018-06-11 2018-06-11 Target object control method, device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810595628.9A CN108786108B (en) 2018-06-11 2018-06-11 Target object control method, device, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN108786108A CN108786108A (en) 2018-11-13
CN108786108B true CN108786108B (en) 2022-01-25

Family

ID=64088286

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810595628.9A Active CN108786108B (en) 2018-06-11 2018-06-11 Target object control method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN108786108B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475573B (en) * 2020-04-08 2023-02-28 腾讯科技(深圳)有限公司 Data synchronization method and device, electronic equipment and storage medium
CN112121437B (en) * 2020-09-21 2022-11-22 腾讯科技(深圳)有限公司 Movement control method, device, medium and electronic equipment for target object
CN116688493B (en) * 2023-07-31 2023-10-24 厦门真有趣信息科技有限公司 Interactive control method for football game

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5101080B2 (en) * 2006-10-19 2012-12-19 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME CONTROL METHOD
US7988555B2 (en) * 2007-07-27 2011-08-02 Empire Of Sports Developments Ltd. Method and device for controlling a motion-sequence within a simulated game or sports event
TWI407992B (en) * 2008-10-23 2013-09-11 Univ Nat Cheng Kung Virtual sports system
KR101546666B1 (en) * 2015-03-25 2015-08-25 주식회사 리얼야구존 A screen baseball system operating method
CN105509735B (en) * 2015-11-30 2019-11-08 小米科技有限责任公司 Information cuing method, device and terminal
CN107678652B (en) * 2017-09-30 2020-03-13 网易(杭州)网络有限公司 Operation control method and device for target object

Also Published As

Publication number Publication date
CN108786108A (en) 2018-11-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant