CN112107862A - Game character movement following method and device, storage medium and computer equipment - Google Patents

Game character movement following method and device, storage medium and computer equipment

Info

Publication number
CN112107862A
Authority
CN
China
Prior art keywords
following
frame
target
role
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010993884.0A
Other languages
Chinese (zh)
Inventor
徐华龙
何文辉
董星辰
冯越宇
李聪
罗天成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Perfect Tianzhiyou Technology Co ltd
Original Assignee
Chengdu Perfect Tianzhiyou Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Perfect Tianzhiyou Technology Co ltd filed Critical Chengdu Perfect Tianzhiyou Technology Co ltd
Priority to CN202010993884.0A priority Critical patent/CN112107862A/en
Publication of CN112107862A publication Critical patent/CN112107862A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/65 Methods for processing data by generating or executing the game program for computing the condition of a game character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a game character movement following method and apparatus, a storage medium and computer equipment. The method comprises: receiving a character following instruction, wherein the character following instruction comprises a target character identifier and a following character identifier, and the target character is a player character or an NPC character; acquiring target movement path frame data of the target character based on the target character identifier; and determining following movement path frame data corresponding to the following character according to a preset following frame number difference corresponding to the following character and the target movement path frame data, and sending the following movement path frame data to a following character terminal corresponding to the following character identifier, so that the following character follows the movement of the target character. By handing the complex process of determining the following movement path frame data to a high-performance server, the method greatly improves processing efficiency and solves the problem in the prior art that NPC characters cannot be followed.

Description

Game character movement following method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for following a motion of a game character, a storage medium, and a computer device.
Background
In a conventional MMORPG (massively multiplayer online role-playing game), there are many kinds of multiplayer cooperative gameplay (such as multiplayer dungeons, multiplayer missions, and large battlefields). These modes share a common requirement: multiple players must travel to a starting point at the same time before the activity can begin, and the journey to that starting point is often long. The trip itself is usually tedious, and this tedium greatly reduces the player's game experience; yet such journeys are common in games. How to improve the player's experience while travelling to each activity has therefore become very important.
With the rapid growth of players' social needs and need for personal recognition in games, most existing games provide a following function to improve the player experience, and follow-based gameplay has grown explosively in recent years. The currently popular solution for movement following is as follows: when a second character needs to follow a first character, the first character's client determines a following path for the second character according to the first character's movement path and sends the following path to the server; the server forwards it to the second character's client, which controls the second character to move along the received following path, thereby presenting the effect of the second character following the first character. This solution has two drawbacks: first, the following path is calculated by the first character's client, which puts performance pressure on the client and easily causes stuttering and performance loss; second, the player cannot follow an NPC character in the game.
The defects of the existing following schemes therefore urgently need to be solved.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for following a motion of a game character, a storage medium, and a computer device.
According to one aspect of the application, a motion following method of a game character is provided, which is applied to a server and comprises the following steps:
receiving a role following instruction, wherein the role following instruction comprises a target role identification and a following role identification, and the target role is a player role or an NPC role;
acquiring target moving path frame data of the target role based on the target role identifier;
and determining the following movement path frame data corresponding to the following role according to the preset following frame number difference corresponding to the following role and the target movement path frame data, wherein the following movement path frame data comprises data frames determined by performing interpolation based on the positions corresponding to the target movement path frame data; and sending the following movement path frame data to a following role terminal corresponding to the following role identifier, so that the following role follows the movement of the target role.
Specifically, each target data frame included in the target movement path frame data corresponds to a target frame time and a target frame position; the determining the following movement path frame data corresponding to the following role specifically includes:
and determining a synchronous following data frame corresponding to the target data frame in the following moving path frame data based on the preset following frame number difference and the target data frame, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to the corresponding target data frame, and the target following frame is a data frame of which the forward phase difference of the target data frame is the preset following frame number difference.
Specifically, the determining the following movement path frame data corresponding to the following role specifically includes:
acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target movement path frame data, and acquiring a first following frame position of the following role, wherein the first target data frame is a first frame in the target movement path frame data;
performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose number matches the preset following frame number difference;
marking initial following frame time corresponding to the initial following frame position, and determining an initial following data frame in the following movement path frame data, wherein the first initial following frame time is the first target frame time.
Specifically, the determining the following movement path frame data corresponding to the following role specifically includes:
acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target movement path frame data, wherein the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
interpolating based on the first target frame position and the second target frame position to obtain starting following frame positions whose number matches the preset following frame number difference, wherein the starting following frame positions include the second target frame position;
marking the starting following frame times corresponding to the starting following frame positions according to the first target frame time and the second target frame time, and determining starting following frame data in the following movement path frame data.
Specifically, before performing interpolation based on the first target frame position and the second target frame position, the method further includes:
acquiring all starting target frame positions corresponding to the first target data frame and the second target data frame;
and if the starting target frame position is repeated, performing interpolation based on the first target frame position and the second target frame position.
Specifically, each target data frame included in the target movement path frame data further corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
Specifically, each target data frame included in the target movement path frame data further corresponds to a target frame speed; the determining the following movement path frame data corresponding to the following role specifically includes:
when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring the stopping target frame states corresponding to the stopping target data frames that precede the third target data frame by up to the preset following frame number difference;
when the stopping target frame states include a plurality of different states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
acquiring a second following frame position corresponding to the synchronous following data frame with synchronous time according to a third target frame time corresponding to the third target data frame;
and determining a stopping following data frame corresponding to the following role based on the distance between the second following frame position and the fourth target frame position.
Specifically, the determining the stopping following data frame corresponding to the following role specifically includes:
and if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, taking the fourth target frame position as a first stopping position of the following role, and determining the stopping following data frame according to the second following frame position and the first stopping position.
Specifically, the determining the stopping following data frame corresponding to the following role specifically includes:
if the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, determining a second stopping position of the following role based on a third target frame position corresponding to a third target data frame, and determining the stopping following data frame according to the second following frame position and the second stopping position.
Specifically, the receiving of the role following instruction specifically includes:
receiving the role following instruction sent by a target role terminal; and/or
And receiving the role following instruction sent by the following role terminal.
Specifically, if the following role includes a plurality of following roles, before determining the following movement path frame data corresponding to the following role, the method further includes:
respectively determining the following sequence of each following role, and respectively determining the number difference of preset following frames corresponding to each following role according to the following sequence;
the sending the following moving path frame data to the following role terminal corresponding to the following role identifier specifically includes:
and sending the following movement path frame data to the corresponding following role terminals.
According to another aspect of the present application, there is provided a motion following apparatus for a game character, applied to a server, comprising:
the following instruction receiving module is used for receiving a role following instruction, wherein the role following instruction comprises a target role identifier and at least one following role identifier, and the target role is a player role or an NPC role;
the target data frame acquisition module is used for acquiring target moving path frame data of a target role based on the target role identifier;
and the following data frame determining module is used for determining the following moving path frame data corresponding to the following role according to the preset following frame number difference corresponding to the following role and the target moving path frame data, wherein the following moving path frame data comprises a data frame determined by performing interpolation processing on the position corresponding to the target moving path frame data, and the following moving path frame data is sent to the following role terminal corresponding to the following role identifier, so that the following role can realize motion following of the target role.
Specifically, each target data frame included in the target movement path frame data corresponds to a target frame time and a target frame position; the following data frame determining module specifically includes:
and the synchronous following data frame determining unit is used for determining a synchronous following data frame corresponding to the target data frame in the following moving path frame data based on the preset following frame number difference and the target data frame, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to the corresponding target data frame, and the target following frame is a data frame of which the forward direction of the target data frame is different from the preset following frame number difference.
Specifically, the following data frame determining module specifically includes:
an initial frame time obtaining unit, configured to obtain a first target frame position and a first target frame time corresponding to a first target data frame in the target movement path frame data, and obtain a first following frame position of the following role, where the first target data frame is a first frame in the target movement path frame data;
an initial frame position determining unit, configured to perform interpolation based on the first target frame position and the first following frame position to obtain an initial following frame position that matches the difference between the preset following frames;
and the initial following frame determining unit is used for marking the initial following frame time corresponding to the initial following frame position and determining an initial following data frame in the following moving path frame data, wherein the first initial following frame time is the first target frame time.
Specifically, the following data frame determining module specifically includes:
a first synchronization frame data obtaining unit, configured to obtain a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, where the second target data frame is a data frame that differs by the preset number difference of following frames after the first target data frame;
a starting frame position determining unit, configured to perform interpolation based on the first target frame position and the second target frame position to obtain a starting following frame position that matches the preset following frame difference number, where the starting following frame position includes the second target frame position;
and the starting frame data determining unit is used for marking the starting following frame time corresponding to the starting following needle position according to the first target frame time and the second target frame time and determining the starting following frame data in the following moving path frame data.
Specifically, the following data frame determination module further includes:
a second starting frame data obtaining unit, configured to obtain all starting target frame positions corresponding to the first target data frame and the second target data frame before performing interpolation based on the first target frame position and the second target frame position;
and the starting judging unit is used for carrying out interpolation based on the first target frame position and the second target frame position if the starting target frame position has repetition.
Specifically, each target data frame included in the target movement path frame data further corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
Specifically, each target data frame included in the target movement path frame data further corresponds to a target frame speed; the following data frame determining module specifically includes:
a step-out frame state obtaining unit, configured to obtain, when a target frame speed corresponding to any one third target data frame in the target data frames is zero, a step-out target frame state corresponding to a step-out target data frame before the third target data frame, which is equivalent to the preset following frame number difference;
the state switching frame acquisition unit is used for acquiring a fourth target data frame with changed state and determining a fourth target frame position corresponding to the fourth target data frame when the stop target frame states comprise a plurality of states;
a stop frame position obtaining unit, configured to obtain, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to a synchronized following data frame that is time-synchronized;
and the step-out following frame determining unit is used for determining a step-out following data frame corresponding to the following role based on the distance between the second following frame position and the fourth target frame position.
Specifically, the stopping following frame determining unit specifically includes:
and the first stopping following frame determining subunit is configured to, if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, use the fourth target frame position as the first stopping position of the following role, and determine the stopping following data frame according to the second following frame position and the first stopping position.
Specifically, the stopping following frame determining unit specifically includes:
a second stopping following frame determining subunit, configured to determine, if a distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, a second stopping position of the following role based on a third target frame position corresponding to the third target data frame, and determine the stopping following data frame according to the second following frame position and the second stopping position.
Specifically, the following instruction receiving module specifically includes:
the first following instruction receiving unit is used for receiving the role following instruction sent by the target role terminal; and/or
And the second following instruction receiving unit is used for receiving the role following instruction sent by the following role terminal.
Specifically, the following data frame determination module further includes:
a following frame number difference determining unit, configured to determine a following sequence of each following role before determining following movement path frame data corresponding to the following role if the following role includes a plurality of following roles, and determine a preset following frame number difference corresponding to each following role according to the following sequence;
and the data sending unit is used for sending the following movement path frame data to the corresponding following role terminals.
According to still another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the motion following method of a game character described above.
According to still another aspect of the present application, there is provided a computer apparatus including a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the motion following method of the game character described above when executing the program.
By means of the above technical solution, in the game character movement following method and apparatus, storage medium and computer device of the present application, the server obtains the target movement path frame data corresponding to the target character according to the character following instruction, determines the following movement path frame data corresponding to the following character on the basis of the target movement path frame data and a preset following frame number difference, and sends the following movement path frame data to the following character terminal, so that the following character terminal controls the following character to move frame by frame according to the following movement path frame data and thereby follow the target character. Compared with the prior art, the complex process of determining the following movement path frame data is handed to a high-performance server, which greatly improves processing efficiency and avoids the stuttering, frame loss, performance loss and even loss of following caused by excessive performance pressure on the client during following. It also solves the problem that NPC characters in a game scene cannot be followed in the prior art, providing technical support for adding gameplay in the game development stage and allowing players to experience more gameplay, thereby improving the players' game experience and the competitiveness of the game product.
The foregoing is only an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of the description, and in order to make the above and other objects, features and advantages of the present application more apparent, specific embodiments of the present application are described in detail below.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart illustrating a method for following a movement of a game character according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of another method for following the movement of a game character according to an embodiment of the present application;
FIG. 3 is a flow chart of another method for following the movement of a game character according to an embodiment of the present application;
FIG. 4 is a flow chart illustrating another method for following the movement of a game character according to an embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating another method for following the movement of a game character according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a motion following device for a game character according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of a motion following device for a game character according to another embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In this embodiment, a method for following a movement of a game character is provided, which is applied to a server, and as shown in fig. 1, the method includes:
Step 101, receiving a character following instruction, wherein the character following instruction comprises a target character identifier and at least one following character identifier, and the target character is a player character or an NPC character;
Step 102, acquiring target movement path frame data of the target character based on the target character identifier;
Step 103, determining following movement path frame data corresponding to the following character according to a preset following frame number difference corresponding to the following character and the target movement path frame data, wherein the following movement path frame data includes data frames determined by interpolation based on the target movement path frame data; and sending the following movement path frame data to the following character terminal corresponding to the following character identifier, so that the following character follows the movement of the target character.
The method of the present application is applied to a game server. After receiving a character following instruction from a player client, the server parses the instruction to obtain the target character identifier and the following character identifier, thereby determining the character to be followed (the target character) and the following character indicated by the instruction. There may be one or more following characters. The character following instruction may come from the target character's client: for example, in a multiplayer cooperative game, a player forms a team in the game scene and the team leader initiates a team-following instruction to instruct one or more team members to follow him or her. Alternatively, the instruction may come from a following character's client: for example, player A clicks player B's avatar in the game scene to initiate following, and player A then moves through the scene following player B. In addition, the character following instruction may indicate that a player follows another player (for example, player A follows player B) or that a player follows an NPC in the game scene (for example, player A follows the NPC character 'small fishing village').
After receiving the character following instruction, the server determines the target character's client according to the character identifier corresponding to the target character, and then acquires the target movement path frame data corresponding to the target character from that client. The target movement path frame data is the target character's movement path recorded frame by frame at and after the creation of the character following instruction. For example, if the frame rate in the game scene is 10 Hz (that is, one second is represented by 10 frames) and the character following instruction is created at 1 o'clock, the server acquires the target character's data frames at and after 1 o'clock, i.e., the data frames corresponding to 1:00:00.0, 1:00:00.1, and so on. Each data frame should at least include the target character's position data and an identifier representing the time corresponding to the data frame. The identifier may be a sequence number: for example, 'frame 1' may denote the target character's data frame at 1:00:00.0 and 'frame 2' the data frame at 1:00:00.1, and so on; alternatively, the identifier may be the actual time corresponding to the data frame or an encoding of that time. Furthermore, the target character's client may record all of the target character's movement path frame data in real time, so that the server can retrieve the target character's historical data at any time; it may record the target character's movement path frame data only after a character following instruction has been created, to provide data support for other characters to follow the target character; or it may record only the most recent movement path frame data generated by the target character, for example the data within the last minute, with earlier data deleted.
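For the 'record only the most recent movement path frame data' option mentioned above, a bounded buffer is one possible realization. The following Python sketch is illustrative only: the class, field and parameter names are assumptions rather than the patent's implementation, with the 10 Hz frame rate and one-minute retention window taken from the example above.

```python
from collections import deque
from dataclasses import dataclass
from typing import Tuple

FRAME_RATE = 10          # assumed frames per second of the game scene
RETENTION_SECONDS = 60   # keep roughly the last minute of movement data

@dataclass
class PathFrame:
    time: float                    # actual or game-world time of the frame
    position: Tuple[float, float]  # position of the target character in the scene

class MovementRecorder:
    """Records the target character's movement path frame by frame and
    silently discards frames older than the retention window."""

    def __init__(self) -> None:
        self.frames = deque(maxlen=FRAME_RATE * RETENTION_SECONDS)

    def record(self, time: float, position: Tuple[float, float]) -> None:
        self.frames.append(PathFrame(time, position))

    def frames_since(self, start_time: float):
        # Frames generated at or after the character following instruction.
        return [f for f in self.frames if f.time >= start_time]
```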
After acquiring the target movement path frame data corresponding to the target character, the server determines the following character's movement path frame data based on that data and a preset following frame number difference. The preset following frame number difference is the number of frames by which the following character's movement path lags the target character's movement path. When there are multiple following characters, the following frame number differences of the individual following characters may be determined in turn according to their following order, increasing with that order; for example, the preset following frame number difference of the first following character may be 5 frames, that of the second following character 8 frames, and so on. Assuming the target character has one following character with a preset following frame number difference of 5 frames, then in the following state, if the target character's current position is A, the following character arrives at A 0.5 seconds later (preset following frame number difference / frame rate = 5 / 10 = 0.5 seconds). After determining the following movement path frame data, the server sends it to the following character's client, which can control the following character to move according to the received data, producing the effect that the following character moves along with the target character; the following character's player moves with the target character without having to perform any operation.
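A minimal sketch of this frame-offset relationship (Python, illustrative only; the function name and the (time, position) tuple layout are assumptions, not the patent's implementation): each following frame keeps its own frame time but reuses the position the target recorded the preset following frame number difference earlier, so a 5-frame difference at 10 Hz gives the 0.5-second delay computed above.

```python
from typing import List, Tuple

Frame = Tuple[float, Tuple[float, float]]  # (frame time, position)

def build_sync_follow_frames(target_frames: List[Frame], offset: int) -> List[Frame]:
    """For each target frame from index `offset` onwards, emit a following
    frame with the same frame time but the position the target recorded
    `offset` frames earlier."""
    follow_frames = []
    for i in range(offset, len(target_frames)):
        frame_time, _ = target_frames[i]
        _, lagged_position = target_frames[i - offset]
        follow_frames.append((frame_time, lagged_position))
    return follow_frames

# 10 Hz target path, 5-frame difference -> the follower trails by 0.5 seconds.
target = [(t / 10.0, (float(t), 0.0)) for t in range(20)]
print(build_sync_follow_frames(target, 5)[:3])
# [(0.5, (0.0, 0.0)), (0.6, (1.0, 0.0)), (0.7, (2.0, 0.0))]
```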
In addition, in some special following phases, the following character's following movement path frame data can be determined by interpolation based on the positions corresponding to the target movement path frame data and the following character's own position. For example, in the initial following phase, the character following instruction has already been issued, but because of the preset following frame number difference the following character has not yet started to follow the target character; interpolation can then be performed between the following character's position and the target character's initial position in the target movement path frame data, and the following character's initial following data frames are determined from the interpolation result. As another example, the target character may show abrupt changes of speed over several consecutive frames, for instance moving 10 meters from the first frame to the second, 1 meter from the second to the third, and 8 meters from the third to the fourth; in that case, following data frame positions that present a smooth movement effect can be obtained by interpolating between the positions corresponding to the first and last frames of those several frames of data.
By applying the technical solution of this embodiment, the server acquires the target movement path frame data corresponding to the target character according to the character following instruction, determines the following movement path frame data corresponding to the following character on the basis of the target movement path frame data and the preset following frame number difference, and sends the following movement path frame data to the following character terminal, so that the following character terminal controls the following character to move frame by frame according to the following movement path frame data and thereby follow the target character. Compared with the prior art, the complex process of determining the following movement path frame data is handed to a high-performance server, which greatly improves processing efficiency and avoids the stuttering, frame loss, performance loss and even loss of following caused by excessive performance pressure on the client. It also solves the problem that NPC characters in the game scene cannot be followed in the prior art, providing technical support for adding gameplay during game development and allowing players to experience more gameplay, thereby improving the players' game experience and the competitiveness of the game product.
Further, as a refinement and extension of the specific implementation of the above embodiment, and in order to fully explain its implementation process, another game character movement following method is provided, as shown in FIG. 2.
In order to improve the presentation of the following function, the embodiments of the present application provide multiple ways of determining the following movement path frame data, which may be used in different phases of following the target. Specifically, step 103 may include the following:
in the first mode, in the synchronous following phase of following roles, as shown in fig. 2, step 103 specifically includes:
step 103-1, determining a synchronous following data frame corresponding to the target data frame in the following moving path frame data based on the preset following frame number difference and the target data frame, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to the corresponding target data frame, and the target following frame is a data frame of which the forward phase difference of the target data frame is the preset following frame number difference.
In the above embodiment, each target data frame included in the target movement path frame data corresponds to a target frame time and a target frame position. The synchronous following phase is the phase in which the following character moves exactly along the target character's movement track; by contrast, there are also non-synchronous following phases. For example, in the target character's starting phase the target character may stop and go; to improve the presentation of following, the following character may move in a smoother manner instead of reproducing the target character's stop-and-go movement, and in that case the following character does not move according to the target frame positions of the target character's target data frames.
For the synchronous following phase, synchronous following data frames are generated in the following movement path frame data; in an actual game scene, most of the following process is synchronous following. In the embodiments of the present application, each target data frame corresponds to a target frame time reflecting the actual time or game-world time of that frame data, and the following frame time corresponding to a synchronous following data frame is the same as the target frame time corresponding to a target data frame.
For example, suppose the following character starts to follow the target character at 1 o'clock, and the target character's movement path from 1:00:00.5 onward is followed synchronously. With a preset following frame number difference of 5 and a game frame rate of 10 Hz, the following time difference between the following character and the target character is 0.5 seconds, so the following character should reproduce at 1:00:01.0 the target character's movement at 1:00:00.5. The synchronous following start time of the following character is thus determined to be 1:00:01.0: the data frame corresponding to 1:00:01.0 is found in the target movement path frame data and 1:00:01.0 is taken as the following frame time of that synchronous following frame; then, shifting 5 frames forward, the target frame position corresponding to the target data frame at 1:00:00.5 is obtained and taken as the position of that synchronous following data frame. Likewise, if the following frame time of a synchronous following frame is 1:00:01.1, its following frame position is the position corresponding to the target frame at 1:00:00.6. All synchronous following frames are determined by analogy, and the following movement path frame data is thereby determined.
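The timing in this example can be checked with a small helper function (illustrative only; the function name is an assumption, and times are expressed in seconds after 1 o'clock):

```python
FRAME_RATE = 10   # assumed 10 Hz, as in the example above
OFFSET = 5        # preset following frame number difference

def follow_source_time(follow_time: float) -> float:
    """For a following frame at follow_time (seconds after 1 o'clock), return
    the target frame time whose position that following frame reuses."""
    frame_index = round(follow_time * FRAME_RATE)
    return (frame_index - OFFSET) / FRAME_RATE

print(follow_source_time(1.0))  # 0.5: the frame at 1:00:01.0 reuses 1:00:00.5
print(follow_source_time(1.1))  # 0.6: the frame at 1:00:01.1 reuses 1:00:00.6
```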
In a second mode, in the initial following phase of the following character, as shown in FIG. 3, step 103 specifically includes:
Step 103-2-1, acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target movement path frame data, and acquiring a first following frame position of the following character, wherein the first target data frame is the first frame in the target movement path frame data;
Step 103-2-2, performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose number matches the preset following frame number difference;
Step 103-2-3, marking the initial following frame times corresponding to the initial following frame positions, and determining initial following data frames in the following movement path frame data, wherein the first initial following frame time is the first target frame time.
In the above embodiment, the initial following phase of the following character refers to the phase in which the character following instruction has been issued but, because of the preset following frame number difference, the following character has not yet started to follow the target character. For example, if the preset following frame number difference is 5, the following character must wait until the target character has moved for 5 frames before it starts to move along the target character's path. To avoid the poor following presentation caused by the following character waiting in place for the duration of those 5 frames, initial following data frames are set for the initial following phase so that the following character moves according to them instead of waiting in place. Specifically, the first frame in the target movement path frame data is acquired as the first target data frame, together with the corresponding first target frame position (the target character's initial position) and first target frame time. The first target frame time is taken as the following character's following start time, from which the following character enters the following state, and the following character's data frame corresponding to that time is its first following frame. The following character's position at the following start time, i.e., the first following frame position, is then acquired, interpolation is performed between the first target frame position and the first following frame position, and the following character's initial following data frames are determined from the interpolation result: the interpolated positions serve as the position information of the initial following data frames, and their time information is synchronized with the target data frames.
For example, with a preset following frame number difference of 5, suppose the first target data frame of the target character corresponds to 1 o'clock and position A, and the following character's position at 1 o'clock is B. Interpolating between position A and position B yields 4 interpolated positions in addition to position B. Position B, interpolation 1, interpolation 2, interpolation 3 and interpolation 4 are labelled with the corresponding time information in order of decreasing distance from position A: position B corresponds to 1:00:00.0, interpolation 1 to 1:00:00.1, interpolation 2 to 1:00:00.2, and so on. The initial following data frames are thereby determined. Moving according to these frames, the following character is shown approaching the target character during the first 5 frames of following instead of standing still, which prevents the following character's client from displaying the character as waiting in place and improves the expressiveness of the following state in the game scene.
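A minimal sketch of this initial-phase interpolation (Python; the function names and 2-D positions are illustrative assumptions): the follower's first frames, equal in number to the preset following frame number difference, walk evenly from its own position B toward the target's first recorded position A, after which synchronous following takes over at A.

```python
from typing import List, Tuple

Position = Tuple[float, float]

def lerp(p: Position, q: Position, t: float):
    return tuple(a + (b - a) * t for a, b in zip(p, q))

def initial_follow_frames(follower_pos: Position, target_first_pos: Position,
                          target_first_time: float, offset: int,
                          frame_rate: int) -> List[Tuple[float, Position]]:
    """Initial following phase: before lagged target positions are available,
    move the follower from its own position toward the target's first
    recorded position over `offset` evenly spaced frames."""
    frames = []
    for k in range(offset):                       # k = 0 .. offset - 1
        frame_time = target_first_time + k / frame_rate
        frames.append((frame_time, lerp(follower_pos, target_first_pos, k / offset)))
    return frames

# Offset 5 at 10 Hz: frames at +0.0 .. +0.4 s walk from B toward A;
# at +0.5 s synchronous following takes over at position A itself.
print(initial_follow_frames((0.0, 0.0), (5.0, 0.0), 0.0, 5, 10))
```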
In a third mode, in the starting following phase of the following character, as shown in FIG. 4, step 103 specifically includes:
Step 103-3-1, acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target movement path frame data, wherein the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
Step 103-3-2, performing interpolation based on the first target frame position and the second target frame position to obtain starting following frame positions whose number matches the preset following frame number difference, wherein the starting following frame positions include the second target frame position;
Step 103-3-3, marking the starting following frame times corresponding to the starting following frame positions according to the first target frame time and the second target frame time, and determining the starting following frame data in the following movement path frame data.
In the above embodiment, the starting following phase of the following character refers to the phase in which the following character follows the target character's starting movement path. For example, after the character following instruction is issued, the target character changes from its original stopped state to a moving state and, for a short time after starting to move, may stop and go irregularly. If the following character copied the target character's movement path frame by frame, it would reproduce this stop-and-go state, which reduces the expressiveness of following.
Specifically, the first frame in the target movement path frame data is acquired as the first target data frame, together with the second target data frame that follows it by the preset following frame number difference, and the first target frame position and second target frame position corresponding to these frames are acquired. For example, with a preset following frame number difference of 5, the first target data frame is the data frame corresponding to 1:00:00.0 and the second target data frame is the data frame corresponding to 1:00:00.5. Interpolation is then performed based on the first target frame position and the second target frame position to obtain 5 interpolated positions (5 being the preset following frame number difference), including the second target frame position, as the starting following frame positions. Corresponding time information is then marked on each starting following frame position in turn to obtain complete starting following frames, which are sent to the following character terminal as part of the following movement path frame data to guide the following character's movement path, so that the following character's movement in the starting phase is smooth and the following effect is improved.
In addition, in the above embodiment, if the target character does not exhibit a stop-and-go starting state after following is initiated, the following character's following data frames are determined directly from the target character's target data frames without smoothing the position information in them. Specifically, after step 103-3-1 and before step 103-3-2, the method may further include: acquiring all starting target frame positions corresponding to the first target data frame and the second target data frame; and if any of the starting target frame positions are repeated, performing step 103-3-2. Repetition among the acquired starting target frame positions indicates that the target character exhibited a stop-and-go starting state after the following instruction was initiated, and that starting state is smoothed according to the method described above.
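A sketch of the start-following smoothing together with the repeated-position check just described (Python; names and data layout are illustrative assumptions, not the patent's implementation):

```python
from typing import List, Optional, Tuple

Frame = Tuple[float, Tuple[float, float]]  # (frame time, position)

def lerp(p, q, t):
    return tuple(a + (b - a) * t for a, b in zip(p, q))

def start_follow_frames(target_frames: List[Frame], offset: int) -> Optional[List[Frame]]:
    """Start-following phase: if the target's first `offset + 1` recorded
    positions contain repeats (a stop-and-go start), replace the follower's
    corresponding frames with an even interpolation from the first target
    position to the position `offset` frames later, ending exactly on it.
    Returns None when no smoothing is needed."""
    first_time, first_pos = target_frames[0]
    second_time, second_pos = target_frames[offset]
    window = [pos for _, pos in target_frames[:offset + 1]]

    if len(set(window)) == len(window):
        return None  # no repeated positions: follow the recorded target frames as-is

    frames = []
    for k in range(1, offset + 1):               # k = 1 .. offset
        frame_time = first_time + k * (second_time - first_time) / offset
        frames.append((frame_time, lerp(first_pos, second_pos, k / offset)))
    return frames
```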
In a fourth mode, in the stopping following phase of the following character, as shown in FIG. 5, step 103 specifically includes:
Step 103-4-1, when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring the stopping target frame states corresponding to the stopping target data frames that precede the third target data frame by up to the preset following frame number difference;
Step 103-4-2, when the stopping target frame states include a plurality of different states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
Step 103-4-3, acquiring, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
Step 103-4-4, determining the stopping following data frames corresponding to the following character based on the distance between the second following frame position and the fourth target frame position.
In the above embodiment, it should be noted that each target data frame included in the target movement path frame data further corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference. That is, a following data frame follows not only the position but also the state of the corresponding target data frame, where the target frame state may include a flight state, a walking state, and so on. If the target character is in a flight state at position A, the following character is also in a flight state when it follows to position A; if the target character switches from the flight state to a walking state at position B, the following character switches from the flight state to the walking state when it follows to position B. On this basis, there are some special cases. For example, the target character may stop at position D shortly after switching from the flight state to the walking state at position C. With purely synchronous following, the following character would also switch from the flight state to the walking state at position C and then stop at position D; but if the following character stopped at exactly the same position as the target character, the two characters would overlap in the game scene and the display effect would be poor. Therefore, when the target character stops moving, the following character should stop at a certain distance from the target character; and if the target character's state changes during the stopping phase, the following character's stopping following data frames can be determined through the above embodiment.
Specifically, first, the data frames corresponding to the target character in the stopping phase are acquired. A target data frame whose speed is zero indicates that the target character has stopped moving at the position of that frame; therefore, the third target data frame whose target frame speed is zero and the 5 data frames preceding it (5 being the preset following frame number difference), including the third target data frame, are acquired as the stopping target data frames. Second, whether the target character switched state during the stopping phase is judged by acquiring the stopping target frame states corresponding to the stopping target data frames: if the stopping target frame states include a plurality of different states, the target character switched state during the stopping phase. Then, once it is determined that the target character switched state during the stopping phase, the fourth target data frame at which the state changed and the corresponding fourth target frame position, i.e., the position where the target character's state switched, are acquired. Next, the following character's position at the third target frame time (the time at which the target character stopped moving), i.e., the second following frame position, is acquired. Finally, the following character's stopping following data frames are set according to the distance between the second following frame position and the fourth target frame position, i.e., the distance between the position where the target character's state switched and the following character's position at the moment the target character stopped moving.
In step 103-4-4 of the above embodiment, specifically, if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, the fourth target frame position is taken as the first stopping position of the following character, and the stopping following data frames are determined according to the second following frame position and the first stopping position. If the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, a second stopping position of the following character is determined based on the third target frame position corresponding to the third target data frame, and the stopping following data frames are determined according to the second following frame position and the second stopping position.
In this embodiment, assume the target character switches from the flight state to the walking state during the stopping phase: it switches from flight to walking at position C and stops at position D shortly afterwards, and the following character has moved to position E by the time the target character stops at D. Since the target character does not move after stopping at D, the following character's stopping following data frames are determined according to the distance between position E and position D. If the distance between position E and position D is small (less than or equal to the preset stopping following distance), the following character is already close to position D when the target character stops moving. In order to show the following character performing the same state switch as the target character, the following character moves to position C and then stops: that is, the fourth target frame position (the target character's state switching position) is taken as the following character's first stopping position, and the stopping following data frames are set based on the fourth target frame position (ensuring that the position corresponding to the last frame of the stopping following data frames is the fourth target frame position). Moving according to these stopping following data frames, the following character is shown moving to position C and stopping there; because the following character's state follows the target character's state, it is shown in the flight state before position C and in the walking state at position C.
If position E is far from position D (the distance is greater than the preset stopping following distance), the following character is still far from the target character when the target character stops moving. To reflect that the following character follows the target character in both position and state, while keeping the two characters from overlapping when the following character stops, the stopping position of the following character is determined based on position D (the third target frame position), i.e. the stopping following data frame of the following character is determined from the position where the target character stopped moving. For example, a point on the target character's path at a certain distance from position D is taken as the second stopping position F based on the target character's movement direction (for instance, the point 0.5 m away from position D in the game scene is taken as the stopping position), and the stopping following data frame is set based on the second stopping position. Because the state of the following character follows the state of the target character, moving according to the stopping following data frame the following character is shown moving to position F and stopping there, appearing in the flight state before position C and in the walking state at and after position C.
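The decision rule in the last three paragraphs can be summarized in a short sketch. The 1.0 m stopping following distance, the parameter names and the unit movement-direction vector are assumptions made for illustration; only the 0.5 m back-off figure appears in the example above.

```python
import math

def choose_stop_position(second_follow_pos, fourth_target_pos, third_target_pos,
                         move_dir, stop_follow_distance=1.0, back_off=0.5):
    """Pick the follower's final position for the stopping following data frame.

    second_follow_pos -- position of the follower when the target stops (E)
    fourth_target_pos -- position where the target switched state (C)
    third_target_pos  -- position where the target stopped moving (D)
    move_dir          -- unit vector of the target's movement direction
    """
    if math.dist(second_follow_pos, fourth_target_pos) <= stop_follow_distance:
        # Close enough: stop exactly at the state-switch position C, so the
        # follower visibly reproduces the state switch.
        return fourth_target_pos
    # Too far: stop back_off metres short of the stop position D along the
    # target's path, so the two characters do not overlap.
    return (third_target_pos[0] - back_off * move_dir[0],
            third_target_pos[1] - back_off * move_dir[1])
```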
Determining the following data frames through the above four embodiments allows the following character to move towards the target character in the initial stage instead of staying in place, to start smoothly in the starting stage (avoiding the stop-and-go behaviour that would appear if the following character copied the target character exactly), to follow the position and state of the target character completely in the synchronization stage, and, in the stopping stage, to fully reproduce the state-switching process of the target character and stop at a position that does not overlap the target character. In these ways the presentation of the following character during following is improved and the user's game experience is enhanced; moreover, the calculation is performed by a high-performance server, which guarantees the accuracy of the calculation without putting performance pressure on the client.
In addition, in any embodiment of the present application, specifically, if there are a plurality of following roles, the following order of each following role is determined separately, and the preset following frame number difference corresponding to each following role is determined according to the following order. Correspondingly, the following movement path frame data are sent to the respective corresponding following role terminals.
In this embodiment, one target character may correspond to a plurality of following characters. For example, a game team in a game scene contains 3 players; the team leader initiates following within the team, and the other two members follow the leader. To keep the team members from overlapping during following, a different preset following frame number difference can be set for each of them, for example 5 for following character 1 and 10 for following character 2. On this basis, according to the method for determining following movement path frame data provided by the embodiments of the present application, the following movement path frame data corresponding to each following role are determined separately and sent to the corresponding following role terminals, so that the characters do not overlap while following the target character and the following display effect is improved.
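One possible assignment rule, consistent with the 5/10 example above, might look like the following sketch. The multiply-by-order rule and the identifier strings are assumptions; the embodiment only requires the differences to differ between followers.

```python
def assign_frame_diffs(follower_ids_in_order, base_diff=5):
    """Assign each following character, in following order, a larger preset
    following frame number difference so that the followers trail the target
    (and each other) without overlapping."""
    return {fid: base_diff * (order + 1)
            for order, fid in enumerate(follower_ids_in_order)}


# Example for the three-person team above: the leader is followed by two members.
print(assign_frame_diffs(["following_character_1", "following_character_2"]))
# {'following_character_1': 5, 'following_character_2': 10}
```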
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides a motion following apparatus for a game character, which is applied to a server, and as shown in fig. 6, the apparatus includes:
a following instruction receiving module 61, configured to receive a role following instruction, where the role following instruction includes a target role identifier and at least one following role identifier, and the target role is a player role or an NPC role;
a target data frame obtaining module 62, configured to obtain target movement path frame data of a target role based on the target role identifier;
and a following data frame determining module 63, configured to determine following movement path frame data corresponding to the following role according to a preset following frame number difference corresponding to the following role and the target movement path frame data, where the following movement path frame data includes a data frame determined by performing interpolation processing based on a position corresponding to the target movement path frame data, and send the following movement path frame data to a following role terminal corresponding to the following role identifier, so that the following role realizes motion following of the target role.
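As a rough server-side sketch of how the three modules could cooperate, the class below uses hypothetical names (MotionFollowingService, path_store, client_gateway); it is not the apparatus itself, and the per-stage logic is sketched separately after the corresponding units below.

```python
class MotionFollowingService:
    """Sketch of the three modules working together on the server side.
    path_store maps a role identifier to its movement path frame data;
    client_gateway pushes frame data to a role's terminal. Both are
    hypothetical collaborators supplied by the caller."""

    def __init__(self, path_store, client_gateway, frame_diff=5):
        self.path_store = path_store
        self.client_gateway = client_gateway
        self.frame_diff = frame_diff  # preset following frame number difference

    def on_role_following_instruction(self, target_id, follower_ids):
        # Module 61: the instruction carries the target and follower identifiers.
        target_frames = self.path_store.get(target_id)               # module 62
        for follower_id in follower_ids:
            follow_frames = self.build_follow_frames(target_frames)  # module 63
            self.client_gateway.send(follower_id, follow_frames)

    def build_follow_frames(self, target_frames):
        # Placeholder: the start, synchronization and stop stages are shown
        # in the individual sketches that follow in this section.
        return []
```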
In a specific application scenario, as shown in fig. 7, each target data frame included in the target movement path frame data corresponds to a target frame time and a target frame position; the following data frame determining module 63 specifically includes:
the synchronous following data frame determining unit 6301 is configured to determine, based on the preset following frame number difference and the target data frame, a synchronous following data frame corresponding to the target data frame in the following moving path frame data, where a following frame time corresponding to the synchronous following data frame is the same as a target frame time corresponding to the target data frame, a following frame position corresponding to the synchronous following data frame is determined based on a position of the target following frame corresponding to the corresponding target data frame, and the target following frame is a data frame of which a forward direction of the target data frame is different by the preset following frame number difference.
In a specific application scenario, as shown in fig. 7, the following data frame determining module 63 specifically includes:
an initial frame time obtaining unit 6302, configured to obtain a first target frame position and a first target frame time corresponding to a first target data frame in target movement path frame data, and obtain a first following frame position of a following role, where the first target data frame is a first frame in the target movement path frame data;
an initial frame position determining unit 6303, configured to perform interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose number matches the preset following frame number difference;
an initial following frame determining unit 6304, configured to mark the initial following frame time corresponding to each initial following frame position and determine the initial following data frames in the following movement path frame data, where the first initial following frame time is the first target frame time.
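Units 6302 to 6304 could be sketched as follows. A linear interpolation is assumed; the frame_interval time step and whether the last interpolated point coincides with the first target frame position are illustrative choices not fixed by the description.

```python
def initial_follow_frames(first_target_pos, first_target_time,
                          first_follow_pos, frame_diff=5, frame_interval=0.1):
    """Initial stage: interpolate frame_diff positions from the follower's current
    position towards the first target frame position, so the follower starts
    moving instead of standing still. The first frame is stamped with the first
    target frame time; frame_interval is an assumed per-frame time step."""
    frames = []
    for k in range(frame_diff):
        a = (k + 1) / frame_diff     # interpolation parameter in (0, 1]
        pos = (first_follow_pos[0] + a * (first_target_pos[0] - first_follow_pos[0]),
               first_follow_pos[1] + a * (first_target_pos[1] - first_follow_pos[1]))
        frames.append({"time": first_target_time + k * frame_interval, "pos": pos})
    return frames
```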
In a specific application scenario, as shown in fig. 7, the following data frame determining module 63 specifically includes:
a first synchronization frame data acquiring unit 6305, configured to acquire a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target movement path frame data, where the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
a starting frame position determining unit 6306, configured to perform interpolation based on the first target frame position and the second target frame position to obtain starting following frame positions whose number matches the preset following frame number difference, where the starting following frame positions include the second target frame position;
a starting frame data determining unit 6307, configured to mark, according to the first target frame time and the second target frame time, the starting following frame time corresponding to each starting following frame position, and determine the starting following frame data in the following movement path frame data.
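A corresponding sketch of units 6305 to 6307 follows; linear interpolation and an even spread of the frame times are assumptions, since the description only requires the starting following frame positions to include the second target frame position.

```python
def starting_follow_frames(first_frame, second_frame, frame_diff=5):
    """Start stage: interpolate frame_diff positions between the first and second
    target frame positions (the second frame lies frame_diff frames after the
    first), ending exactly at the second target frame position, and spread the
    starting following frame times evenly between the two target frame times.
    Frames are dicts with keys "time" and "pos"."""
    p0, p1 = first_frame["pos"], second_frame["pos"]
    t0, t1 = first_frame["time"], second_frame["time"]
    frames = []
    for k in range(1, frame_diff + 1):
        a = k / frame_diff
        frames.append({"time": t0 + a * (t1 - t0),
                       "pos": (p0[0] + a * (p1[0] - p0[0]),
                               p0[1] + a * (p1[1] - p0[1]))})
    return frames
```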
In a specific application scenario, as shown in fig. 7, the following data frame determining module 63 further includes:
a second starting frame data obtaining unit 6308, configured to obtain all starting target frame positions corresponding to the first target data frame and the second target data frame before performing interpolation based on the first target frame position and the second target frame position;
a starting determining unit 6309, configured to perform interpolation based on the first target frame position and the second target frame position if there are repetitions among the starting target frame positions.
In a specific application scenario, each target data frame included in the target movement path frame data also corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state corresponding to the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
In a specific application scenario, as shown in fig. 7, each target data frame included in the target movement path frame data also corresponds to a target frame speed; the following data frame determining module 63 specifically includes:
a stop frame state acquiring unit 6310, configured to acquire, when the target frame speed corresponding to any third target data frame among the target data frames is zero, the stopping target frame states corresponding to the stopping target data frames before the third target data frame, where the number of stopping target data frames before the third target data frame equals the preset following frame number difference;
a state switching frame acquiring unit 6311, configured to acquire, when the stopping target frame states include a plurality of states, a fourth target data frame at which the state changes, and determine a fourth target frame position corresponding to the fourth target data frame;
a stop frame position acquiring unit 6312, configured to acquire, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
a stopping following frame determining unit 6313, configured to determine a stopping following data frame corresponding to the following character based on the distance between the second following frame position and the fourth target frame position.
In a specific application scenario, not shown in the figure, the stopping following frame determining unit 6313 specifically includes:
a first stopping following frame determining subunit 63131, configured to, if the distance between the second following frame position and the fourth target frame position is less than or equal to the preset stopping following distance, take the fourth target frame position as the first stopping position of the following role, and determine the stopping following data frame according to the second following frame position and the first stopping position.
In a specific application scenario, not shown in the figure, the stopping following frame determining unit specifically includes:
a second stopping following frame determining subunit 63132, configured to determine, if the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, a second stopping position of the following role based on a third target frame position corresponding to a third target data frame, and determine a stopping following data frame according to the second following frame position and the second stopping position.
In a specific application scenario, as shown in fig. 7, the following instruction receiving module 61 specifically includes:
a first following instruction receiving unit 6101, configured to receive a role following instruction sent by the target role terminal; and/or
A second following instruction receiving unit 6102, configured to receive the character following instruction sent by the following character terminal.
In a specific application scenario, as shown in fig. 7, the following data frame determining module 63 further includes:
a following frame number difference determining unit 6314, configured to, when there are a plurality of following roles, determine the following order of each following role before the following movement path frame data corresponding to the following roles are determined, and determine the preset following frame number difference corresponding to each following role according to the following order;
a data transmitting unit 6315, configured to transmit the following movement path frame data to the respective corresponding following character terminals.
It should be noted that other corresponding descriptions of the functional units related to the motion following device for a game character provided in the embodiment of the present application may refer to the corresponding descriptions in the methods of fig. 1 to fig. 5, and are not described herein again.
Based on the method shown in fig. 1 to 5, correspondingly, the embodiment of the present application further provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the motion following method for the game character shown in fig. 1 to 5.
Based on this understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the implementation scenarios of the present application.
Based on the method shown in fig. 1 to fig. 5 and the virtual device embodiment shown in fig. 6 to fig. 7, in order to achieve the above object, the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the above-described motion following method of the game character as shown in fig. 1 to 5.
Optionally, the computer device may also include a user interface, a network interface, a camera, radio frequency (RF) circuitry, sensors, audio circuitry, a WI-FI module, and the like. The user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), and the like; optionally, the user interface may also include a USB interface, a card reader interface, and the like. The network interface may optionally include a standard wired interface or a wireless interface (such as a Bluetooth interface or a WI-FI interface).
It will be appreciated by those skilled in the art that the computer device structure provided in this embodiment does not limit the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may further include an operating system and a network communication module. An operating system is a program that manages and maintains the hardware and software resources of a computer device, supporting the operation of information handling programs, as well as other software and/or programs. The network communication module is used for realizing communication among components in the storage medium and other hardware and software in the entity device.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application may be implemented by software plus a necessary general-purpose hardware platform, or by hardware. The server obtains the target movement path frame data corresponding to the target character according to the character following instruction, determines the following movement path frame data corresponding to the following character on that basis according to the preset following frame number difference, and sends the following movement path frame data to the following character terminal, so that the terminal controls the following character to move frame by frame according to the following movement path frame data and thereby follow the target character. Compared with the prior art, handing the complex process of determining the following movement path frame data to a high-performance server greatly improves processing efficiency and avoids the stuttering, frame loss, performance loss and even loss of following caused by excessive performance pressure on the client. On the other hand, it solves the problem that NPC roles in a game scene could not be followed in the prior art, provides technical support for adding game play during game development, guarantees that players can experience more play modes, and improves the players' game experience and the competitiveness of the game product.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application. Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (14)

1. A motion following method of a game character, applied to a server, characterized by comprising the following steps:
receiving a role following instruction, wherein the role following instruction comprises a target role identification and at least one following role identification, and the target role is a player role or an NPC role;
acquiring target moving path frame data of the target role based on the target role identifier;
and determining the following movement path frame data corresponding to the following role according to the preset following frame number difference corresponding to the following role and the target movement path frame data, wherein the following movement path frame data comprises a data frame determined by performing interpolation processing on the basis of the position corresponding to the target movement path frame data, and sending the following movement path frame data to a following role terminal corresponding to the following role identifier so that the following role can realize motion following of the target role.
2. The method of claim 1, wherein the target movement path frame data comprises target data frames each corresponding to a target frame time and a target frame position; the determining the following movement path frame data corresponding to the following role specifically includes:
determining a synchronous following data frame corresponding to the target data frame in the following movement path frame data based on the preset following frame number difference and the target data frame, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is a data frame that precedes the target data frame by the preset following frame number difference.
3. The method according to claim 2, wherein the determining the following movement path frame data corresponding to the following role specifically includes:
acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target movement path frame data, and acquiring a first following frame position of the following role, wherein the first target data frame is a first frame in the target movement path frame data;
performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose number matches the preset following frame number difference;
marking the initial following frame time corresponding to each initial following frame position, and determining initial following data frames in the following movement path frame data, wherein the first initial following frame time is the first target frame time.
4. The method according to claim 2, wherein the determining the following movement path frame data corresponding to the following role specifically includes:
acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target movement path frame data, wherein the second target data frame is a data frame that follows the first target data frame by the preset following frame number difference;
performing interpolation based on the first target frame position and the second target frame position to obtain starting following frame positions whose number matches the preset following frame number difference, wherein the starting following frame positions comprise the second target frame position;
marking, according to the first target frame time and the second target frame time, the starting following frame time corresponding to each starting following frame position, and determining starting following frame data in the following movement path frame data.
5. The method of claim 4, wherein prior to said interpolating based on said first target frame position and said second target frame position, said method further comprises:
acquiring all starting target frame positions corresponding to the first target data frame and the second target data frame;
and if the starting target frame positions are repeated, performing interpolation based on the first target frame position and the second target frame position.
6. The method according to claim 2, wherein each target data frame included in the target movement path frame data further corresponds to a target frame state, and the following frame state corresponding to the synchronous following data frame is the target frame state corresponding to the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
7. The method of claim 6, wherein each target data frame included in the target movement path frame data further corresponds to a target frame speed; the determining the following movement path frame data corresponding to the following role specifically includes:
when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring stopping target frame states corresponding to the stopping target data frames that precede the third target data frame by the preset following frame number difference;
when the stopping target frame states include a plurality of states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
acquiring, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
and determining a stopping following data frame corresponding to the following role based on the distance between the second following frame position and the fourth target frame position.
8. The method according to claim 7, wherein the determining the stopping following data frame corresponding to the following role specifically includes:
and if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, taking the fourth target frame position as a first stopping position of the following role, and determining the stopping following data frame according to the second following frame position and the first stopping position.
9. The method according to claim 7, wherein the determining the stopping following data frame corresponding to the following role specifically includes:
if the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, determining a second stopping position of the following role based on a third target frame position corresponding to a third target data frame, and determining the stopping following data frame according to the second following frame position and the second stopping position.
10. The method of claim 7, wherein the receiving a role following instruction specifically comprises:
receiving the role following instruction sent by a target role terminal; and/or
receiving the role following instruction sent by the following role terminal.
11. The method according to claim 1, wherein if the following character includes a plurality of following characters, before determining the following movement path frame data corresponding to the following character, the method further comprises:
respectively determining the following sequence of each following role, and respectively determining the number difference of preset following frames corresponding to each following role according to the following sequence;
the sending the following moving path frame data to the following role terminal corresponding to the following role identifier specifically includes:
and sending the following movement path frame data to the corresponding following role terminals.
12. A motion following device for a game character, applied to a server, comprising:
the following instruction receiving module is used for receiving a role following instruction, wherein the role following instruction comprises a target role identifier and at least one following role identifier, and the target role is a player role or an NPC role;
the target data frame acquisition module is used for acquiring target moving path frame data of a target role based on the target role identifier;
and the following data frame determining module is used for determining the following moving path frame data corresponding to the following role according to the preset following frame number difference corresponding to the following role and the target moving path frame data, wherein the following moving path frame data comprises a data frame determined by performing interpolation processing on the position corresponding to the target moving path frame data, and the following moving path frame data is sent to the following role terminal corresponding to the following role identifier, so that the following role can realize motion following of the target role.
13. A storage medium on which a computer program is stored, characterized in that the computer program realizes a motion following method of a game character according to any one of claims 1 to 11 when executed by a processor.
14. A computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, wherein the processor implements the motion following method of a game character according to any one of claims 1 to 11 when executing the computer program.
CN202010993884.0A 2020-09-21 2020-09-21 Game character movement following method and device, storage medium and computer equipment Pending CN112107862A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010993884.0A CN112107862A (en) 2020-09-21 2020-09-21 Game character movement following method and device, storage medium and computer equipment


Publications (1)

Publication Number Publication Date
CN112107862A true CN112107862A (en) 2020-12-22

Family

ID=73800208





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination