CN112843682A - Data synchronization method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112843682A
Authority
CN
China
Prior art keywords
skill
server
target object
flying object
client
Prior art date
Legal status
Granted
Application number
CN202110243989.9A
Other languages
Chinese (zh)
Other versions
CN112843682B
Inventor
赵坤
王忆暄
杨宇宁
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110243989.9A
Publication of CN112843682A
Application granted
Publication of CN112843682B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/52: Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/573: Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F 13/822: Strategy games; Role-playing games
    • A63F 2300/807: Features of games using an electronically generated display, specially adapted for executing role playing or strategy games

Abstract

The application discloses a data synchronization method, apparatus, device and storage medium, and belongs to the field of data synchronization. The method comprises the following steps: receiving a skill release operation and sending a skill release request to a server, wherein the skill release request carries a skill identifier and the server creates a first flying object corresponding to the skill according to the skill identifier; receiving a first data packet sent by the server, wherein the first data packet is generated by the server according to the first flying object and comprises a first motion parameter of the first flying object; and calculating the display position of the first flying object according to the first motion parameter and displaying a first virtual environment picture according to that display position, the motion path of the first flying object being displayed on the first virtual environment picture. Because the client performs the logic operation for the first flying object independently, the picture displayed by the client is smoother while the movement of the first flying object is shown.

Description

Data synchronization method, device, equipment and storage medium
Technical Field
The present application relates to the field of data synchronization, and in particular, to a data synchronization method, apparatus, device, and storage medium.
Background
When users play online games, for example a massively multiplayer online role-playing game (MMORPG), a large number of users may participate in the same game online at the same time.
During a game, a controlled virtual character attacks a target object by using a skill, and a flying object corresponding to the skill (such as a dart or a light wave) tracks the target object, so that the hit target object loses part of its life value. The server achieves data synchronization among the game clients through state synchronization: during the movement of the flying object from the releasing side toward the target side, the server calculates the display position of the flying object in every frame according to its moving path and moving speed, and every time it calculates the display position for one frame it synchronizes that position to all game clients participating in the game; the game clients load and display the moving process of the flying object in this way.
In the above technical solution, a game client can load and display the corresponding game picture only after receiving the calculation result for each frame. When the target object keeps moving, the game client easily receives a lagging calculation result, so that the game picture showing the flying object's movement suffers a certain delay or stutter.
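For illustration only, the following sketch (not part of the patent text; the objects and methods such as `server_tick`, `projectile`, and `client.send` are hypothetical) shows the per-frame state-synchronization flow described above, in which the client can only draw the last position it has received:

```python
import math

def server_tick(projectile, target, dt, clients):
    # Prior-art flow: the server advances the flying object by one frame and
    # broadcasts its position; every frame costs one message per client.
    dx, dy = target.x - projectile.x, target.y - projectile.y
    dist = math.hypot(dx, dy) or 1e-6
    step = min(projectile.speed * dt, dist)
    projectile.x += dx / dist * step
    projectile.y += dy / dist * step
    for client in clients:
        client.send({"id": projectile.id, "x": projectile.x, "y": projectile.y})

def client_on_position(packet, screen):
    # The client cannot predict ahead; a late packet means a delayed or frozen picture.
    screen.draw_projectile(packet["id"], packet["x"], packet["y"])
```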
Disclosure of Invention
The embodiments of the application provide a data synchronization method, apparatus, device and storage medium. The client performs the logic operation for the first flying object independently, so that the picture displayed by the client is smoother while the movement of the first flying object is shown; meanwhile, the server performs a synchronous operation in parallel with the client to avoid errors in the skill release result. The technical scheme is as follows:
according to an aspect of the present application, there is provided a data synchronization method, the method including:
receiving a skill release operation, and sending a skill release request to a server, wherein the skill release request carries a skill identifier, and the server is used for creating a first flying object corresponding to the skill according to the skill identifier;
receiving a first data packet sent by the server, wherein the first data packet is generated by the server according to the first flying object and comprises a first motion parameter of the first flying object;
and calculating the display position of the first flying object according to the first motion parameter, and displaying a first virtual environment picture according to the display position of the first flying object, wherein the motion path of the first flying object is displayed on the first virtual environment picture.
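As a rough client-side sketch of the three steps above (a minimal illustration, not the patent's implementation; the transport object `net`, the message field names, and the injected `display_first_flying_object` routine are assumptions):

```python
def on_skill_release_operation(net, skill_id):
    # Step 1: the skill release request carries the skill identifier.
    net.send({"type": "skill_release_request", "skill_id": skill_id})

def on_server_message(message, display_first_flying_object):
    # Step 2: the first data packet carries the first motion parameter of the first flying object.
    if message["type"] == "first_data_packet":
        motion = message["first_motion_parameter"]
        # Step 3: compute the display positions locally and show the motion path.
        display_first_flying_object(start=motion["start_pos"],
                                    target=motion["target_pos"],
                                    speed=motion["speed"])
```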
According to another aspect of the present application, there is provided a data synchronization method, the method including:
receiving the skill release request sent by a first client, wherein the skill release request carries a skill identifier;
creating a first flying object corresponding to the skill according to the skill identifier;
generating a first data packet according to the first flying object, wherein the first data packet comprises a first motion parameter of the first flying object;
and sending the first data packet to at least two clients, wherein the at least two clients are used for respectively loading and displaying a first virtual environment picture according to the first data packet, and the at least two clients comprise the first client.
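A matching server-side sketch of these steps, again with assumed names (the skill table entry, the field names, and the positions passed in are placeholders, not taken from the patent):

```python
SKILL_TABLE = {"2021020815380001": {"flying_object": "fireball", "speed": 1.0}}

def on_skill_release_request(request, caster_pos, target_pos, clients):
    # Create the first flying object corresponding to the skill identifier.
    config = SKILL_TABLE[request["skill_id"]]
    first_motion_parameter = {"kind": config["flying_object"],
                              "speed": config["speed"],
                              "start_pos": caster_pos,
                              "target_pos": target_pos}
    first_data_packet = {"type": "first_data_packet",
                         "first_motion_parameter": first_motion_parameter}
    for client in clients:          # at least two clients, including the first client
        client.send(first_data_packet)
```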
According to another aspect of the present application, there is provided a data synchronization apparatus, the apparatus including:
a receiving module, configured to receive a skill release operation and send a skill release request to a server, wherein the skill release request carries a skill identifier, and the server is used for creating a first flying object corresponding to the skill according to the skill identifier;
the receiving module is configured to receive a first data packet sent by the server, where the first data packet is generated by the server according to the first flying object, and the first data packet includes a first motion parameter of the first flying object;
and the display module is used for calculating the display position of the first flying object according to the first motion parameter, and displaying a first virtual environment picture according to the display position of the first flying object, wherein the motion path of the first flying object is displayed on the first virtual environment picture.
According to an aspect of the present application, there is provided a data synchronization apparatus, the apparatus including:
the receiving module is used for receiving the skill release request sent by the first client, and the skill release request carries a skill identifier;
the creating module is used for creating a first flying object corresponding to the skill according to the skill identifier;
the processing module is used for generating a first data packet according to the first flying object, and the first data packet comprises a first motion parameter of the first flying object;
and the sending module is used for sending the first data packet to at least two clients, the at least two clients are used for respectively loading and displaying a first virtual environment picture according to the first data packet, and the at least two clients comprise the first client.
According to another aspect of the present application, there is provided a computer device comprising: a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by the processor to implement a data synchronization method as described above.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, set of codes, or set of instructions that is loaded and executed by a processor to implement the data synchronization method as described above.
According to another aspect of the application, a computer program product or computer program is provided, comprising computer instructions stored in a computer readable storage medium. The computer instructions are read from the computer-readable storage medium by a processor of a computer device, and the processor executes the computer instructions to cause the computer device to perform the data synchronization method as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the client receives the first motion parameter sent by the server to load and display the first flyer, the server does not need to calculate the operation result corresponding to the first flyer first, then the operation result is synchronized to each client, each client can load and display the virtual environment picture according to the motion parameter, even if the target object aimed by the flyer is dynamic, the client can load the corresponding virtual environment picture in time, and certain delay or blockage of the game picture of the first flyer in the moving process is avoided.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a state synchronization technique provided by an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of data synchronization provided by an exemplary embodiment of the present application;
FIG. 4 is a flow chart of a method of data synchronization provided by another exemplary embodiment of the present application;
FIG. 5 is a diagram of a virtual environment screen provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram illustrating logical operations of a client and a server according to an exemplary embodiment of the present application;
FIG. 7 is a flow chart of a method of data synchronization provided by another exemplary embodiment of the present application;
FIG. 8 is a diagram of a virtual environment screen provided by another exemplary embodiment of the present application;
FIG. 9 is a schematic diagram of a behavior tree corresponding to a flying object provided by an exemplary embodiment of the present application;
FIG. 10 is a block diagram of a data synchronization apparatus provided in an exemplary embodiment of the present application;
FIG. 11 is a block diagram of a data synchronization apparatus provided in another exemplary embodiment of the present application;
FIG. 12 is a block diagram of a computer device according to an exemplary embodiment of the present application;
fig. 13 is a schematic device structure diagram of a server according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are described:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment.
Controlled virtual character: a movable object in the virtual environment. The movable object can be a virtual character, a virtual animal, an anime character, and the like, such as a character, animal, plant, oil drum, wall, or stone displayed in a three-dimensional virtual environment. Optionally, the controlled virtual character is a three-dimensional volumetric model created based on animated skeletal techniques. Each controlled virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment. A controlled virtual character broadly refers to one or more virtual characters in the virtual environment. In the embodiments of the application, the controlled virtual character is the virtual character that releases skills in the virtual environment.
Skill: an ability possessed by the controlled virtual character in the virtual environment. After the controlled virtual character releases a skill, a certain effect is produced in the virtual environment, for example, the life value of the target object is reduced, or the life value of the controlled virtual character is protected from being reduced for a period of time. In some embodiments, the controlled virtual character may release a skill at any time; in other embodiments, the controlled virtual character releases a skill only after determining that the cooldown time (CD) of the skill is over, and can release the skill normally once the cooldown ends; in other embodiments, the controlled virtual character needs to consume some resource in exchange for the right to release the skill, for example, the controlled virtual character consumes mana to obtain the right to release skill A, and mana must be consumed each time skill A is released. The resource may be mana, magic power, energy, gold coins, and the like, which are not limited in the embodiments of the present application.
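A small sketch of the release conditions just described (cooldown finished, and enough resource such as mana); the field names and the use of wall-clock time are assumptions made for illustration only:

```python
import time

def can_release_skill(character, skill):
    # Condition 1: the cooldown time (CD) of the skill has ended.
    cooled_down = time.time() - character.last_cast.get(skill.id, 0.0) >= skill.cooldown
    # Condition 2: the character holds enough resource (e.g. mana) to exchange for the release right.
    has_resource = character.mana >= skill.mana_cost
    return cooled_down and has_resource
```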
Massively Multiplayer Online Role-Playing Game (MMORPG): a game in which a user controls a controlled virtual character to perform various activities in a virtual environment, where the controlled virtual character can simulate the activities and growth track of a human being in the real world. During the game, the controlled virtual character can interact with other controlled virtual characters in real time; some non-player characters (NPCs) also exist in the virtual environment, and NPCs generally carry out activities such as selling virtual articles and providing tasks in the virtual environment.
The method provided in the present application may be applied to a virtual reality application program, a three-dimensional map program, a military simulation program, a first-person shooting (FPS) game, a multiplayer online battle arena (MOBA) game, an MMORPG game, and the like. The following embodiments are exemplified by applications in games.
A game based on a virtual environment is composed of maps of one or more game worlds. The virtual environment in the game simulates a scene of the real world, and a user performs various activity events in the virtual environment by controlling a virtual character (referred to as a controlled virtual character for short). The activity events include but are not limited to: walking, running, jumping, shooting, fighting, driving, placing virtual props, using virtual props, attacking other virtual characters, being hurt by the virtual environment, being attacked by other virtual characters (such as other user-controlled virtual characters or NPCs), shopping, exploring, communicating with NPCs, learning from a mentor (i.e., training, such as strength training and intelligence training), establishing or joining a group (a group refers to a team formed by at least two controlled virtual characters in the game; a controlled virtual character can complete tasks together with other controlled virtual characters in the group or attack other groups, and character ranks can be set within the group), participating in games held in the virtual environment, raising pets (including animals, plants, and the like), cooking, making articles (including tools, furniture, virtual props, and the like), playing, establishing a marital relationship, raising a next-generation virtual character, and the like.
When controlled virtual characters fight in the virtual environment, the server needs to send the logic operations related to the fight to all game clients participating in the fight, so that all game clients take part in the game synchronously. In some cases, the network environment of some game clients is poor, and their game pictures are prone to stutter.
FIG. 1 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 has an application program supporting a virtual environment installed and running on it. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a first-person shooting (FPS) game, a MOBA game, a massively multiplayer online role-playing game (MMORPG), a multiplayer gunfight survival game, a battle-royale shooting game, a virtual reality (VR) application program, and an augmented reality (AR) program. The first terminal 120 is a terminal used by a first user, who uses the first terminal 120 to control a first virtual character located in the virtual environment to perform activities including, but not limited to: adjusting body posture, walking, running, jumping, riding, aiming, picking up items, releasing skills, attacking, and being attacked by other virtual characters in the virtual environment. Optionally, the first virtual character may be a virtual figure, a virtual animal, or a virtual plant, such as a simulated character or an anime character.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like. Illustratively, the server 140 includes a processor 144 and a memory 142, the memory 142 in turn including a receiving module 1421, a processing module 1422, and a transmitting module 1423. Illustratively, the receiving module 1421 is configured to receive the skill identifier sent by the client; the processing module 1422 is configured to create a first flying object corresponding to the skill according to the skill identifier, and determine a motion parameter of the first flying object, so as to generate a data packet corresponding to the first flying object; the sending module 1423 is configured to send a data packet corresponding to the first flying object to each client. The server 140 is also used to provide background services, such as information preservation services, for applications that support a three-dimensional virtual environment. Alternatively, the server 140 undertakes primary computational work and the first and second terminals 120, 160 undertake secondary computational work; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The server 140 may adopt a synchronization technology to make the screen representations of the multiple clients consistent, and the data synchronization method provided in the embodiment of the present application is implemented by using a state synchronization technology.
State synchronization techniques.
In an alternative embodiment based on fig. 1, the server 140 employs a state synchronization technique to synchronize with multiple clients. In the state synchronization technique, as shown in fig. 2, the combat logic runs in the server 140. When a state change occurs in a virtual role in the virtual environment, the server 140 sends the state synchronization result to all clients, such as clients 1 to 10.
In one example, client 1 sends a request to server 140 requesting virtual character 1 to attack virtual character 2, server 140 determines whether virtual character 1 can attack virtual character 2, and calculates the remaining life value of virtual character 2 when virtual character 1 performs an operation of attacking virtual character 2. Then, the server 140 sends the calculated remaining life value of the virtual character 2 to all the clients, and all the clients update the local data and the interface representation according to the remaining life value of the virtual character 2.
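The example above can be sketched as follows (an assumed, simplified check and broadcast; `server_allows_attack` is a placeholder for whatever validity checks the server applies, and none of the names come from the patent):

```python
def server_allows_attack(attacker, target):
    # Placeholder validity check (range, camp, line of sight, ...); not specified here.
    return True

def handle_attack_request(attacker, target, damage, clients):
    if not server_allows_attack(attacker, target):
        return
    target.life = max(0, target.life - damage)       # remaining life value of virtual character 2
    for client in clients:                           # every client updates local data and UI
        client.send({"type": "life_update", "target_id": target.id, "life": target.life})
```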
The second terminal 160 is connected to the server 140 through a wireless network or a wired network.
The second terminal 160 has an application program supporting a virtual environment installed and running on it. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, an MMORPG game, a multiplayer gunfight survival game, a battle-royale shooting game, a virtual reality (VR) application program, and an augmented reality (AR) program. The second terminal 160 is a terminal used by a second user, who uses the second terminal 160 to control a second virtual character located in the virtual environment to perform activities including, but not limited to: adjusting body posture, walking, running, jumping, riding, aiming, releasing skills, picking up items, attacking, and being attacked by other virtual characters in the virtual environment. Optionally, the second virtual character may be a virtual figure, a virtual animal, or a virtual plant, such as a simulated character or an anime character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first avatar character and the second avatar character may belong to the same team, the same organization, the same camp, have a friend relationship, or have temporary communication rights. Alternatively, the first avatar and the second avatar may belong to different camps, different teams, different organizations, or have a hostile relationship.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to another of the plurality of terminals; this embodiment is only illustrated with the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer. The following embodiments are illustrated with the terminal being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 3 is a flowchart illustrating a data synchronization method according to an exemplary embodiment of the present application, which may be applied to the first terminal 120 or the second terminal 160 in the computer system shown in fig. 1 or other terminals in the computer system. The method comprises the following steps:
step 301, receiving a skill release operation, and sending a skill release request to a server, where the skill release request carries a skill identifier, and the server is configured to create a first flyer corresponding to the skill according to the skill identifier.
An application program supporting a virtual environment runs on the terminal used by the user. When the user operates the application program, a picture of the application program in use is displayed on the display screen of the terminal. This picture is a virtual environment picture, obtained by observing the virtual environment from the first-person or third-person perspective of the controlled virtual character. Optionally, the virtual environment displayed by the virtual environment picture includes at least one of the following elements: mountains, flat ground, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
In this embodiment, taking a game application as an example, the controlled virtual character can execute various activity events in the virtual environment, such as formation with other controlled virtual characters to attack the enemy controlled virtual character, fight with NPC, and the like.
When the terminal used by the user is a terminal with a touch display screen, such as a smart phone and a tablet computer, the skill release operation comprises at least one of the following operations: click operations, double-click operations (including single-finger double-click and double-finger double-click), slide operations, drag operations, long-press operations, and combinations thereof.
When the terminal used by the user is a terminal connected with an external input device, such as a desktop computer or a notebook computer, the skill release operation is an operation generated by the external input device, for example, the user clicks the left mouse button to make the controlled virtual character release a skill, or the user presses the letter key 'E' on a keyboard to make the controlled virtual character release a skill.
Illustratively, a control is superimposed and displayed on the virtual environment screen, and the user controls the controlled virtual character to execute various activity events in the virtual environment by triggering the control, such as formation with other controlled virtual characters to attack the controlled virtual character of an enemy, fight with an NPC, and the like.
And the control comprises a skill release control, and the user clicks the skill release control to control the controlled virtual character to release the skill in the virtual environment. Illustratively, the skills include at least one of an offensive skill, a defensive skill, a compensatory skill, and other skills, the offensive skill being used to reduce a vital value of the target subject; the defense skill is used for avoiding the reduction of the life value of the controlled virtual character; compensatory skills are used to restore the vital value of the target object; other skills refer to skills specific to a certain type of controlled virtual character, such as stealth skills, seniority skills, and the like.
In the virtual environment, the controlled virtual character corresponds to a life value, the life value is used for simulating the survival time of the controlled virtual character in the virtual environment, namely the 'life' of the controlled virtual character, and when the life value is zero, the life of the controlled virtual character in the virtual environment is finished. The life value is usually represented by a progress bar of life values, which are displayed around the controlled virtual character, for example, the complete life value of each controlled virtual character in the virtual environment is 100.
In the virtual environment, the NPC also corresponds to a life value, which is used to simulate the survival time of the NPC in the virtual environment, i.e. the "life" of the NPC, and when the life value is zero, the life of the NPC in the virtual environment is ended, which is also usually represented by a progress bar of the life value, displayed in the surrounding area of the NPC.
In the virtual environment, some virtual articles also correspond to life values, such as stones, trees, and the like, and in some embodiments, the controlled virtual character obtains additional rewards, such as rare virtual props, gold coins, experience values, and the like, by reducing the life values of the virtual articles to zero.
When the user triggers the skill release control, the game application program sends a skill release request to the server, where the skill release request carries a user account (namely, the account the user plays the game with) and a skill identifier. On receiving the skill release request, the server determines the type of skill triggered by the user according to the skill identifier. Different skills correspond to different flying objects, a flying object being what the skill evolves into when it is released. The flying object may be a virtual prop; for example, when the controlled virtual character releases skill 1, skill 1 evolves into a dart (the first flying object) moving toward the target object. The flying object may also be a sound wave or a light wave; for example, when the controlled virtual character releases skill 2, skill 2 evolves into a light beam moving toward the target object.
The first flying object in the embodiments of the present application refers to the flying object generated when the skill is released.
Step 302, a first data packet sent by a server is received, where the first data packet is generated by the server according to a first flying object, and the first data packet includes a first motion parameter of the first flying object.
The server encapsulates the parameters related to the first flying object in the first data packet. The parameters related to the first flying object include at least one of attribute parameters (including shape, color, size, quantity, and the like), motion parameters (such as speed and direction), form parameters (such as layout), the target object, and the skill special effect (such as a highlight effect). The server looks up the related parameters according to the skill identifier and the stored correspondence.
Schematically, the correspondence among the skill identifier, the first flying object, and the related parameters is shown in Table 1.
Table 1
Skill identifier | Skill | First flying object | Moving speed | Target object
2021020815380001 | Skill A | Fireball | 1 m/s | NPC
(not listed) | Skill B | Light beam | 10 m/s | Controlled virtual character 1
The skill identifier 2021020815380001 denotes the 0001st skill released at 15:38 on February 8, 2021, namely skill A. The first flying object corresponding to skill A is a fireball: when the controlled virtual character releases skill A, a fireball is displayed on the virtual environment picture moving toward the target object (an NPC) at a speed of 1 m/s. When the controlled virtual character releases skill B, a light beam is displayed on the virtual environment picture moving toward controlled virtual character 1 at a speed of 10 m/s. The skill identifiers in the above table are only examples of numbering, and the embodiments of the present application do not limit the form of the skill identifier.
When sending the first data packet, the server sends the corresponding first data packet to each game client participating in the game according to its user account.
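One possible layout of the first data packet fields listed above, and of the per-account broadcast, is sketched below; the field names and serialization choice are assumptions for illustration, not fixed by the patent:

```python
from dataclasses import dataclass

@dataclass
class FirstDataPacket:
    skill_id: str
    kind: str                       # attribute parameter: type/shape of the first flying object
    count: int = 1                  # attribute parameter: number of flying objects
    speed: float = 1.0              # motion parameter
    direction: tuple = (1.0, 0.0)   # motion parameter
    target_id: str = ""             # target object
    effect: str = ""                # skill special effect, e.g. a highlight effect

def broadcast_first_packet(packet, accounts_to_clients):
    # Send the corresponding first data packet to each game client participating
    # in the game, looked up by user account.
    for account, client in accounts_to_clients.items():
        client.send(packet)
```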
Step 303, calculating a display position of the first flying object according to the first motion parameter, and displaying a first virtual environment picture according to the display position of the first flying object, wherein the first virtual environment picture displays a motion path of the first flying object.
After receiving the first data packet sent by the server, the game client loads and displays the first virtual environment picture according to the content of the first data packet. Illustratively, the first flying object corresponding to skill 1 is a dart whose speed is 1 m/s; a planar rectangular coordinate system is established with the controlled virtual character as the origin, the coordinates of the controlled virtual character are (0, 0), and the coordinates of the target position are (10, 0). The terminal calculates the display position of the dart in each frame (its coordinates in that frame) according to the moving speed of the dart and the time needed to fly to the target position, and connects the display positions of the successive frames to form the motion path of the first flying object in the first virtual environment picture. The terminal thus displays the following picture: the dart flies to the target position at a speed of 1 m/s, and the flight path of the dart is a straight line. In some embodiments, the dart returns to the controlled virtual character after flying to the target position.
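Reproducing the dart example above as a short sketch, under an assumed frame rate of 30 frames per second (the frame rate and function names are not specified by the patent):

```python
def frame_positions(start, target, speed, fps=30):
    # Per-frame display positions of the dart, computed locally by the client.
    (x0, y0), (x1, y1) = start, target
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    frames = max(1, int(dist / speed * fps))     # 10 m at 1 m/s -> 300 frames at 30 fps
    return [(x0 + (x1 - x0) * i / frames, y0 + (y1 - y0) * i / frames)
            for i in range(frames + 1)]

path = frame_positions(start=(0.0, 0.0), target=(10.0, 0.0), speed=1.0)
# Connecting these per-frame positions yields the straight flight path of the dart.
```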
In summary, in the method provided in this embodiment, the client receives the first motion parameter sent by the server and uses it to load and display the first flying object. The server does not need to first calculate the operation result for the first flying object and then synchronize that result to every client; each client can load and display the virtual environment picture from the motion parameter itself. Even if the target object aimed at by the first flying object is moving, the client can load the corresponding virtual environment picture in time, which avoids a certain delay or stutter in the game picture showing the first flying object's movement.
The moving speed of the flying object corresponding to a skill is generally high, and the flight distance of the flying object is the action range of the skill. The basic rule of the flying object's movement is: during the movement of the flying object, when the distance between a virtual character (or virtual article) and the flying object is smaller than a certain threshold, the virtual character (or virtual article) is considered to be hit by the skill and the hit effect is triggered. The motion track of the flying object falls into two types: moving toward a stationary target object, or moving toward a dynamic target object. Moving toward a stationary target object is relatively simple: after the server creates the flying object corresponding to the skill, it controls the flying object to keep moving toward the stationary target object and to damage the virtual characters (or virtual articles) within the designated range during the movement.
In contrast, the process in which a flying object tracks a dynamic target object is relatively complex. When a dynamic target object is tracked, its behavior is unpredictable and it can dodge in flexible and varied ways: it may run in a straight line, run back, or run in circles. If the server first calculates the position of the flying object and then sends it to the client, the picture the client displays for the skill's movement may, because of network delay and other reasons, not be smooth enough. In some cases, the higher the moving speed of the flying object, the more pronounced the stutter.
Fig. 4 shows a flowchart of a data synchronization method provided in another exemplary embodiment of the present application. The method may be applied in the first terminal 120 or the second terminal 160 in a computer system as shown in fig. 1 or in other terminals in the computer system. The method comprises the following steps:
step 401, receiving a skill release operation, and sending a skill release request to a server, where the skill release request carries a skill identifier, and the server is configured to create a first flyer corresponding to the skill according to the skill identifier.
Illustratively, a user uses a smart phone, a game application program runs on the smart phone, a skill release control is displayed in a game interface when the game application program runs, the user clicks the skill release control, and a terminal (the smart phone) sends a skill release request to a server.
Step 402, receiving a first data packet sent by a server, wherein the first data packet is generated by the server according to a first flying object, and the first data packet comprises a first motion parameter of the first flying object.
The server receives the skill release request sent by the first client, creates the first flying object corresponding to the skill according to the skill identifier, and generates a first data packet according to the parameters of the first flying object, the first data packet including the first motion parameter of the first flying object. After generating the first data packet, the server sends it to at least two clients (the clients participating in the game), such as the first client and a second client, and the first client and the second client each load and display the first virtual environment picture according to the first data packet. Illustratively, the number of first flying objects corresponding to the skill released by the controlled virtual character is 1.
Step 403, extracting a target position and a moving speed corresponding to the first flying object from the first motion parameter, wherein the target position is the position where a target object is located, the target object is an object located in the action range of the first flying object, and the target object corresponds to a first life value.
Illustratively, when releasing skills, a user will usually aim at a target object, the target object includes at least one of a virtual character and a virtual article, the virtual character can be a controlled virtual character controlled by the user or a non-user playing type character (NPC), and the target position of the target object is represented by coordinates. At this time, the life value of the target object is the first life value.
Step 404, calculating a motion path of the first flying object in the virtual environment according to the target position and the moving speed.
Illustratively, when determining the motion state of the first flying object in the virtual environment, the game application needs to take into account both the characteristics of the first flying object and factors in the virtual environment. For example, if the first flying object is a spherical prop, its motion path in the virtual environment is a parabola; for another example, if wind blows in the virtual environment and the first flying object moves against the wind, its motion path takes a winding, curved shape.
The motion path of the first flying object is formed by the position coordinates of all positions the first flying object passes through in the virtual environment during its movement. The game application converts each position coordinate into the corresponding display coordinate in the virtual environment picture, and connects the display coordinates to form the motion path of the first flying object.
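A sketch of step 404 for the spherical-prop case mentioned above: sample a parabolic trajectory and map each world position to a display coordinate. The parabola shape and the `camera.project` call are assumptions; a real engine would use its own projection.

```python
def parabolic_path(start, target, apex_height, samples=60):
    # Sample the parabola between the start and target positions.
    (x0, y0), (x1, y1) = start, target
    path = []
    for i in range(samples + 1):
        t = i / samples
        x = x0 + (x1 - x0) * t
        y = y0 + (y1 - y0) * t + apex_height * 4.0 * t * (1.0 - t)
        path.append((x, y))
    return path

def to_display(world_position, camera):
    # Hypothetical world-to-screen conversion used to build the displayed motion path.
    return camera.project(world_position)
```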
Step 405, determining the display position of the first flying object in each first picture frame according to the motion path.
The game application displays a first flyer in a first frame of each frame according to the motion path. The first picture frame is a picture frame constituting a first virtual environment picture.
Step 406, displaying the first virtual environment picture according to the display position of the first flying object.
And obtaining a first virtual environment picture according to the display position in each frame of the first picture frame, namely arranging a plurality of first picture frames on which the first flyer is displayed to obtain the first virtual environment picture. The first virtual environment picture is a picture including a moving process of the first flying object.
As shown in fig. 5, in a first picture frame at a certain moment during the flight of the first flying object, the skill is released by virtual character 1 and evolves into the first flying object 21. The first flying object 21 is a sword (the lines behind the sword illustrate a special-effect beam; the number of first flying objects 21 is 1), and it moves toward the target object 22. The game application calculates the display coordinates of the first flying object 21 in the first picture frame and displays the first flying object 21 in that frame according to the display coordinates. Illustratively, the target object 22 is a stationary target object.
Before the user controls the controlled virtual character to release the skill, the user can control the direction of the skill release, so as to control the moving direction of the first flyer in the virtual environment. In some embodiments, there may not be any target object in the direction of movement of the first flying object.
Step 407, in response to the first flying object hitting the target object, receiving the skill effect sent by the server, where the skill effect is generated by the server according to the received first prompt information, and the first prompt information is generated by the client corresponding to the target object in response to the target object being located within the action range of the first flying object.
The movement range of the first flying object is the action range of the skill. When the distance between the first flying object and the target object is smaller than the distance threshold, the game application determines that the first flying object hits the target object. The second client corresponding to the target object then sends the first prompt information to the server (the client that released the skill is the first client), the server generates the skill effect according to the first prompt information, and the server sends the skill effect to the first client and the second client respectively.
In response to the first distance being smaller than or equal to the second distance, the server determines that the first verification of the first prompt information passes, and generates the skill effect accordingly. The first distance is the distance between the first flying object and the target object calculated by the server, and the second distance is the distance between the first flying object and the target object calculated by the client corresponding to the target object. Illustratively, the first prompt information carries the second distance. The server calculates the first distance, compares it with the second distance carried in the first prompt information, and when the first distance is smaller than or equal to the second distance, determines that the first verification of the first prompt information passes.
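The first verification can be sketched as a single distance comparison (a minimal illustration; the positions are whatever the server tracks for the first flying object and the target object, and the function name is hypothetical):

```python
import math

def first_verification(server_flying_object_pos, server_target_pos, reported_second_distance):
    # First distance: computed by the server. Second distance: carried in the
    # first prompt information sent by the client corresponding to the target object.
    first_distance = math.dist(server_flying_object_pos, server_target_pos)
    return first_distance <= reported_second_distance   # pass -> generate the skill effect
```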
When the client calculates the game logic according to the first motion parameter and displays the virtual environment screen, the server also performs logic operation according to the first motion parameter, as shown in fig. 6.
The user triggers the skill release control and sends a skill release request to the server. After receiving the skill release request, the server generates a first data packet corresponding to the first flyer, wherein the first data packet comprises a motion parameter and an aiming target, and further comprises at least one of the following parameters: the type of the skill, the acting range of the skill, the type of the flyer corresponding to the skill and the number of the flyers. The server sends the first data packet to the client.
After receiving the first data packet, the client calculates the position and moving direction of the first flying object in each frame according to the motion parameter in the first data packet, so that the multiple picture frames make up the virtual environment picture corresponding to the movement of the first flying object. The virtual environment picture shows that controlled virtual character A releases skill 1, the first flying object corresponding to skill 1 moves from point A to point B, and the moving speed of the first flying object is v.
As shown in fig. 6, a first flying object is represented by a bullet, after the bullet 11 is created at the server, parameters related to the bullet 11 are sent to the client, and the client independently loads a screen when the bullet 11 moves in the virtual environment. And the server performs synchronous logic operation on the bullet 11 in a state synchronization mode, and calculates the moving track and the moving speed of the bullet 11 in the virtual environment.
When the distance between the bullet 11 and a virtual character is smaller than the distance threshold, the virtual character is within the action range of the bullet 11. Illustratively, controlled virtual character b (the target object 12) is within the action range of the bullet 11. The client corresponding to controlled virtual character b sends an injury acquisition request to the server, the injury acquisition request carrying the user account controlling controlled virtual character b. The server calculates the reduction of the life value of controlled virtual character b according to controlled virtual character b's position within the action range of the bullet 11, and determines the effect produced within that range. For example, if controlled virtual character b is close to the bullet 11, controlled virtual character b is knocked to a far position within the action range of the bullet 11 and the life value is reduced by 50; for another example, if controlled virtual character b is far from the bullet 11, its position does not change within the action range of the bullet 11 and the life value is reduced by 10.
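The distance-dependent effect in this example might be expressed as below; the values 50 and 10 come from the example above, while the cut-off for "close" is an assumed fraction of the action range:

```python
def life_reduction(distance_to_bullet, action_range, near_fraction=0.3):
    if distance_to_bullet > action_range:
        return 0                                   # outside the action range: no effect
    if distance_to_bullet <= action_range * near_fraction:
        return 50                                  # close to the bullet: knocked back, life -50
    return 10                                      # far from the bullet: position unchanged, life -10
```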
The client receives the skill effect sent by the server after the first distance between the first flying object and the target object is smaller than or equal to the first distance threshold and the server's first verification of the first prompt information passes. The first verification means that, after receiving the first prompt information, the server determines that the first distance is smaller than or equal to the second distance. The first distance is the distance between the first flying object and the target object calculated by the server, and the second distance is the distance between the first flying object and the target object calculated by the client corresponding to the target object.
In some embodiments, the server verifies the first prompt message sent by the client corresponding to the target object, and the verification process is as follows: the server calculates a first distance between the target object and the first flying object according to a first data packet corresponding to the first flying object, the first prompt information carries a second distance between the target object and the first flying object calculated by the client corresponding to the target object, the server compares the first distance with the second distance, and the server verifies that the first prompt information is passed in response to the first distance being smaller than or equal to the second distance. Namely, the server judges that the first flying object hits the target object, thereby generating the skill effect.
In some embodiments, in response to the first flyer hitting the target object, the client corresponding to the target object sends first prompt information to the server, and the server calculates the skill effect of the first flyer according to the first prompt information, wherein the first prompt information includes the type of the target object.
Illustratively, the first flyer triggers a corresponding skill effect according to the type of the target object.
For example, if the target object is a stationary target object, the server calculates the skill effect of the first flying object as a first skill effect: the first flying object moves toward the stationary target object along the straight line between the flying object and the stationary target object. Since the target object is stationary, the server determines the flight path of the first flying object as follows: the server determines the lowest flight height of the first flying object according to the highest obstacle on the way, and the first flying object moves toward the stationary target object while staying no lower than that lowest flight height.
For another example, if the target object is a dynamic target object, the server calculates the skill effect of the first flying object as a second skill effect: the first flying object follows the dynamic target object. Since the dynamic target object is constantly moving, the server determines the flight path of the first flying object as follows: during the movement of the dynamic target object, the server predicts, according to the surrounding environment, the track the dynamic target object may take and the target position it may reach, and the first flying object moves to that target position so as to intercept the dynamic target object.
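A rough sketch of the two cases: fly straight at a stationary target, or predict where a dynamic target will be and fly to that interception point. The constant-velocity prediction used here is an assumed simplification; the patent only states that the server pre-judges the possible track from the surrounding environment.

```python
def plan_flight_target(flying_object, target):
    tx, ty = target.position
    if target.velocity == (0.0, 0.0):
        return (tx, ty)                            # first skill effect: stationary target object
    # Second skill effect: lead the dynamic target object by its velocity over the flight time.
    vx, vy = target.velocity
    fx, fy = flying_object.position
    flight_time = ((tx - fx) ** 2 + (ty - fy) ** 2) ** 0.5 / flying_object.speed
    return (tx + vx * flight_time, ty + vy * flight_time)
```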
The client and the server each monitor the first flying object: the client independently loads the picture of the first flying object's movement according to the data packet sent by the server, while the server also calculates the movement of the first flying object. This avoids the risks of client-side operation errors and cheating, and ensures that the client's picture achieves an effect close to that of the frame synchronization technique.
Step 408, displaying a second picture frame according to the skill effect, wherein the second picture frame displays a second life value of the target object, the second life value is smaller than the first life value in the first picture frame, and the second picture frame is one of the picture frames of the first virtual environment picture.
When the target object is hit, the client corresponding to the target object sends the first prompt information to the server. The server determines from the first prompt information that the target object is hit and calculates the skill effect acting on the target object, where the skill effect includes the presentation of the collision between the skill and the target object and the reduction of the target object's life value. The server sends the skill effect to the clients participating in the game, and those clients display the skill effect in the second picture frame. The first picture frame is a frame corresponding to the flight of the first flying object, in which the life value of the target object (the first life value) is unchanged; the second picture frame is a frame corresponding to the target object being hit, in which the life value of the target object (the second life value) is reduced compared with the first picture frame.
In summary, in the method provided in this embodiment, the client receives the first motion parameter sent by the server to load and display the first flying object. The server does not need to first calculate the operation result corresponding to the first flying object and then synchronize that result to each client; instead, each client can load and display the virtual environment picture according to the motion parameter. Even if the target object aimed at by the first flying object is dynamic, the client can load the corresponding virtual environment picture in time, so a delay or pause in the game picture during the movement of the first flying object is avoided.
In the method of this embodiment, the client extracts the target position and moving speed corresponding to the first flying object from the first motion parameter and accurately calculates the motion path of the first flying object, so the client can accurately and smoothly display the first virtual environment picture, and delay or stutter of the first flying object during movement is avoided.
In the method, the server calculates the skill effect after the target object is hit by the first flying object, so the accuracy of the skill effect is ensured while the client displays a smooth virtual environment picture, and a data synchronization effect similar to frame synchronization is achieved under a state synchronization mechanism through this double-safeguard approach.
According to the method, the skill effect is generated after the first prompt information is verified through the server, and the accuracy of the skill hit result is guaranteed.
It can be understood that, when the skill is released, the server sends the motion parameters of the first flying object to each client participating in the game, and each client loads the virtual environment picture according to the motion parameters; when the first flying object hits the target object, the server sends the skill effect to the clients participating in the game.
In some embodiments, certain skills, when released, evolve into a second flying object, and the second flying object includes at least two sub-flying objects.
Fig. 7 shows a flowchart of a data synchronization method according to another exemplary embodiment of the present application. The method may be applied in the first terminal 120 or the second terminal 160 in a computer system as shown in fig. 1 or in other terminals in the computer system. The method comprises the following steps:
step 701, receiving skill release operation, and sending a skill release request to a server, where the skill release request carries a skill identifier, and the server is configured to create a second flyer corresponding to the skill according to the skill identifier.
Illustratively, a user uses a desktop computer connected with a mouse, a game application program runs in the desktop computer, the desktop computer displays a game interface of the game application program during running, the game interface comprises a skill release control, and when the user clicks the skill release control by using the mouse, the desktop computer sends a skill release request to the server. And the server creates a second flyer according to the skill identification carried by the skill release request. The number of the first flyers is different from the number of the second flyers, and the second flyers include at least two sub-flyers, that is, the skills released by the controlled virtual character can evolve into at least two sub-flyers.
Step 702, receiving a second data packet sent by the server, where the second data packet is generated by the server according to a second flying object, and the second data packet includes second motion parameters corresponding to each sub-flying object.
The server encapsulates the parameters related to the second flying object in a second data packet, wherein the parameters related to the second flying object comprise at least one of attribute parameters (including shape, color, size, number and the like), motion parameters (such as speed, direction and the like), form parameters (such as layout and the like), target objects and skill special effects (such as highlight special effects). And the server searches related parameters according to the skill identification and the corresponding relation.
Schematically, the corresponding relation among the skill identification, the second flyer and the related parameters is represented by table two.
Table Two

Skill identifier | Skill | Second flying object | Number | Layout | Speed | Attack target
2021021815380001 | Skill C | Fan | 5 | From the caster toward the 5 vertices of a pentagon | 1 m/s | NPC
(not given) | Skill D | Dart | 10 | Distributed in a circle, moving in diverging directions | 10 m/s | Controlled virtual character 1
The skill identifier 2021021815380001 indicates the 0001st skill released at 15:38 on 18 February 2021, which is skill C. The second flying object corresponding to skill C is a fan: when the controlled virtual character releases skill C, 5 fans are displayed on the virtual environment screen, flying from the controlled virtual character toward the directions indicated by the 5 vertices of a pentagon at a speed of 1 m/s, and the attack target is an NPC. When the controlled virtual character releases skill D, 10 darts are displayed on the virtual environment screen; the 10 darts are distributed in a circle and move outward in diverging directions at a speed of 10 m/s, and one of the darts moves toward controlled virtual character 1. The skill identifiers in the above table are only numbers; the embodiments of the present application do not limit the type of the skill identifier.
When sending the second data packet, the server sends it to the game clients participating in the game according to their user accounts.
Step 703: calculating the display position of each sub-flying object according to the second motion parameter, and displaying a second virtual environment picture according to the display position of each sub-flying object, wherein the second virtual environment picture displays the motion path of each sub-flying object.
After receiving the second data packet sent by the server, the game client loads and displays the second virtual environment picture according to the content in the second data packet. Illustratively, the second flying object corresponding to skill 2 is a fan, the moving speed of the fan is 1 m/s, a planar rectangular coordinate system is established with the controlled virtual character as the origin, the coordinates of the controlled virtual character are (0, 0), and the coordinates of the target position are (20, 0). The terminal calculates the display position of each fan on each picture frame (its coordinates in that frame) according to the speed and flight time of each fan, and links the display positions across the picture frames to form the flight path of the second flying object in the second virtual environment picture. Thus, the terminal displays the following picture: each fan moves radially at a speed of 1 m/s away from the controlled virtual character that released the skill, and one of them moves toward the target position. In some embodiments, the moving speed of each fan is different, and so is its display position on the picture frame.
The above step 703 may be replaced by the following steps:
step 7031, the number, the target position, and the moving speed corresponding to the sub-flying object are extracted from the second motion parameter, where the target position is a position where the target object is located, the target object is an object located within an action range of the second flying object, and the target object corresponds to the first life value.
When the server creates the second flying object, it sends the parameters into which the second flying object can be split, such as the number, the layout, and the time characteristic, to the clients. The clients split the second flying object according to these parameters, and because the parameters carry a time characteristic, every client performs the logical computation of the second flying object over the same time period.
Illustratively, the user usually aims at a target object in releasing skill, the target object comprises at least one of a virtual character and a virtual article, the virtual character can be a controlled virtual character controlled by the user or an NPC, and the target position of the target object is represented by coordinates. At this time, the life value of the target object is the first life value.
Step 7032: calculating the motion path of each sub-flying object in the virtual environment according to the number, the target position and the moving speed.
Illustratively, the game application determines the motion state of the second flying object in the virtual environment according to the characteristics of the second flying object and the factors in the virtual environment. For example, the second flying object is a fan: 5 fans move radially in the virtual environment away from the controlled virtual character that released the skill, and when the wind speed and wind direction differ in different directions, the motion trajectories and motion speeds of the 5 fans differ accordingly.
The motion path of the second flying object is formed by the position coordinates of each position of the second flying object moving to the virtual environment in the motion process, the game application program converts each position coordinate into the corresponding display coordinate in the virtual environment picture, and the display coordinates are connected to form the motion path of the second flying object.
Step 7033: determining the display position of each sub-flying object in the third picture frame of each frame according to the motion path.
The game application displays each sub-flying object in each frame of a third picture frame according to the motion path of each sub-flying object, the third picture frame is a picture frame forming a second virtual environment picture, and the second virtual environment picture is a picture including the moving process of each sub-flying object.
Step 7034, a second virtual environment picture is obtained according to the display position in the third picture frame of each frame.
The plurality of third picture frames displaying the sub-flying objects are arranged to obtain the second virtual environment picture.
As shown in fig. 8, in a third picture frame of a certain frame while the second flying object is flying, the virtual character 23 releases a skill, and the skill evolves into a second flying object that includes a plurality of sub-flying objects 24, such as a first sub-flying object 24a, a second sub-flying object 24b, and a third sub-flying object 24c. The plurality of sub-flying objects 24 are swords (the lines behind the swords schematically represent special-effect light beams, and the number of sub-flying objects shown in the figure is only schematic), and the sub-flying object 24a moves toward the target object 22. The game application calculates the display coordinates of each sub-flying object 24 in the third picture frame and displays each sub-flying object 24 in the third picture frame according to the display coordinates. Illustratively, the target object 22 is a dynamic target object.
Before the user controls the controlled virtual character to release the skill, the user can control the direction of the skill release, thereby controlling the moving direction of the first flying object in the virtual environment. In some embodiments, there may not be any target object in the moving direction of the first flying object.
Step 704: in response to the sub-flying object hitting the target object, receiving a sub-skill effect sent by the server, where the sub-skill effect is generated by the server according to the received second prompt information, and the second prompt information is information generated by the client corresponding to the target object in response to the target object being located within the range of action of the sub-flying object.
The movement range of the sub-flying object is the range of action of the skill; when the distance between the sub-flying object and the target object is less than the distance threshold, the game application determines that the sub-flying object hits the target object. Each sub-flying object has its own motion parameters, the sub-flying objects do not interfere with each other, and each can track the target object independently. The second client corresponding to the target object sends the second prompt information to the server (the client releasing the skill is the first client), the server generates a sub-skill effect according to the second prompt information, and then sends the sub-skill effect to the first client and the second client respectively.
In response to the third distance being smaller than or equal to the fourth distance, the server determines that the second verification of the second prompt information passes, and generates the sub-skill effect accordingly. The third distance is the distance between the sub-flying object and the target object calculated by the server, and the fourth distance is the distance between the sub-flying object and the target object calculated by the client corresponding to the target object. Illustratively, the second prompt information carries the fourth distance. That is, the server calculates the third distance, compares it with the fourth distance carried in the second prompt information, and determines that the second verification of the second prompt information passes when the third distance is less than or equal to the fourth distance.
The sub-skill effect sent by the server is received after the distance between the sub-flying object and the target object is smaller than or equal to the second distance threshold and the server's second verification of the second prompt information passes. The second verification passing means that, after receiving the second prompt information, the server determines that the third distance is smaller than or equal to the fourth distance, where the third distance is the distance between the sub-flying object and the target object calculated by the server, and the fourth distance is the distance between the sub-flying object and the target object calculated by the client corresponding to the target object.
In some embodiments, the server verifies the second prompt information sent by the client corresponding to the target object, and the verification process is as follows: the server calculates the third distance between the target object and the sub-flying object according to the data packet corresponding to the second flying object (when the second flying object has not been decomposed, the third distance is the distance between the second flying object and the target object); the second prompt information carries the fourth distance between the target object and the sub-flying object calculated by the client corresponding to the target object; the server compares the third distance with the fourth distance, and in response to the third distance being smaller than or equal to the fourth distance, the server passes the verification of the second prompt information. That is, the server determines that the sub-flying object (or the second flying object) hits the target object, and thereby generates the skill effect.
In some embodiments, in response to the second flying object hitting the target object, the client corresponding to the target object sends second prompt information to the server, and the server calculates the skill effect of the second flying object according to the second prompt information, where the second prompt information includes the type of the target object.
Illustratively, the second flyer triggers a corresponding skill effect according to the type of the target object.
For example, if the target object is a static target object, the server calculates the skill effect corresponding to the second flying object as a third skill effect, which is: the second flying object moves toward the stationary target object in an ejected (catapult) manner. Ejection means that the second flying object gains a larger speed when it flies out, so that it covers a longer distance in a short time; during the movement, the second flying object gradually changes from the ejection speed to the normal flying speed. Illustratively, the ejection speed of the second flying object is configured in advance, or is determined according to the virtual environment where the second flying object is located; for example, if there are more obstacles around the second flying object, the ejection speed is greater. Illustratively, the ejection speed is constant or dynamically variable. In one example, the second flying object flies at a first ejection speed while moving toward the stationary target object, and flies at a second ejection speed toward the stationary target object in response to the distance between the second flying object and the stationary target object being less than a set distance. When the second flying object moves toward the stationary target object, it is not decomposed into a plurality of sub-flying objects.
As another example, if the target object is a dynamic target object, the server calculates the skill effect of the second flying object as a fourth skill effect, which is: the second flying object follows the dynamic target object and, in response to the dynamic target object changing the direction of its moving path, the second flying object is decomposed according to the direction of the moving path. While the dynamic target object moves, the server predicts its likely movement track and the target position it may reach according to the surrounding environment, and the second flying object is decomposed according to the movement track of the dynamic target object. In one example, the dynamic target object moves along a straight line in the virtual environment and the second flying object follows it along that line; when the dynamic target object changes its moving route (for example, suddenly turning left onto another route), the second flying object is decomposed into two sub-flying objects, one of which moves toward the direction taken after the route change.
Step 705, a fourth frame is displayed according to the sub-skill effect, the fourth frame displays a second life value of the target object, the second life value is smaller than the first life value in the third frame, and the fourth frame is a subset frame of the second virtual environment frame.
When the sub-flying object hits the target object, the client corresponding to the target object sends second prompt information to the server, where the second prompt information includes the damage logic of the second flying object; because the parameters carry a time characteristic, every client performs the logical computation of the second flying object over the same time period, so the client corresponding to the target object reports the damage logic corresponding to the second flying object to the server. The server calculates the sub-skill effect according to the damage logic of the second flying object and sends the sub-skill effect to the clients participating in the game, and the clients participating in the game display the skill effect in the fourth picture frame. The third picture frame is a picture frame corresponding to the flight process of the second flying object, the life value (first life value) of the target object in the third picture frame is unchanged, and the life value (second life value) of the target object in the fourth picture frame is reduced compared with the third picture frame.
In summary, in the method provided in this embodiment, the client receives the second motion parameter sent by the server to load and display the second flying object. The server does not need to first calculate the operation result corresponding to the second flying object and then synchronize that result to each client; instead, each client can load and display the virtual environment picture according to the second motion parameter. Even if the target object aimed at by the second flying object is dynamic, the client can load the corresponding virtual environment picture in time, so a delay or pause in the game picture during the movement of the second flying object is avoided.
In the method of this embodiment, the client extracts the number, target position, and moving speed of the sub-flying objects from the second motion parameter and accurately calculates the motion path of each sub-flying object, so the client can accurately and smoothly display the second virtual environment picture, and delay or stutter of each sub-flying object during movement is avoided.
In the method, the server calculates the skill effect after the target object is hit by the sub-flying object, so the accuracy of the skill effect is ensured while the client displays a smooth virtual environment picture, and a data synchronization effect similar to frame synchronization is achieved under a state synchronization mechanism through this double-safeguard approach.
According to the method, the server verifies the second prompt information to generate the sub-skill effect, and therefore the skill hit result is more accurate.
It can be understood that, when the skill is released, the server sends the motion parameters of the second flying object to each client participating in the game, and each client loads the virtual environment picture according to the motion parameters; when the second flying object hits the target object, the server sends the skill effect to the clients participating in the game.
It should be noted that, when the user controls the virtual character to attack an NPC in the virtual environment, the server determines the hit result, because the NPC has no corresponding client. The server determines that the flying object hits the NPC in response to the distance between the flying object and the NPC being less than or equal to the distance threshold; or, the server determines that the NPC is hit according to prompt information reported by a client, taking as the basis the first client, among the clients corresponding to controlled virtual characters located near the NPC, to report to the server.
The embodiments of the present application control the movement path of the flying object in the virtual environment through a behavior tree (BT): a behavior tree is a tree structure containing hierarchical nodes used to control the decision-making behavior of a target. The server creates a first behavior tree corresponding to the first flying object, where the first behavior tree is used to control the first flying object to move in the virtual environment; the server also creates a second behavior tree corresponding to the second flying object, where the second behavior tree is used to control the second flying object to move in the virtual environment.
As shown in fig. 9, by combining selection, sequence, and condition nodes with the interface events of the flying object, the behavior tree implements various behaviors, including timed acceleration and deceleration, reselecting a target object on a hit, splitting the flying object after a hit, and automatically destroying the flying object when it reaches its destination. The functions of the flying object are integrated into events that can be invoked directly. This embodiment is explained with the flying object being a bullet, and specifically includes the following events:
1. Parameter notification event (Bullet_Notify): sends the parameterized information and target information of the flying object to the client; after receiving the notification, the client controls the movement of the flying object.
2. Timer event (Timer): drives the behavior of the flying object as a condition, deciding which event the flying object performs after how long. The timer has two modes, single and cyclic: an event triggered after a single timing is executed only once, while an event triggered after a cyclic timing is executed cyclically.
3. Skill invocation event (Bullet_Impact): invokes a skill effect, that is, the actual effect generated after the controlled virtual character releases a skill on the target object, such as reducing the life value of a virtual character (or virtual article), enhancing a gain effect of the flying object, creating a new flying object, and the like.
4. Summon flyer event (Bullet_Summon): one flying object can be decomposed into a plurality of sub-flying objects; for example, when the skill is released, one flying object is decomposed into a plurality of sub-flying objects to attack a target object; for another example, when the flying object hits a target object, the effect of the sub-flying objects breaking apart is simulated, and the flying object is decomposed into a plurality of sub-flying objects. In one example, the bullet breaks into two segments after hitting the target object.
5. Reselect target event (Bullet_Reset): when preset conditions are met, the flying object can select a new target object to realize the ejection function, that is, after hitting one target object, the flying object can track other target objects.
6. Speed correction event (Bullet_Buff): dynamically corrects the moving speed of the flying object; combined with a cyclic timer event, the flying object can achieve the effect of gradual acceleration or gradual deceleration.
7. Delete event (Bullet_Die): deletes the flying object. In some embodiments, when a flying object has hit a target object, the flying object becomes invalid and the server invokes the delete event to delete it; in other embodiments, the second flying object is split into a plurality of sub-flying objects, and the server invokes the delete event to delete the second flying object.
It should be noted that, when designing the behavior tree, it is necessary to take into account that the flying object moves relatively fast, so the calling frequency of its behavior tree is doubled compared with that of an NPC, making the response and processing of the flying object more timely and the experience more realistic.
Taking the flying object as a bullet, the flow shown in fig. 9 is as follows:
When the controlled virtual character releases a skill, the bullet function is triggered, starting from the selection node 900. The flow enters the count limiting node 901 (the number "1" represents the counting limit) and then the first sequence node 902a, after which three timers are initialized; when the initialization succeeds, a timer-initialization-success result is returned to the selection node 900 and the notification node 903 is entered, where the notification node 903 is used to make the server notify the client to start judging, from 0.5 seconds later, whether the bullet's damage is received. When the client has not received the bullet's damage, a no-damage result (i.e. false) is returned to the first sequence node 902a, and the acceleration node 904 is entered from the first sequence node 902a; the acceleration node 904 is used to make the server notify the client that acceleration starts after two seconds. When the preset time has not been reached, false is returned to the first sequence node 902a, and the time interval node 905 is entered from the first sequence node 902a; it is used to make the server accelerate every 2 seconds starting from the 3rd second.
From the selection node 900, the count limiting node 901 is entered again, and since the limit number is 1, a count failure result (i.e., false) is returned to the first sequence node 902a. The selection node 900 then enters the post-hit operation process: if the bullet misses the target object, the determination node 906 returns a miss result (false) to the second sequence node 902b, and the third sequence node 902c enters the sequence corresponding to the hit operation and further enters the bullet damage effect node 907. The bullet damage effect node 907 is used to indicate that the bullet inflicts damage with damage identifier 11095405 on the target object; schematically, the damage effect corresponding to this damage identifier reduces the target object's life value by 20. The bullet splitting effect node 908 is used to represent the effect of splitting one bullet into n bullets (n being an integer greater than 1). The bullet deletion node 909 is used to indicate that, after the bullet is split into n bullets, the original (unsplit) bullet becomes invalid and the n bullets remain.
It should be noted that control returns to the selection node 900 when event execution succeeds, and to the sequence node 901 when event execution fails. Different sequence nodes are entered continuously according to this rule. When the fourth sequence node 902d is entered, the damage judgment node 910 is entered first; the damage judgment node 910 is used to detect whether the client starts judging damage after the specified timing, and if the client does not start judging damage after the specified timing, false is returned to the fourth sequence node 902d. Then the bullet notification node 911 is entered; the bullet notification node 911 is used to indicate that the bullet radius is 1500 and the bullet can fire again every 1 second. The stop-timing node 912 is used to make the server notify the client to stop executing the event.
When the fifth sequence node 902e is entered, the acceleration judgment node 913 is entered first; the acceleration judgment node 913 is used to detect whether the client starts accelerating after the specified time, and if the client does not start accelerating after the specified time, false is returned to the fifth sequence node 902e. Then the interval acceleration judgment node 914 is entered; the interval acceleration judgment node 914 is used to detect whether the client starts accelerating after the preset time interval. The bullet gain effect node 915 is used to indicate the gain effect brought by the bullet.
When the first sequence node 902a, the second sequence node 902b, the fourth sequence node 902d, and the fifth sequence node 902e all return false to the selection node 900, the bullet movement arrival node 916 is entered; node 916 is used to indicate that the bullet has moved to the target object.
The above embodiments describe the above method based on the application scenario of the game, and the following describes the above method by way of example in the application scenario of military simulation.
The simulation technology is a model technology which reflects system behaviors or processes by simulating real world experiments by using software and hardware.
The military simulation program is a program specially constructed for military application by using a simulation technology, and is used for carrying out quantitative analysis on sea, land, air and other operational elements, weapon equipment performance, operational actions and the like, further accurately simulating a battlefield environment, presenting a battlefield situation and realizing the evaluation of an operational system and the assistance of decision making.
In one example, soldiers establish a virtual battlefield at a terminal where military simulation programs are located and fight in a team. The soldier controls a virtual object in the virtual battlefield environment to perform at least one operation of standing, squatting, sitting, lying on the back, lying on the stomach, lying on the side, walking, running, climbing, driving, shooting, throwing, being injured, reconnaissance, close-up combat and the like in the virtual battlefield environment. The battlefield virtual environment comprises: at least one natural form of flat ground, mountains, plateaus, basins, deserts, rivers, lakes, oceans and vegetation, and site forms of buildings, vehicles, ruins, training fields and the like. The virtual object includes: virtual characters, virtual animals, cartoon characters, etc., each virtual object having its own shape and volume in the three-dimensional virtual environment occupies a part of the space in the three-dimensional virtual environment.
Based on the above, in one example, soldier a controls virtual object a and soldier B controls virtual object B, and soldier a and soldier B are not soldiers in the same team.
Illustratively, soldier A controls virtual object a to attack virtual object b by using skill 1, and when skill 1 is released, a first client corresponding to soldier A sends a skill release request to a server, wherein the skill release request carries a skill identifier of skill 1. The server creates a first flyer according to the skill identification of the skill 1, generates a first data packet corresponding to the first flyer, wherein the first data packet comprises a first motion parameter of the first flyer, and sends the first data packet to a first client and a second client corresponding to the soldier B. The first client and the second client respectively display a first virtual environment picture according to the first data packet, the first virtual environment picture is a picture of the first flying object in the moving process to the virtual object b, and the life value of the virtual object b in the first virtual environment picture is a first life value. The second client detects the distance between the first flyer and the virtual object b at preset time intervals, when the distance is smaller than a distance threshold value, the second client determines that the first flyer hits the virtual object b, and sends first prompt information to the server, wherein the first prompt information is generated by the second client according to the fact that the virtual object b is hit by the first flyer. After receiving the first prompt message, the server calculates a skill effect corresponding to the first flyer, such as a special effect when the first flyer hits the virtual object b, a reduction value of a life value of the virtual object b after the virtual object b is hit, and the like. And the server sends the calculated result to the first client and the second client, and second virtual environment pictures are respectively displayed on the first client and the second client, wherein the second virtual environment pictures display that the life value of the virtual object b is a second life value which is smaller than the first life value.
Illustratively, soldier A controls virtual object a to attack virtual object b by using skill 2, and when skill 2 is released, a first client corresponding to soldier A sends a skill release request to a server, wherein the skill release request carries a skill identifier of skill 2. The server creates a second flying object according to the skill identifier of skill 2, generates a second data packet corresponding to the second flying object, wherein the second data packet comprises a second motion parameter of the second flying object, and sends the second data packet to the first client and a second client corresponding to soldier B. The first client and the second client respectively load and display a third virtual environment picture according to the second data packet, wherein the third virtual environment picture is a picture of a plurality of sub-flying objects moving in the virtual environment, one sub-flying object moves to the virtual object b, and the sub-flying objects are obtained by the client splitting the second flying object according to the second data packet. The life value of the virtual object b in the third virtual environment picture is the first life value. The second client detects the distance between the sub-flying object and the virtual object b at preset time intervals (the target position of the sub-flying object is the position of the virtual object b); when the distance is smaller than a distance threshold, the second client determines that the sub-flying object hits the virtual object b and sends second prompt information to the server, wherein the second prompt information is generated by the second client in response to the virtual object b being hit by the sub-flying object. After receiving the second prompt information, the server calculates a skill effect corresponding to the second flying object, such as a special effect when the second flying object hits the virtual object b, a reduction value of the life value of the virtual object b after it is hit, and the like. The server sends the calculated result to the first client and the second client, and a fourth virtual environment picture is respectively displayed on the first client and the second client, wherein the fourth virtual environment picture displays that the life value of the virtual object b is a second life value, and the second life value is smaller than the first life value.
In summary, in this embodiment, the data synchronization method is applied to a military simulation program, so that the battle pictures displayed on the clients are smoother, and the server performs the logical operations synchronously to ensure the accuracy of the operation results, enabling soldiers to carry out actual-combat simulation training more effectively.
The following are embodiments of the apparatus of the present application, and for details that are not described in detail in the embodiments of the apparatus, reference may be made to corresponding descriptions in the above method embodiments, and details are not described herein again.
Fig. 10 is a schematic structural diagram illustrating a data synchronization apparatus according to an exemplary embodiment of the present application. The apparatus can be implemented as all or a part of a terminal by software, hardware or a combination of both, and includes:
the first receiving module 1010 is used for receiving skill release operation and sending a skill release request to a server, wherein the skill release request carries a skill identifier, and the server is used for creating a first flyer corresponding to the skill according to the skill identifier;
the first receiving module 1010 is configured to receive a first data packet sent by a server, where the first data packet is generated by the server according to a first flying object, and the first data packet includes a first motion parameter of the first flying object;
the display module 1020 is configured to calculate a display position of the first flying object according to the first motion parameter, and display a first virtual environment picture according to the display position of the first flying object, where the first virtual environment picture displays a motion path of the first flying object.
In an alternative embodiment, the apparatus includes a first processing module 1030;
the first processing module 1030 is configured to extract a target position and a moving speed corresponding to the first flying object from the first motion parameter, where the target position is a position where a target object is located, the target object is an object located within an action range of the first flying object, and the target object corresponds to a first life value; calculating a motion path of the first flying object in the virtual environment according to the target position and the moving speed; and determining the display position of the first flying object in the first picture frame of each frame according to the motion path.
In an optional embodiment, the first receiving module 1010 is configured to receive, in response to the first flyer hitting the target object, a skill effect sent by the server, where the skill effect is generated by the server according to the received first prompt information, and the first prompt information is information generated by a client corresponding to the target object in response to the target object being located within an effect range of the first flyer;
the display module 1020 is configured to display a second image frame according to the skill effect, where the second image frame displays a second life value of the target object, the second life value is smaller than the first life value in the first image frame, and the second image frame is a subset image frame of the first virtual environment image.
In an optional embodiment, the first receiving module 1010 is configured to receive the skill action effect sent by the server in response to that the distance between the first flying object and the target object is less than or equal to the first distance threshold and the first verification of the first prompt information by the server passes;
the first verification is that the server determines that the first distance is smaller than or equal to a second distance after receiving the first prompt message, wherein the first distance is the distance between the first flying object and the target object calculated by the server, and the second distance is the distance between the first flying object and the target object calculated by the client corresponding to the target object.
In an alternative embodiment, the server is configured to create a second flyer based on the skill identification, the second flyer including at least two sub-flyers;
the first receiving module 1010 is configured to receive a second data packet sent by the server, where the second data packet is generated by the server according to a second flying object, and the second data packet includes second motion parameters corresponding to each of the sub-flying objects;
the display module 1020 is configured to calculate a display position of each sub-flying object according to the second motion parameter, and display a second virtual environment picture according to the display position of each sub-flying object, where the second virtual environment picture displays a motion path of each sub-flying object.
In an optional embodiment, the first processing module 1030 is configured to extract, from the second motion parameter, a number, a target position, and a moving speed corresponding to the sub-flying object, where the target position is a position where a target object is located, the target object is an object located within an action range of the second flying object, and the target object corresponds to a first life value; calculating the motion path of each sub-flying object in the virtual environment according to the number, the target position and the moving speed; and determining the display position of each sub-flying object in the third picture frame of each frame according to the motion path.
In an optional embodiment, the first receiving module 1010 is configured to receive, in response to the sub-aircraft hitting the target object, a sub-skill effect sent by the server, where the sub-skill effect is generated by the server according to the received second prompt information, and the second prompt information is information generated by a client corresponding to the target object in response to the target object being located within the range of action of the sub-aircraft;
the display module 1020 is configured to display a fourth image frame according to the sub-skill effect, where the fourth image frame displays a second life value of the target object, the second life value is smaller than the first life value in the third image frame, and the fourth image frame is a subset image frame of the second virtual environment image.
In an optional embodiment, the first receiving module 1010 is configured to receive the sub-skill action effect sent by the server in response to that the distance between the sub-flight object and the target object is less than or equal to the second distance threshold and the second verification of the second prompt information by the server passes;
the second verification is that the server determines that the third distance is smaller than or equal to a fourth distance after receiving the second prompt message, the third distance is the distance between the sub-flying object and the target object calculated by the server, and the fourth distance is the distance between the sub-flying object and the target object calculated by the client corresponding to the target object.
Fig. 11 is a schematic structural diagram illustrating a data synchronization apparatus according to another exemplary embodiment of the present application. The apparatus can be implemented as all or a part of a terminal by software, hardware or a combination of both, and includes:
a second receiving module 1110, configured to receive a skill release request sent by a first client, where the skill release request carries a skill identifier;
a creating module 1120, configured to create a first flyer corresponding to the skill according to the skill identifier;
a second processing module 1130, configured to generate a first data packet according to the first flying object, where the first data packet includes a first motion parameter of the first flying object;
a sending module 1140, configured to send the first data packet to at least two clients, where the at least two clients are configured to load and display the first virtual environment picture according to the first data packet, and the at least two clients include the first client.
In an optional embodiment, the second receiving module 1110 is configured to receive, in response to that the target object is within the range of the first flying object, first prompt information sent by a second client, where the first prompt information is information generated by the second client in response to that the target object is within the range of the first flying object, and the second client corresponds to the target object;
the second processing module 1130 is configured to generate a skill effect according to the first prompt information;
the sending module 1140 is configured to send the skill effect to the first client and the second client, where the first client and the second client display the target object respectively, and the life value of the target object is reduced.
In an alternative embodiment, the second processing module 1130 is configured to determine that the first verification of the first prompt message passes in response to the first distance being less than or equal to the second distance, and generate the skill effect according to the first verification of the first prompt message;
the first distance is the distance between the first flying object and the target object calculated by the server, and the second distance is the distance between the first flying object and the target object calculated by the client corresponding to the target object.
In an optional embodiment, the creating module 1120 is configured to create a second flyer corresponding to the skill according to the skill identifier, where the second flyer includes at least two sub-flyers;
the second receiving module 1110 is configured to receive, in response to that the target object is within an action range of the second flying object, second prompt information sent by a second client, where the second prompt information is generated by the second client in response to that the target object is within the action range of the sub-flying object, and the second client corresponds to the target object;
the second processing module 1130 is configured to generate a sub-skill effect according to the second prompt information;
the sending module 1140 is configured to send the sub-skill effect to a first client and a second client, where the first client and the second client display a target object respectively, and a life value of the target object is reduced.
In an optional embodiment, the second processing module 1130 is configured to determine that the second verification for the second prompt message passes in response to the third distance being less than or equal to the fourth distance; generating a sub-skill effect through second verification according to the second prompt message;
the third distance is the distance between the sub-flying object and the target object calculated by the server, and the fourth distance is the distance between the sub-flying object and the target object calculated by the client corresponding to the target object.
In an optional embodiment, the creating module 1120 is configured to create a first behavior tree corresponding to the first flying object, where the first behavior tree is used to control the first flying object to move in the virtual environment.
In an optional embodiment, the creating module 1120 is configured to create a second behavior tree corresponding to the second flying object, where the second behavior tree is used to control the second flying object to move in the virtual environment.
Fig. 12 shows a block diagram of a computer device 1200 according to an exemplary embodiment of the present application. The computer device 1200 may be a portable mobile terminal, such as: smart phones, tablet computers, MP3 players (Moving Picture Experts Group Audio Layer III, motion video Experts compression standard Audio Layer 3), MP4 players (Moving Picture Experts Group Audio Layer IV, motion video Experts compression standard Audio Layer 4). Computer device 1200 may also be referred to by other names such as user equipment, portable terminals, and the like.
Generally, computer device 1200 includes: a processor 1201 and a memory 1202.
The processor 1201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1201 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1201 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1201 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, the processor 1201 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1202 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1202 is used to store at least one instruction for execution by the processor 1201 to implement the data synchronization methods provided in embodiments of the present application.
In some embodiments, the computer device 1200 may further optionally include: a peripheral interface 1203 and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1204, touch display 1205, camera 1206, audio circuitry 1207, positioning component 1208, and power source 1209.
The peripheral interface 1203 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1201 and the memory 1202. In some embodiments, the processor 1201, memory 1202, and peripheral interface 1203 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1201, the memory 1202 and the peripheral device interface 1203 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1204 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1204 communicates with a communication network and other communication devices by electromagnetic signals. The radio frequency circuit 1204 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 1204 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, etc. The radio frequency circuit 1204 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1204 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 1205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display screen 1205 also has the ability to acquire touch signals on or over the surface of the touch display screen 1205. The touch signal may be input to the processor 1201 as a control signal for processing. The touch display 1205 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display 1205 may be one, providing the front panel of the computer device 1200; in other embodiments, the touch display 1205 can be at least two, respectively disposed on different surfaces of the computer device 1200 or in a folded design; in other embodiments, the touch display 1205 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 1200. Even more, the touch display panel 1205 can be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The touch Display panel 1205 can be made of a material such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Camera assembly 1206 is used to capture images or video. Optionally, camera assembly 1206 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 1206 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1207 is used to provide an audio interface between a user and the computer device 1200. The audio circuitry 1207 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 1201 for processing or inputting the electric signals into the radio frequency circuit 1204 to achieve voice communication. For stereo capture or noise reduction purposes, the microphones may be multiple and located at different locations on the computer device 1200. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1201 or the radio frequency circuit 1204 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1207 may also include a headphone jack.
The positioning component 1208 is used to locate the current geographic location of the computer device 1200 for navigation or LBS (Location Based Service). The positioning component 1208 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou System of China, or the Galileo System of the European Union.
The power supply 1209 is used to supply power to the various components in the computer device 1200. The power supply 1209 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 1209 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the computer device 1200 also includes one or more sensors 1210. The one or more sensors 1210 include, but are not limited to: acceleration sensor 1211, gyro sensor 1212, pressure sensor 1213, fingerprint sensor 1214, optical sensor 1215, and proximity sensor 1216.
The acceleration sensor 1211 may detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with respect to the computer device 1200. For example, the acceleration sensor 1211 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1201 may control the touch display 1205 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1211. The acceleration sensor 1211 may also be used to collect motion data of a game or of the user.
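For illustration only, the following minimal Python sketch shows one way the landscape/portrait decision described above could be made from the gravity components reported by an acceleration sensor; the function and parameter names (choose_orientation, gx, gy) are assumptions introduced here, not part of the embodiment.

def choose_orientation(gx: float, gy: float) -> str:
    """Pick a UI orientation from the gravity components on the x and y axes."""
    # When gravity projects mostly onto the y axis the device is held upright
    # (portrait); when it projects mostly onto the x axis it is held sideways.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

# Device held upright: gravity mainly along y.
print(choose_orientation(gx=0.3, gy=9.7))   # portrait
# Device rotated sideways: gravity mainly along x.
print(choose_orientation(gx=9.6, gy=0.5))   # landscape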
The gyro sensor 1212 may detect the body direction and rotation angle of the computer device 1200, and may cooperate with the acceleration sensor 1211 to collect the user's 3D motion on the computer device 1200. Based on the data collected by the gyro sensor 1212, the processor 1201 can implement functions such as motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1213 may be disposed on a side frame of the computer device 1200 and/or at a lower layer of the touch display 1205. When the pressure sensor 1213 is disposed on the side frame of the computer device 1200, the user's grip signal on the computer device 1200 can be detected, and left/right-hand recognition or shortcut operations can be performed based on the grip signal. When the pressure sensor 1213 is disposed at the lower layer of the touch display 1205, an operability control on the UI can be controlled according to the pressure applied by the user to the touch display 1205. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1214 is used to collect the user's fingerprint so that the user's identity can be identified from the collected fingerprint. When the identity is recognized as trusted, the processor 1201 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1214 may be disposed on the front, back, or side of the computer device 1200. When a physical button or vendor logo is provided on the computer device 1200, the fingerprint sensor 1214 may be integrated with the physical button or vendor logo.
The optical sensor 1215 is used to collect the ambient light intensity. In one embodiment, the processor 1201 may control the display brightness of the touch display 1205 according to the ambient light intensity collected by the optical sensor 1215. Specifically, when the ambient light intensity is high, the display brightness of the touch display 1205 is turned up; when the ambient light intensity is low, the display brightness of the touch display 1205 is turned down. In another embodiment, the processor 1201 may also dynamically adjust the shooting parameters of the camera assembly 1206 based on the ambient light intensity collected by the optical sensor 1215.
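For illustration only, a minimal Python sketch of the brightness adjustment described above; the lux thresholds and the linear ramp are assumptions chosen for demonstration, not values taken from the embodiment.

def brightness_from_ambient(lux: float, low: float = 50.0, high: float = 10000.0) -> float:
    """Map ambient light intensity (lux) to a display brightness level in [0, 1]."""
    if lux <= low:
        return 0.2                      # dim surroundings: keep the screen dim
    if lux >= high:
        return 1.0                      # bright surroundings: full brightness
    # Linear ramp between the two thresholds.
    return 0.2 + 0.8 * (lux - low) / (high - low)

for lux in (10, 500, 5000, 20000):
    print(lux, round(brightness_from_ambient(lux), 2))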
The proximity sensor 1216, also known as a distance sensor, is typically disposed on the front panel of the computer device 1200. The proximity sensor 1216 is used to collect the distance between the user and the front of the computer device 1200. In one embodiment, when the proximity sensor 1216 detects that the distance between the user and the front of the computer device 1200 gradually decreases, the processor 1201 controls the touch display 1205 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1216 detects that the distance gradually increases, the processor 1201 controls the touch display 1205 to switch from the dark-screen state to the bright-screen state.
Those skilled in the art will appreciate that the structure shown in FIG. 12 does not limit the computer device 1200, which may include more or fewer components than shown, combine some components, or use a different arrangement of components.
Fig. 13 shows a schematic structural diagram of a server according to an exemplary embodiment of the present application. The server may be the server 140 in the computer system 100 shown in fig. 1.
The server 1300 includes a Central Processing Unit (CPU) 1301, a system memory 1304 including a Random Access Memory (RAM) 1302 and a Read-Only Memory (ROM) 1303, and a system bus 1305 connecting the system memory 1304 and the central processing unit 1301. The server 1300 also includes a basic Input/Output System (I/O system) 1306 for facilitating information transfer between devices within the computer, and a mass storage device 1307 for storing an operating system 1313, application programs 1314, and other program modules 1315.
The basic input/output system 1306 includes a display 1308 for displaying information and an input device 1309, such as a mouse or keyboard, for the user to input information. The display 1308 and the input device 1309 are connected to the central processing unit 1301 through an input-output controller 1310 connected to the system bus 1305. The input-output controller 1310 may also receive and process input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input-output controller 1310 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1307 is connected to the central processing unit 1301 through a mass storage controller (not shown) connected to the system bus 1305. The mass storage device 1307 and its associated computer-readable media provide non-volatile storage for the server 1300. That is, the mass storage device 1307 may include a computer-readable medium (not shown) such as a hard disk or Compact disk Read Only Memory (CD-ROM) drive.
Computer-readable media may include computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other solid-state memory technology, CD-ROM, Digital Versatile Disc (DVD) or other optical storage, Solid State Drives (SSD), magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). Of course, those skilled in the art will appreciate that computer storage media are not limited to the foregoing. The system memory 1304 and the mass storage device 1307 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1300 may also run on a remote computer connected through a network such as the Internet. That is, the server 1300 may be connected to the network 1312 through the network interface unit 1311 connected to the system bus 1305, or may be connected to other types of networks or remote computer systems (not shown) using the network interface unit 1311.
The memory further includes one or more programs, and the one or more programs are stored in the memory and configured to be executed by the CPU.
In an alternative embodiment, a computer device is provided, the computer device comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the data synchronization method described above.
In an alternative embodiment, a computer-readable storage medium is provided, the storage medium storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the data synchronization method described above.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are for description only and do not represent the merits of the embodiments.
The present application further provides a computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the data synchronization method provided by the above method embodiments.
The present application further provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the data synchronization method provided by the above-mentioned method embodiments.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and executes the computer instructions to cause the computer device to perform the data synchronization method provided by the above method embodiments.
It should be understood that "a plurality" herein means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for synchronizing data, the method comprising:
receiving a skill release operation, and sending a skill release request to a server, wherein the skill release request carries a skill identifier, and the server is used for creating a first flying object corresponding to a skill according to the skill identifier;
receiving a first data packet sent by the server, wherein the first data packet is generated by the server according to the first flying object and comprises a first motion parameter of the first flying object;
and calculating the display position of the first flying object according to the first motion parameter, and displaying a first virtual environment picture according to the display position of the first flying object, wherein the motion path of the first flying object is displayed on the first virtual environment picture.
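For illustration only, the following Python sketch outlines the client-side order of steps recited in claim 1; the transport object, the field names ("skill_id", "motion_param") and the renderer interface are assumptions introduced here, not part of the claim.

def on_skill_release(skill_id: int, server, renderer) -> None:
    # Step 1: send a skill release request carrying the skill identifier; the
    # server creates the first flying object from this identifier.
    server.send({"type": "skill_release", "skill_id": skill_id})

    # Step 2: receive the first data packet, which carries the first motion
    # parameter of the flying object created on the server.
    packet = server.receive()
    motion_param = packet["motion_param"]

    # Step 3: calculate the display position locally from the motion parameter
    # and display the motion path frame by frame (see the sketch after claim 2).
    for position in renderer.positions_from(motion_param):
        renderer.draw(position)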
2. The method of claim 1, wherein said calculating a display position of the first flying object from the first motion parameter comprises:
extracting a target position and a moving speed corresponding to the first flying object from the first motion parameters, wherein the target position is a position where a target object is located, the target object is an object located in an action range of the first flying object, and the target object corresponds to a first life value;
calculating a motion path of the first flying object in a virtual environment according to the target position and the moving speed;
and determining the display position of the first flying object in each first picture frame according to the motion path.
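For illustration only, a Python sketch of the per-frame position calculation described in claim 2, assuming 2D coordinates and a known start position; names such as display_positions are introduced here for demonstration.

from typing import Iterator, Tuple

Point = Tuple[float, float]

def display_positions(start: Point, target: Point, speed: float) -> Iterator[Point]:
    """Yield the flying object's display position for each picture frame,
    advancing `speed` units per frame along the straight line to the target."""
    x, y = start
    tx, ty = target
    while True:
        dx, dy = tx - x, ty - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= speed:            # the target is reached within this frame
            yield (tx, ty)
            return
        x += speed * dx / dist       # advance one frame along the motion path
        y += speed * dy / dist
        yield (x, y)

for frame, pos in enumerate(display_positions((0.0, 0.0), (10.0, 0.0), 3.0), 1):
    print(f"frame {frame}: {pos}")   # frame 1: (3.0, 0.0) ... frame 4: (10.0, 0.0)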
3. The method of claim 2, further comprising:
in response to the first flying object hitting the target object, receiving a skill effect sent by the server, wherein the skill effect is generated by the server according to received first prompt information, and the first prompt information is information generated by a client corresponding to the target object in response to the target object being located in the action range of the first flying object;
and displaying a second picture frame according to the skill effect, wherein the second picture frame displays a second life value of the target object, the second life value is smaller than the first life value in the first picture frame, and the second picture frame is a subset picture frame of the first virtual environment picture.
4. The method of claim 3, wherein the receiving a skill effect sent by the server in response to the first flying object hitting the target object comprises:
receiving the skill effect sent by the server in response to the distance between the first flying object and the target object being smaller than or equal to a first distance threshold and a first verification of the first prompt information by the server being passed;
wherein passing the first verification means that the server determines, after receiving the first prompt information, that a first distance is smaller than or equal to a second distance, the first distance being the distance between the first flying object and the target object calculated by the server, and the second distance being the distance between the first flying object and the target object calculated by the client corresponding to the target object.
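For illustration only, a Python sketch of the first verification in claim 4: the server recomputes the distance between the flying object and the target and accepts the hit only if that distance does not exceed the distance reported by the target's client and is within the hit threshold. The parameter names and the exact comparison are assumptions introduced here.

from typing import Tuple

Point = Tuple[float, float]

def distance(a: Point, b: Point) -> float:
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def first_verification_passes(server_flyer_pos: Point, server_target_pos: Point,
                              client_reported_distance: float,
                              first_distance_threshold: float) -> bool:
    # The "first distance" is recomputed on the server; the "second distance"
    # is the value calculated by the client corresponding to the target object.
    first_distance = distance(server_flyer_pos, server_target_pos)
    return (first_distance <= first_distance_threshold
            and first_distance <= client_reported_distance)

print(first_verification_passes((5.0, 5.0), (5.5, 5.0),
                                client_reported_distance=0.6,
                                first_distance_threshold=1.0))   # True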
5. The method according to any one of claims 1 to 4, wherein the server is configured to create a second flying object according to the skill identifier, the second flying object comprising at least two sub-flying objects;
the method further comprises the following steps:
receiving a second data packet sent by the server, wherein the second data packet is generated by the server according to the second flying object and comprises second motion parameters corresponding to the sub-flying objects;
and calculating the display position of each sub-flying object according to the second motion parameter, and displaying a second virtual environment picture according to the display position of each sub-flying object, wherein the motion path of each sub-flying object is displayed on the second virtual environment picture.
6. The method of claim 5, wherein said calculating the display position of each of the sub-flying objects according to the second motion parameters comprises:
extracting the number, the target position and the moving speed corresponding to the sub-flying objects from the second motion parameters, wherein the target position is the position of a target object, the target object is an object located in the action range of the second flying object, and the target object corresponds to a first life value;
calculating the motion path of each sub-flying object in the virtual environment according to the number, the target position and the moving speed;
and determining the display position of each sub-flying object in each third picture frame according to the motion path.
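For illustration only, a Python sketch showing how the number, target position and moving speed extracted from the second motion parameter could be expanded into one motion path per sub-flying object; the fan-shaped spawn offsets are an assumption made for demonstration, since the claim does not fix a particular spread.

import math
from typing import List, Tuple

Point = Tuple[float, float]

def sub_paths(count: int, start: Point, target: Point, speed: float,
              spread: float = 1.0) -> List[List[Point]]:
    """Return, for each sub-flying object, its display positions per frame."""
    paths: List[List[Point]] = []
    for i in range(count):
        # Spawn the sub-flying objects on a small fan around the start point.
        angle = 2 * math.pi * i / max(count, 1)
        x = start[0] + spread * math.cos(angle)
        y = start[1] + spread * math.sin(angle)
        path: List[Point] = []
        while True:
            dx, dy = target[0] - x, target[1] - y
            dist = math.hypot(dx, dy)
            if dist <= speed:              # target reached within this frame
                path.append(target)
                break
            x += speed * dx / dist
            y += speed * dy / dist
            path.append((x, y))
        paths.append(path)
    return paths

for n, path in enumerate(sub_paths(3, (0.0, 0.0), (6.0, 0.0), 2.0)):
    print(f"sub-flying object {n}: {len(path)} frames, final position {path[-1]}")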
7. The method of claim 6, further comprising:
in response to a sub-flying object hitting the target object, receiving a sub-skill effect sent by the server, wherein the sub-skill effect is generated by the server according to received second prompt information, and the second prompt information is generated by a client corresponding to the target object in response to the target object being located in the action range of the sub-flying object;
and displaying a fourth picture frame according to the sub-skill effect, wherein the fourth picture frame displays a second life value of the target object, the second life value is smaller than the first life value in the third picture frame, and the fourth picture frame is a subset picture frame of the second virtual environment picture.
8. The method of claim 7, wherein the receiving a sub-skill effect sent by the server in response to the sub-flying object hitting the target object comprises:
receiving the sub-skill effect sent by the server in response to the distance between the sub-flying object and the target object being smaller than or equal to a second distance threshold and a second verification of the second prompt information by the server being passed;
wherein passing the second verification means that the server determines, after receiving the second prompt information, that a third distance is smaller than or equal to a fourth distance, the third distance being the distance between the sub-flying object and the target object calculated by the server, and the fourth distance being the distance between the sub-flying object and the target object calculated by the client corresponding to the target object.
9. A method for synchronizing data, the method comprising:
receiving a skill release request sent by a first client, wherein the skill release request carries a skill identifier;
creating a first flying object corresponding to the skill according to the skill identifier;
generating a first data packet according to the first flying object, wherein the first data packet comprises a first motion parameter of the first flying object;
and sending the first data packet to at least two clients, wherein the at least two clients are used for respectively loading and displaying a first virtual environment picture according to the first data packet, and the at least two clients comprise the first client.
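For illustration only, a Python sketch of the server-side order of steps recited in claim 9; the flying-object record, the spawn data, the packet layout and the client interface are assumptions introduced here.

import itertools
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

_ids = itertools.count(1)

@dataclass
class FlyingObject:
    skill_id: int
    position: Tuple[float, float]
    target: Tuple[float, float]
    speed: float
    object_id: int = field(default_factory=lambda: next(_ids))

def handle_skill_release(request: Dict, clients: List) -> FlyingObject:
    # Step 1: the skill release request from the first client carries a skill identifier.
    skill_id = request["skill_id"]

    # Step 2: create the first flying object corresponding to that skill.
    flying_object = FlyingObject(skill_id=skill_id, position=(0.0, 0.0),
                                 target=request["target"], speed=3.0)

    # Step 3: generate the first data packet with the first motion parameter.
    packet = {"object_id": flying_object.object_id,
              "motion_param": {"target": flying_object.target,
                               "speed": flying_object.speed}}

    # Step 4: send the packet to at least two clients (including the requester)
    # so that each of them can load and display the first virtual environment picture.
    for client in clients:
        client.send(packet)
    return flying_object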
10. The method of claim 9, further comprising:
in response to a target object being located in the action range of the first flying object, receiving first prompt information sent by a second client, wherein the first prompt information is information generated by the second client in response to the target object being located in the action range of the first flying object, and the second client corresponds to the target object;
generating a skill effect according to the first prompt information;
and respectively sending the skill effect to the first client and the second client, wherein the target object is respectively displayed on the first client and the second client, and the life value of the target object is reduced.
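For illustration only, a Python sketch continuing the server-side flow for claim 10; the prompt-information shape, the fixed damage value and the client interface are assumptions, and the claim only fixes that the skill effect is generated from the first prompt information and that both clients then show the target object with a reduced life value.

from typing import Dict

def handle_hit_prompt(prompt: Dict, first_client, second_client,
                      life_values: Dict[int, int], damage: int = 10) -> Dict:
    """Generate a skill effect from the prompt sent by the target's client."""
    target_id = prompt["target_id"]
    life_values[target_id] = max(0, life_values[target_id] - damage)

    skill_effect = {"target_id": target_id,
                    "new_life_value": life_values[target_id]}

    # Send the skill effect to both the first client (the caster) and the
    # second client (the target), which each display the reduced life value.
    first_client.send(skill_effect)
    second_client.send(skill_effect)
    return skill_effect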
11. The method of claim 9, further comprising:
creating a second flying object corresponding to the skill according to the skill identifier, wherein the second flying object comprises at least two sub-flying objects;
in response to a target object being located in the action range of the second flying object, receiving second prompt information sent by a second client, wherein the second prompt information is generated by the second client in response to the target object being located in the action range of a sub-flying object, and the second client corresponds to the target object;
generating a sub-skill effect according to the second prompt information;
and sending the sub-skill effect to the first client and the second client respectively, wherein the target object is displayed on the first client and the second client respectively, and the life value of the target object is reduced.
12. A data synchronization apparatus, the apparatus comprising:
the system comprises a first receiving module, a skill releasing module and a second receiving module, wherein the first receiving module is used for receiving skill releasing operation and sending a skill releasing request to a server, the skill releasing request carries a skill identifier, and the server is used for creating a first flyer corresponding to the skill according to the skill identifier;
the first receiving module is configured to receive a first data packet sent by the server, where the first data packet is generated by the server according to the first flying object, and the first data packet includes a first motion parameter of the first flying object;
and the display module is used for calculating the display position of the first flying object according to the first motion parameter, and displaying a first virtual environment picture according to the display position of the first flying object, wherein the motion path of the first flying object is displayed on the first virtual environment picture.
13. A data synchronization apparatus, the apparatus comprising:
the second receiving module is used for receiving a skill release request sent by a first client, wherein the skill release request carries a skill identifier;
the creating module is used for creating a first flying object corresponding to the skill according to the skill identifier;
the processing module is used for generating a first data packet according to the first flying object, and the first data packet comprises a first motion parameter of the first flying object;
and the sending module is used for sending the first data packet to at least two clients, the at least two clients are used for respectively loading and displaying a first virtual environment picture according to the first data packet, and the at least two clients comprise the first client.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, the instruction, the program, the set of codes, or the set of instructions being loaded and executed by the processor to implement a data synchronization method as claimed in any one of claims 1 to 11.
15. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the data synchronization method of any of claims 1 to 11.
CN202110243989.9A 2021-03-05 2021-03-05 Data synchronization method, device, equipment and storage medium Active CN112843682B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110243989.9A CN112843682B (en) 2021-03-05 2021-03-05 Data synchronization method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112843682A (en) 2021-05-28
CN112843682B (en) 2022-07-29

Family

ID=75993632

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110243989.9A Active CN112843682B (en) 2021-03-05 2021-03-05 Data synchronization method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112843682B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180178107A1 (en) * 2008-06-09 2018-06-28 EyezOnBaseball, LLC Interactive scorekeeping and animation generation
CN108310765A (en) * 2017-12-14 2018-07-24 腾讯科技(深圳)有限公司 The display methods and device of image, storage medium, electronic device

Also Published As

Publication number Publication date
CN112843682B (en) 2022-07-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40044181; Country of ref document: HK)
GR01 Patent grant