CN114225413A - Collision detection method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114225413A
CN114225413A
Authority
CN
China
Prior art keywords
collision
target
box
time
collision box
Prior art date
Legal status
Pending
Application number
CN202111585025.9A
Other languages
Chinese (zh)
Inventor
杜敏
蔡倜
朱晟达
Current Assignee
Shanghai Perfect Time And Space Software Co ltd
Original Assignee
Shanghai Perfect Time And Space Software Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Perfect Time And Space Software Co ltd
Priority to CN202111585025.9A
Publication of CN114225413A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/44 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets

Abstract

The application provides a collision detection method, a collision detection device, an electronic device, and a storage medium, wherein the method includes: acquiring a target operation executed at a target client, wherein the target operation is used to trigger a target virtual character in a target game scene to release a target character skill; acquiring, according to the target operation, time shift information of the target collision boxes of the target character skill, wherein each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between time and the position of a collision box; and determining a collision state of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object. The scheme solves the problems in the related art of low collision-determination accuracy, a poor sense of impact, and high computing-power consumption.

Description

Collision detection method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of internet, and in particular, to a collision detection method and apparatus, an electronic device, and a storage medium.
Background
Hit determination is a basic requirement of a game. There may be a plurality of different hit determination methods depending on the type of game.
A turn-based game generally only needs a simple calculation based on hit rate; 2D games and most MMORPGs (Massively Multiplayer Online Role-Playing Games) determine a hit by calculating whether a target coordinate point falls within the skill's action range; meanwhile, action games that emphasize a sense of impact generally employ a physics engine to calculate whether a collision hit occurs.
The first two hit-determination methods are simple and easy to use, but their accuracy is low and the sense of impact is poor, so they are not suitable for action games. The last method is generally applied to stand-alone games and is difficult to apply to a multiplayer online server, because the physics engine consumes a large amount of computing power.
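As a concrete illustration of the second, range-based category above, the sketch below checks whether a target coordinate point lies inside a skill's action range, modeled here as a circular sector in front of the caster. The sector model, function name, and parameters are illustrative assumptions, not taken from the application.

```python
import math

def in_skill_range(caster_pos, facing_deg, target_pos, radius, arc_deg):
    """Hypothetical 2D range check: is target_pos inside a circular sector
    of the given radius and arc, centred on the caster and aimed along
    facing_deg (degrees, counter-clockwise from +x)?"""
    dx = target_pos[0] - caster_pos[0]
    dy = target_pos[1] - caster_pos[1]
    dist = math.hypot(dx, dy)
    if dist > radius:
        return False          # outside the skill's reach
    if dist == 0:
        return True           # target on top of the caster
    angle = math.degrees(math.atan2(dy, dx))
    # signed angular difference folded into [-180, 180)
    diff = abs((angle - facing_deg + 180.0) % 360.0 - 180.0)
    return diff <= arc_deg / 2.0
```

As the text notes, such a check is cheap but coarse: it ignores the skill's motion over time, which is what the time-shift approach below addresses.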
Disclosure of Invention
The application provides a collision detection method, a collision detection device, an electronic device, and a storage medium, which are used to at least solve the problems in the related art of low collision-detection accuracy, a poor sense of impact, and high computing-power consumption.
According to a first aspect of embodiments of the present application, there is provided a collision detection method, including:
acquiring a target operation executed at a target client, wherein the target operation is used to trigger a target virtual character in a target game scene to release a target character skill; in response to the target operation, acquiring time shift information of the target collision boxes of the target character skill, wherein each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between time and the position of the collision box; and determining a collision state of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
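The three claimed steps can be sketched as a server-side routine. The data layout, field names, and the injected collision check below are all assumptions made for illustration; the application does not prescribe them.

```python
def handle_target_operation(operation, skill_boxes, object_boxes, check_collision):
    """Hypothetical sketch of the claimed flow:
    1) receive the target operation from the target client;
    2) look up time-shift info for every target collision box associated
       with the released skill;
    3) determine the collision state against each collision-object box."""
    skill_id = operation["skill_id"]
    results = {}
    for box in skill_boxes.get(skill_id, []):
        time_shift = box["time_shift"]  # trajectory file + release info
        for obj_box in object_boxes:
            hit = check_collision(time_shift, obj_box)
            results[(box["id"], obj_box["id"])] = "collided" if hit else "none"
    return results
```

The `check_collision` callback stands in for whichever of the static or dynamic implementation manners below applies to the object box.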
According to a first possible implementation manner of the first aspect of the embodiments of the present application, the time shift information includes a first preset trajectory file and first real-time release information, and acquiring the time shift information of the target collision box of the target character skill in response to the target operation includes: determining the first preset trajectory file associated with the target collision box; and acquiring the first real-time release information based on the state of the target virtual character when releasing the target character skill, wherein the first real-time release information includes a first real-time position, a first real-time direction, and a first release time of the target virtual character when releasing the target character skill.
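One plausible reading of this implementation manner is that the preset trajectory file is authored in the caster's local frame, and the real-time release information (position, direction, release time) maps it into world space at release. A minimal 2D sketch under that assumption follows; the class and function names are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class ReleaseInfo:
    """Assumed shape of the 'first real-time release information'."""
    position: tuple       # caster's world position at release
    direction_deg: float  # caster's facing at release
    release_time: float   # server time of release

def local_to_world(local_offset, release):
    """Rotate a trajectory-file offset (caster-local frame) by the release
    direction and translate by the release position. 2D convention assumed."""
    ox, oy = local_offset
    a = math.radians(release.direction_deg)
    wx = release.position[0] + ox * math.cos(a) - oy * math.sin(a)
    wy = release.position[1] + ox * math.sin(a) + oy * math.cos(a)
    return (wx, wy)
```

Under this reading, the same trajectory file can drive a skill released from any position and facing, which is why only the release-time state needs to be captured live.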
According to a second possible implementation manner of the first aspect of the embodiments of the present application, determining the collision state of the target collision box and the collision object collision box based on the time shift information includes: acquiring fixed position information of the collision object collision box in the case that the collision object collision box is associated with a static collision object; and determining the collision state of the target collision box and the collision object collision box based on the first preset trajectory file, the first real-time release information, and the fixed position information.
According to a third possible implementation manner of the first aspect of the embodiments of the present application, determining the collision state of the target collision box and the collision object collision box based on the time shift information includes: in the case that the collision object collision box is associated with a dynamic collision object, acquiring a second preset trajectory file and second real-time release information of the collision object collision box; and determining the collision state of the target collision box and the collision object collision box based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information.
According to a fourth possible implementation manner of the first aspect of the embodiments of the present application, determining the collision state of the target collision box and the collision object collision box based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information includes: calculating the position of the target collision box and the position of the collision object collision box respectively by interpolation based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information, and determining the collision state of the target collision box and the collision object collision box according to the positional relationship between the target collision box and the collision object collision box.
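The interpolation step described here might look as follows: sample each box's centre from its (time, position) keyframes at a common time, then test the positional relationship. Treating the boxes as spheres for the overlap test is an assumed simplification, not the application's prescribed shape.

```python
def lerp_position(keyframes, t):
    """Linearly interpolate a box centre at time t from a sorted list of
    (time, (x, y, z)) keyframes, clamping outside the keyframe range."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return tuple(a + (b - a) * u for a, b in zip(p0, p1))

def boxes_collide_at(t, frames_a, frames_b, radius_a, radius_b):
    """Sample both boxes at the same instant and test overlap
    (sphere-vs-sphere, squared distances to avoid a sqrt)."""
    pa = lerp_position(frames_a, t)
    pb = lerp_position(frames_b, t)
    d2 = sum((x - y) ** 2 for x, y in zip(pa, pb))
    return d2 <= (radius_a + radius_b) ** 2
```

Because both trajectories are precomputed, the server only evaluates this cheap interpolation per tick instead of running a full physics simulation.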
According to a fifth possible implementation manner of the first aspect of the embodiments of the present application, the method further includes: acquiring collision characteristics of the target collision box, wherein the collision characteristics include the types of objects that can be collided with and/or the number of allowed collisions; and determining the collision state of the target collision box and the collision object collision box based on the time shift information includes: determining the collision state of the target collision box and the collision object collision box based on the time shift information and the collision characteristics.
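A sketch of how these collision characteristics could gate detection, assuming "collidable times" means a cap on how many collisions the box may register; the class and field names are hypothetical.

```python
class CollisionFilter:
    """Hypothetical per-box collision characteristics: a whitelist of
    collidable object types and a maximum number of registered hits."""

    def __init__(self, collidable_types, max_hits):
        self.collidable_types = set(collidable_types)
        self.max_hits = max_hits
        self.hits = 0

    def try_register_hit(self, obj_type):
        """Accept the hit only if the object type is collidable and the
        box has collisions remaining; otherwise ignore it."""
        if obj_type not in self.collidable_types:
            return False
        if self.hits >= self.max_hits:
            return False
        self.hits += 1
        return True
```

Under this reading, a piercing projectile might allow several hits against players while ignoring terrain, all without extra geometric work.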
According to a sixth possible implementation manner of the first aspect of the embodiments of the present application, the preset trajectory file includes at least one of an appearance time, a disappearance time, a real-time trajectory position, scaling information, and rotation information of the collision box.
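For illustration only, a preset trajectory file with these fields might be laid out as below; the key names and units are assumptions, not the application's actual file format.

```python
# Hypothetical in-memory layout of a preset trajectory file.
trajectory_file = {
    "appear_time": 0.2,   # seconds after skill release the box appears
    "vanish_time": 1.5,   # seconds after release the box disappears
    "keyframes": [
        {"t": 0.2, "pos": [0, 0, 0],  "scale": [1, 1, 1], "rot_deg": [0, 0, 0]},
        {"t": 1.5, "pos": [12, 0, 0], "scale": [1, 1, 1], "rot_deg": [0, 90, 0]},
    ],
}

def box_active(traj, t):
    """A collision box participates in detection only between its
    appearance time and disappearance time."""
    return traj["appear_time"] <= t <= traj["vanish_time"]
```

Limiting each box to its active window is another way the scheme trims work: boxes outside the window never reach the positional test.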
According to a seventh possible implementation manner of the first aspect of the embodiments of the present application, the method further includes: modifying the preset trajectory file; and/or modifying the association relationship between the target collision box and the preset trajectory file.
According to an eighth possible implementation manner of the first aspect of the embodiments of the present application, the method further includes: modifying the preset trajectory file based on a physics visual debugging tool and the collision state.
According to a second aspect of embodiments of the present application, there is provided a collision detection apparatus, including: an acquisition unit, configured to acquire a target operation executed at a target client, wherein the target operation is used to trigger a target virtual character in a target game scene to release a target character skill; a response unit, configured to acquire, in response to the target operation, time shift information of the target collision boxes of the target character skill, wherein each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between time and the position of the collision box; and a determination unit, configured to determine a collision state of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
According to a third aspect of embodiments herein, there is also provided an electronic device comprising a processor and a memory, the memory for storing a computer program; a processor for performing the method steps in any of the above embodiments by running the computer program stored on the memory.
According to a fourth aspect of the embodiments of the present application, there is also provided a computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to perform the method steps in any of the above embodiments when the computer program is executed.
To address the problems in the prior art, the server acquires a target operation executed at a target client, where the target operation triggers a target virtual character in a target game scene to release a target character skill. The server then, in response to the target operation, acquires time shift information of the target collision boxes of the target character skill, where each target character skill is associated with at least one target collision box and the time shift information determines the correspondence between time and the position of the collision box. Finally, the server determines the collision state of the target collision box and a collision object collision box based on the time shift information, where the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object. The server therefore only needs the time shift information of the target collision box of the target character skill, and calculates from it whether the target collision box and the collision object collision box collide, thereby determining whether the target operation issued at the target client hits the collision object. When a physics engine computes collisions, it must simulate the entire scene, which is computationally expensive, and applying it directly on a multiplayer online server causes performance bottlenecks. The collision detection method of the present application instead derives the key data of the motion trajectory to obtain the time shift information of the collision box and adopts a simplified calculation, which greatly reduces the computation required for collision detection and makes collision detection usable on the server while preserving the realistic sense of impact that physics-engine collision provides.
Accordingly, the above-mentioned apparatus, electronic device, and storage medium achieve the same effects. The above description is only an outline of the technical solutions of the embodiments of the present application, which can be implemented according to the content of the specification. To make the aforementioned and other objects, features, and advantages of the embodiments of the present application more comprehensible, specific embodiments of the present application are described below.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
FIG. 1 is a schematic diagram of a hardware environment for an alternative collision detection method according to an embodiment of the invention;
FIG. 2 is a schematic flow chart diagram of a collision detection method according to an embodiment of the present application;
FIG. 3 is a block diagram of an alternative collision detection apparatus according to an embodiment of the present application;
fig. 4 is a block diagram of an alternative electronic device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, some of the nouns or terms appearing in the description of the embodiments of the present application are explained as follows:
Unity Editor: a game editor, mainly used as a tool for creating game scenes and game objects.
Collision box (Collider): Unity Editor mainly provides the following collider types: BoxCollider, SphereCollider, CapsuleCollider, MeshCollider, and WheelCollider. The main function of a collision box is to add object collisions to the scene, which can produce, within the game, physical phenomena consistent with the real world (including physical effects such as blocking, friction, and bouncing).
According to a first aspect of embodiments of the present application, there is provided a collision detection method. Optionally, in this embodiment, the collision detection method may be applied to a hardware environment formed by the terminal 102 and the server 104 as shown in fig. 1. As shown in fig. 1, the server 104 is connected to the terminal 102 through a network and may be configured to provide services (e.g., game services, application services) for the terminal or for a client installed on the terminal. A database may be configured on the server, or separately from the server, to provide data storage services for the server 104.
The network includes, but is not limited to, at least one of: a wired network, a wireless network, which may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, or a local area network, which may include, but is not limited to, at least one of: bluetooth, WIFI (Wireless Fidelity), and other networks that enable Wireless communication. The terminal 102 may be a terminal for computing data, such as a mobile terminal (e.g., a mobile phone, a tablet computer), a notebook computer, a PC, and the like. The server may include, but is not limited to, any hardware device capable of performing computations.
The collision detection method according to the embodiments of the present application may be executed by the server 104, by the terminal 102, or by the server 104 and the terminal 102 together. The part of the method executed by the terminal 102, whether alone or in cooperation with the server 104, may also be executed by a client installed on the terminal.
Taking the server 104 to execute the collision detection method in this embodiment as an example, fig. 2 is a schematic flowchart of an alternative collision detection method according to an embodiment of the present application, and as shown in fig. 2, the flowchart of the method may include the following steps:
step S202, obtaining target operation executed at a target client, wherein the target operation is used for triggering a target virtual character in a target game scene to release target character skill.
The collision detection method can be applied to hit determination in game applications, for example, hit determination in shooting games. While a shooting game mission runs, the participating game players may be divided into two camps, where the players of one camp cooperate to defeat or kill the players of the other camp. In this embodiment, when a collision box associated with an attack operation performed by a player-controlled virtual character is detected to collide with a collision box associated with a virtual character controlled by another player in the battle, whether in the opposing camp or the same camp, a hit is determined, ensuring normal operation of the game.
In some examples, the gaming application may be a multiplayer online tactical sports gaming application. The types of game applications may include, but are not limited to, Three dimensional (3D) game applications, Virtual Reality (VR) game applications, Augmented Reality (AR) game applications, and Mixed Reality (MR) game applications. The above is merely an example, and the present embodiment is not limited to this.
In some examples, the shooting game application may be a Third-Person Shooting (TPS) game application, executed from the perspective of a third-party character object other than the virtual character controlled by the current player, or a First-Person Shooting (FPS) game application, executed from the perspective of the virtual character controlled by the current player. Correspondingly, the sound-source virtual objects that generate sound while a game task runs may be, but are not limited to: a virtual character (also referred to as a player character) controlled by a player at each game application client, a Non-Player Character (NPC), a prop object (such as a gun) controlled by a virtual character, and a carrier object (such as a vehicle) controlled by a virtual character. The above is merely an example, and this embodiment is not limited thereto.
In some examples, the target game may be executed by a background server of the target game alone, that is, the target game is executed by the server alone, the client is used only to display a game screen of the target game, and to acquire an operation on the game screen (for example, an operation position on the game screen) and synchronize to the background server; the client of the target game and the background server of the target game may also execute together, that is, the client and the background server execute part of logic of the target game respectively, which is not limited in this embodiment.
In some examples, a processing logic that takes a target game as a target network game and a background server and a client execute the target network game together is taken as an example for description, where the client acquires operation information of a user and synchronizes the operation information to the background server, and the background server executes the processing logic of game operation and synchronizes a processing result to a relevant client, and the collision detection method in this embodiment is also applicable.
For example, a client of the target network game may be run on a terminal device of a user or a player, and the client may be communicatively connected to a background server of the target network game. The user can log in to the client running on the terminal equipment by using an account number, a password, a dynamic password, a related application login and the like.
Illustratively, a target user may log into a target client of a target network game using a target account. A target game scene (e.g., a multiplayer online tactical competitive game scene) of the target network game (e.g., a multiplayer online tactical competitive game) may be displayed on the target client. The target user may control the virtual character corresponding to the target account (i.e., the target virtual character) to perform game operations in the target game scene, for example, moving on the game map, using scene props or character skills, performing game tasks, and interacting with other players.
In some examples, other users may likewise control their virtual characters to perform the same or similar game operations in the target game scene in the same or similar manner, and a screen in which the virtual characters controlled by those other users perform game operations may be displayed on the target client.
In some examples, a target virtual character may release a target character skill in the target game scene. A target character skill is associated with a target virtual character, and one target virtual character may be associated with multiple target character skills. A target character skill may be any skill the virtual character has in the game, such as an attack skill or a healing skill. Target character skills may also be special skills such as ultimate skills, finishing skills, awakening skills, or comeback skills. For example, each virtual character may have its own set of character skills, and by switching virtual characters the player can use different character skills.
Step S204, in response to the target operation, obtaining time shift information of target collision boxes of the target character skills, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine a correspondence between time and position of the collision box.
In some examples, in response to the acquired target operation, the target client may control the target virtual character to release the target character skill, for example, a character skill that detaches from the target virtual character, such as throwing a cannonball outward; the target virtual character may also have limb-motion character skills such as punching or kicking outward. The target client detects the target operation that the user issues for the target virtual character, then uploads the detected target operation to the background server, which executes it. For example, the client detects the user's target operation of making the target virtual character release the skill of throwing a cannonball outward, synchronizes that target operation to the background server, and the background server executes the operation and performs collision detection. The execution mode of the target operation may be configured as required, which is not limited in this embodiment.
In some examples, the target character skill may be directed at a specific object. For example, the specific object may be a target scene object, and the target client may display a screen in which the target virtual character releases the target character skill toward the target scene object. The specific object may also be a movable virtual character controlled by another player, and the target client may display a screen in which the target virtual character releases the target character skill toward the other virtual character. The target operation may also not be directed at any specific object, in which case the target client may display a screen in which the target virtual character releases the target character skill along the launch direction.
For example, a virtual character controlled by a user may fire a cannonball toward a target scene object or toward other movable virtual characters.
In some examples, the target collision box may be associated with a character skill that can detach from the target virtual character. For example, if that character skill launches multiple projectiles at once, each projectile may be bound to a target collision box.
In some examples, the target collision box may be associated with limb-motion character skills of the target virtual character, such as punching or kicking outward; for example, the target collision box may be bound to the target virtual character's foot or hand.
In some examples, the time shift information may describe how the position of the collision box changes with time, that is, it may include the motion trajectory of the collision box. For example, if the character skill launches multiple projectiles that detach from the target virtual character, the positions of their collision boxes change with time; likewise, the position of the collision box of a target virtual character punching or kicking outward changes with time.
And step S206, determining the collision state of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box comprises a collision box related to a collision object and/or a collision box related to the object skill of the collision object.
In some examples, the target collision box may collide with a fixed virtual object in the target game scene while moving along its motion trajectory. The fixed virtual object itself may serve as a collision object, and may be associated with at least one collision object collision box. The collision may be: a collision between the target collision box of the target character skill and a collision object collision box of a fixed virtual object in the scene.
For example, the fixed virtual object may be any scene object that allows interaction in the target game scene, and may be, but is not limited to, one of the following: a terrain object, a building, a tree, or a wooden box.
In some examples, the target collision box may collide with a moving virtual object in the target game scene while moving along its motion trajectory. The moving virtual object itself may serve as a collision object, and may be associated with at least one collision object collision box. The collision may be: a collision between the target collision box of the target character skill and a collision object collision box of a moving virtual object in the scene.
For example, the moving virtual object may be any moving virtual object in the target game scene that allows interaction, such as, but not limited to, a moving creature, which may be a player character or a non-player creature.
In some examples, the target collision box may collide with character skills released by other virtual characters in the target game scene during its movement along the motion trajectory. The character skills released by the other virtual characters can be used as collision objects, and at least one collision object collision box can be associated with the character skills released by the other virtual characters. The collision may be: collisions occur between the target collision box of the target character skill and collision object collision boxes of character skills released by other virtual characters in the scene.
For example, the character skills released by other virtual characters may be character skills released by any mobile virtual object that allows interaction in the target game scene.
In some examples, the collision state may be divided into collision and non-collision. When it is determined from the time shift information of the target collision box, that is, the information describing how the position of the collision box changes over time, that the motion trajectories of the target collision box and the collision object collision box overlap, it may be determined that a collision occurs between the two. It may further be determined that the target character skill of the target virtual object hits the collision object associated with the collision object collision box, or hits the object skill corresponding to that collision object. In addition, because the collision detection method computes the motion trajectories of multiple collision boxes of a virtual character simultaneously, the virtual character is prevented from being displaced while holding the same action, and when the network condition is good, the foot-sliding problem caused by a mismatch between character movement and animation can be greatly suppressed.
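Deciding between the collision and non-collision states from time-shift information can be done by sampling both boxes' positions over time and testing overlap at each instant. A hedged 1-D sketch (the box sizes, speeds, and sampling step are illustrative assumptions):

```python
def boxes_overlap(c1, half1, c2, half2):
    # Two 1-D axis-aligned boxes overlap when the distance between
    # their centres is no larger than the sum of their half-extents.
    return abs(c1 - c2) <= half1 + half2

def collision_state(pos1, pos2, half1, half2, t_end, dt=0.1):
    """Step both boxes along their time-position functions and report the
    first instant they overlap, or non-collision within the time window."""
    steps = int(t_end / dt) + 1
    for i in range(steps + 1):
        t = i * dt
        if boxes_overlap(pos1(t), half1, pos2(t), half2):
            return ("collision", t)
    return ("no_collision", None)

# A shell box moving at 5 m/s versus a static box centred at x = 10:
state, t_hit = collision_state(lambda t: 5.0 * t, lambda t: 10.0,
                               0.5, 0.5, t_end=3.0)
```

The surfaces first touch at t = 9 / 5 = 1.8 s, which the fixed-step scan recovers; a production server would likely use coarser keyframes plus interpolation, as the later embodiments describe.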
In some examples, the collision detection method can be applied to a server, which avoids the cheating problem that arises when a networked game computes collisions on the client and the server simply trusts the client's hit judgment, thereby further improving the user's game experience.
Through steps S202 to S206, the target operation executed at the target client is acquired, where the target operation is used to trigger the target virtual character in the target game scene to release the target character skill. Then, in response to the target operation, the server acquires the time shift information of the target collision boxes of the target character skill, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between the time and position of a collision box. Finally, the server determines the collision state of the target collision box and the collision object collision box based on the time shift information, where the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object. In this way, the server only acquires the time shift information of the target collision box from the target character skill and computes, based on that information, whether the target collision box and the collision object collision box collide, thereby determining whether the target operation specified by the target client hits the collision object. When a physics engine computes collisions, simulating the entire scene requires a large amount of computation, and applying the physics engine directly on a multi-user online server would create a performance bottleneck. The collision detection method of the present application instead derives the time shift information of the collision box from the key data of the motion trajectory, a simplified computation that greatly reduces the amount of collision detection work and improves the usability of collision detection on the server while preserving the realistic sense of impact that physics-engine collision computation provides.
Moreover, because the server performs the collision detection judgment, the hidden danger of cheating tools is eliminated and the user's game experience is improved.
As an alternative embodiment, the time shift information includes a first preset track file and first real-time release information, and the step of obtaining the time shift information of the target collision box of the target character skill in response to the target operation may include:
s11, determining the first preset track file associated with the target collision box;
S12, acquiring the first real-time release information based on the state of the target virtual character when releasing the target character skill, wherein the first real-time release information includes a first real-time position, a first real-time direction, and a first release time of the target virtual character when releasing the target character skill.
For example, the first preset track file may be the motion trajectory information of the target character skill set in advance by art editors and planners, unaffected by the release direction, release manner, and the like. For example, if the target character skill is a punching skill, the first preset track file may be the preset motion of the punch itself; the motion trajectory information of the target skill collision box associated with the punching skill, unaffected by the release direction, release manner, and the like, is the motion trajectory of the punching action.
For example, if the target character skill is a single shell fired outward that can detach from the target virtual character, the first preset trajectory file may be a preset motion trajectory of the shell fired along a straight line; the trajectory information of the target skill collision box associated with that shell skill, unaffected by the release direction, release manner, and the like, is a straight forward-fired motion trajectory.
For example, if the target character skill is a volley of shells scattered outward that can detach from the target virtual character, the first preset trajectory file may be preset motion trajectories of shells fired along straight lines or at deflected angles; the trajectory information of the multiple target skill collision boxes associated with that scattered-shell skill, unaffected by the release direction, release manner, and the like, consists of those straight or deflected firing trajectories.
Illustratively, the actual motion trajectory of the target collision box is influenced not only by the motion trajectory information of the target character skill itself, but also by the release time, release manner, and the like. Therefore, the first real-time release information may be acquired based on the state of the target virtual character when releasing the target character skill, where the first real-time release information includes a first real-time position, a first real-time direction, and a first release time of the target virtual character at the moment of release.
For example, suppose the target character skill is a single forward-fired shell that can detach from the target virtual character. The first real-time position places the target virtual character in front of the collision object, but the first real-time direction points the target character skill away from the collision object, and at the first release time the target virtual character and the collision object are relatively static. Since the first preset track file is a preset motion trajectory of a shell fired along a straight line, combining it with the release information shows that the motion trajectory of the target skill collision box, once the release direction and release manner are applied, cannot overlap the collision box associated with the collision object. It can therefore be determined that no collision occurs between the target collision box and the collision object collision box, and further that the target character skill of the target virtual object does not hit the collision object associated with the collision object collision box.
Through this embodiment, both the first preset track file and the first real-time release information of the target character skill are considered, so the collision detection judgment can be made more accurately; the realistic sense of impact of physics-engine collision computation is preserved with an optimized amount of computation, and the accuracy of collision detection is improved.
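Combining a preset local trajectory with the real-time release information (position, direction, and release time) might look like the following sketch; the function names, 2-D rotation convention, and the 5 m/s straight shot are assumptions for illustration, not details from the source:

```python
import math

def world_position(local_offset, release_pos, release_dir_deg, release_time, t):
    """Map a point of the preset (local) trajectory into world space using the
    first real-time release information: release position, direction, time."""
    if t < release_time:
        return None                      # the skill has not been released yet
    dx, dy = local_offset(t - release_time)
    c = math.cos(math.radians(release_dir_deg))
    s = math.sin(math.radians(release_dir_deg))
    # Rotate the local offset by the release direction, then translate
    # by the release position.
    return (release_pos[0] + c * dx - s * dy,
            release_pos[1] + s * dx + c * dy)

# Hypothetical preset trajectory: a shell flying at 5 m/s along the local x axis.
straight_shot = lambda dt: (5.0 * dt, 0.0)
```

The same preset file can thus serve every release: only the three real-time quantities change, which is what keeps the stored trajectory data independent of the release direction and release manner.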
As an alternative embodiment, the step of determining the collision status of the target collision box and the collision object collision box based on the time shift information may include:
s21, acquiring the fixed position information of the collision object collision box when the collision object collision box is associated with a static collision object.
S22, determining a collision status of the target collision box with the collision object collision box based on the first preset trajectory file, the first real-time release information, and the fixed position information.
In some examples, the collision object collision box may be associated with a static collision object, for example a terrain object, a building, a tree, or another static object. Because the position information of a static collision object's collision box is fixed, only the fixed position information of that collision box needs to be acquired.
Illustratively, the target character skill is a single forward-fired shell that can detach from the target virtual character, and the collision object is a stationary building. The first real-time position places the target virtual character in front of the building, the first real-time direction points the target character skill toward the building, the target virtual character and the building are relatively static at the first release time, and the release position is 50 meters from the building. The first preset track file is a preset motion trajectory of a shell fired along a straight line with a range of 100 meters. Combining the track file with the release information, the motion trajectory of the target skill collision box, once the release direction and release manner are applied, overlaps the collision box associated with the collision object. It can be determined that a collision occurs between the target collision box and the collision object collision box, and further that the target character skill of the target virtual object hits the collision object associated with the collision object collision box.
Through this embodiment, when the collision object collision box is associated with a static collision object, collision detection only requires acquiring the fixed position information of the collision object collision box and combining it with the first preset track file and the first real-time release information; this further optimizes the amount of computation while preserving the realistic sense of impact of physics-engine collision computation, and improves collision detection efficiency.
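For a static collision object, only its fixed box position is needed, so the projectile's straight-line trajectory can be marched against it directly. A minimal 1-D sketch using the numbers of the example above (release 50 m from the building, 100 m range; the step size and box half-extent are assumptions):

```python
def hits_static_box(release_pos, direction, speed, max_range,
                    target_center, target_half, dt=0.1):
    """March a straight-line projectile from its release position and test it
    against the fixed position of a static collision object's box (1-D)."""
    travelled = 0.0
    while travelled <= max_range:
        pos = release_pos + direction * travelled
        if abs(pos - target_center) <= target_half:
            return True          # trajectory enters the fixed box: hit
        travelled += speed * dt
    return False                 # range exhausted without contact
```

With the target 50 m away and a 100 m range the march reports a hit; a target at 150 m, or a shot fired the other way, does not.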
As an alternative embodiment, the step of determining the collision status of the target collision box and the collision object collision box based on the time shift information may include:
s31, acquiring a second preset track file and second real-time release information of the collision object collision box under the condition that the collision object collision box is associated with the dynamic collision object;
s32, determining a collision status between the target crash box and the collision object crash box based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information.
In some examples, the collision object collision box may be associated with a dynamic collision object, for example any moving virtual object in the target game scene that allows interaction, such as, but not limited to, a moving creature, which may be a player character or a non-player creature. In that case the second preset track file and the second real-time release information of the collision object collision box are acquired.
Illustratively, the target character skill is a single forward-fired shell that can detach from the target virtual character, and the collision object is another, moving virtual character. The first real-time position and the second release position indicate that the target virtual character is 10 meters away from the other moving virtual character, and the first real-time direction and the second real-time direction are the same; that is, the target character skill is fired along the direction in which the other virtual character is moving. The first release time is the same as the second release time. The second preset track file indicates that the moving virtual character moves in a straight line at 1 meter per second, and the first preset track file indicates that the shell moves in a straight line at 5 meters per second. Combining the first preset track file, the first real-time release information, the second preset track file, and the second real-time release information, the motion trajectory of the target skill collision box, once the release direction and release manner are applied, overlaps the collision box associated with the other moving virtual character. It can be determined that a collision occurs between the target collision box and the collision object collision box, and further that the target character skill of the target virtual object hits the collision object associated with the collision object collision box.
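Under one plausible reading of the example's numbers (the shell, at 5 m/s, chases a character 10 m away who is moving away at 1 m/s along the firing direction), the collision instant follows directly from the closing speed. A hedged sketch:

```python
def intercept_time(gap, shell_speed, target_speed):
    """Time for a shell fired along the target's movement direction to close an
    initial gap, with both following straight-line preset trajectories."""
    closing_speed = shell_speed - target_speed
    if closing_speed <= 0:
        return None          # the shell never catches up: non-collision
    return gap / closing_speed
```

Here the boxes meet after 10 / (5 - 1) = 2.5 s; reversing the speeds yields the non-collision state, since the gap only grows.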
Through this embodiment, when the collision object collision box is associated with a dynamic collision object, the first preset track file, the first real-time release information, the second preset track file, and the second real-time release information are acquired together for collision detection, which can further improve the accuracy of collision detection and the realism of the game.
As an alternative embodiment, the determining the collision status of the target crash box with the collision object crash box based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information may include:
S41, calculating the position of the target collision box and the position of the collision object collision box by an interpolation method based on the first preset track file, the first real-time release information, the second preset track file, and the second real-time release information, and determining the collision state between the target collision box and the collision object collision box according to the positional relationship between the two.
For example, interpolation uses the known values of a function f(x) at several points within an interval to determine a suitable specific function, and takes the values of that specific function at other points in the interval as approximate values of f(x) at those points. The interpolation method may include any one or a combination of inverse distance weighting, kriging, minimum curvature, multiple regression, radial basis functions, linear interpolation, natural neighbor interpolation, nearest neighbor interpolation, and other interpolation algorithms, without particular limitation. Using interpolation can further reduce the amount of computation for collision detection and improve detection efficiency.
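As one concrete instance of the listed methods, linear interpolation between two sampled box separations can estimate the contact instant without stepping every frame in between; the 1-D separation values below are illustrative assumptions:

```python
def interpolate_contact_time(t0, s0, t1, s1):
    """Linear interpolation between two sampled separations: s0 > 0 at t0
    (boxes apart) and s1 <= 0 at t1 (boxes overlapping) bracket the contact
    instant, which is estimated where the interpolant crosses zero."""
    return t0 + s0 * (t1 - t0) / (s0 - s1)

# Surfaces 9 m apart closing at 4 m/s: separation 9 at t=0, -3 at t=3,
# so the estimated contact time is 9 * 3 / 12 = 2.25 s.
```

For boxes on straight-line preset trajectories the estimate is exact; for curved trajectories it is an approximation whose error shrinks with the keyframe spacing.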
As an optional implementation, the method may further include:
S51, acquiring collision characteristics of the collision box, wherein the collision characteristics comprise the types of collidable objects and/or the number of allowed collisions;
the determining a collision status of the target collision box and the collision object collision box based on the time shift information may include:
s52, determining the collision state of the target collision box and the collision object collision box based on the time shift information and the collision characteristics.
In some examples, the collision characteristics of the collision box, including the types of collidable objects and/or the number of allowed collisions, may also need to be considered when determining the collision state. For example, the target character skill is a single forward-fired shell that can detach from the target virtual character, and the potential collision object is a teammate. If the collidable object types do not include teammates, then even if the motion trajectory of the target skill collision box associated with the shell, once the release direction and release manner are applied, can overlap the teammate's collision box, it is determined that no collision occurs between the target collision box and the collision object collision box. Through this embodiment, collision detection can be personalized according to game settings, improving the user experience.
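The collidable-type and collision-count characteristics amount to a filter applied before a geometric overlap is reported as a hit; a minimal sketch (the type labels are hypothetical):

```python
def passes_collision_filter(object_type, allowed_types, hits_so_far, max_hits):
    """Apply the collision characteristics before reporting a hit: the object's
    type must be collidable and the box must have collisions remaining."""
    return object_type in allowed_types and hits_so_far < max_hits
```

So a shell configured not to collide with teammates ignores them even when the trajectories overlap, and a box whose collision count is exhausted stops registering hits.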
As another alternative embodiment, the preset trajectory file may include at least one of an appearance time, a disappearance time, a real-time trajectory position, zoom information, and rotation information of the crash box.
In some examples, the appearance time, disappearance time, real-time trajectory position, zoom information, and rotation information of the collision box may also be taken into account when determining the collision state.
For example, the appearance time of the collision box may fall a predetermined interval after the release time of the target character skill; that is, there may be a waiting period between releasing the target character skill and the collision box appearing. Taking a shell-firing skill as an example, the release time of the target character skill is the moment the firing command is issued, while the appearance time of the collision box is the moment the shell is actually fired. Clearly, the moment the shell is actually fired affects whether the motion trajectory of the shell's target skill collision box can overlap the collision boxes associated with other moving virtual characters. Similarly, the disappearance time of the collision box also affects whether such an overlap can occur.
In some examples, the zoom information of a collision box affects the collision detection result by determining whether two collision boxes whose centers are a given distance apart can collide. Taking a punching skill as an example: when the target collision box associated with the target virtual character's punch is not zoomed, the motion trajectories of the target collision box and the collision box associated with the collision object come close but do not overlap. If the zoom information indicates that the target collision box is magnified, however, the two boxes do come into contact, and it is determined that a collision occurs between the target collision box and the collision object collision box. This can further improve the accuracy of collision detection and the realism of the game.
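The appearance time, disappearance time, and zoom information can all gate the overlap test, as in this 1-D sketch (the field names and numeric values are assumptions for illustration):

```python
def box_hit(t, appear, vanish, center1, half1, scale1, center2, half2):
    """Overlap test honouring two fields of the preset trajectory file:
    the box's appearance/disappearance window and its zoom (scale) factor."""
    if not (appear <= t <= vanish):
        return False                      # box does not exist at this instant
    # The first box's half-extent is enlarged (or shrunk) by its zoom factor.
    return abs(center1 - center2) <= half1 * scale1 + half2
```

With centres 3 m apart and half-extents of 1 m each, the unscaled boxes miss; a 2.5x zoom makes them touch, and outside the appearance window nothing collides at all.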
As an alternative embodiment, the method may further include:
S61, modifying the preset track file; and/or,
and S62, modifying the association relationship between the target collision box and the preset track file.
In some examples, art editors and planners may modify the preset track file according to user requirements and design requirements, and may produce motion trajectories more complex than those obtainable from pure velocity calculations.
As an alternative embodiment, the method further includes:
and S71, modifying the preset track file based on the physical visualization debugging tool and the collision state.
Illustratively, the physical visualization debugging tool may be the PhysX Visual Debugger, which can visually debug the collision process and thus facilitate the debugging work of program maintenance personnel. Taking the adjustment of the number of allowed collisions as an example, configuration personnel can make the collision process more accurate and realistic by adjusting the number of allowed collisions in the collision characteristics of the collision box, thereby improving the user experience.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., a ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk, an optical disk) and includes several instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the methods according to the embodiments of the present application.
According to another aspect of the embodiments of the present application, there is also provided a collision detection apparatus for implementing the above-described collision detection method. Fig. 3 is a block diagram of an alternative collision detection apparatus according to an embodiment of the present application, and as shown in fig. 3, the apparatus may include:
an obtaining unit 302, configured to obtain a target operation executed at a target client, where the target operation is used to trigger a target virtual character in a target game scene to release a target character skill;
a response unit 304, connected to the obtaining unit 302, configured to obtain, in response to the target operation, time shift information of target collision boxes of the target character skills, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine a correspondence between time and position of the collision box;
a determining unit 306, connected to the responding unit 304, for determining a collision status of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
It should be noted that the obtaining unit 302 in this embodiment may be configured to execute the step S202, the responding unit 304 in this embodiment may be configured to execute the step S204, and the determining unit 306 in this embodiment may be configured to execute the step S206.
Through the above modules, the server acquires the target operation executed at the target client, where the target operation is used to trigger the target virtual character in the target game scene to release the target character skill. Then, in response to the target operation, the server acquires the time shift information of the target collision boxes of the target character skill, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between the time and position of a collision box. Finally, the server determines the collision state of the target collision box and the collision object collision box based on the time shift information, where the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object. In this way, the server only acquires the time shift information of the target collision box from the target character skill and computes, based on that information, whether the target collision box and the collision object collision box collide, thereby determining whether the target operation specified by the target client hits the collision object. When a physics engine computes collisions, simulating the entire scene requires a large amount of computation, and applying the physics engine directly on a multi-user online server would create a performance bottleneck. The collision detection method of the present application instead derives the time shift information of the collision box from the key data of the motion trajectory, a simplified computation that greatly reduces the amount of collision detection work and improves the usability of collision detection on the server.
Moreover, because the server performs the collision detection judgment, the hidden danger of cheating tools is eliminated and the user's game experience is improved.
As an alternative embodiment, the time shift information may include a first preset track file and first real-time release information, and the response unit 304 may be configured to:
determining the first preset trajectory file associated with the target crash box;
and acquiring the first real-time release information based on the state of the target virtual character when releasing the target character skill, wherein the first real-time release information comprises a first real-time position, a first real-time direction, and a first release time of the target virtual character when releasing the target character skill.
As an alternative embodiment, the determining unit 306 may be configured to:
acquiring the fixed position information of the collision object collision box in the case that the collision object collision box is associated with a static collision object;
determining a collision status of the target collision box with the collision object collision box based on the first preset trajectory file, the first real-time release information, and the fixed position information.
As an alternative embodiment, the determining unit 306 may be configured to:
under the condition that the collision object collision box is associated with a dynamic collision object, acquiring a second preset track file and second real-time release information of the collision object collision box;
and determining the collision state of the target collision box and the collision object collision box based on the first preset track file, the first real-time release information, the second preset track file and the second real-time release information.
As an alternative embodiment, the determining unit 306 may be configured to:
and respectively calculating the position of the target collision box and the position of the collision object collision box by an interpolation method based on the first preset track file, the first real-time release information, the second preset track file, and the second real-time release information, and determining the collision state of the target collision box and the collision object collision box according to the positional relationship between them.
As an alternative embodiment, the determining unit 306 may be configured to:
acquiring collision characteristics of the collision box, wherein the collision characteristics comprise the types of collidable objects and/or the number of allowed collisions;
determining a collision status of the target collision box with a collision object collision box based on the time shift information and the collision characteristics.
As an optional embodiment, the preset trajectory file includes at least one of an appearance time, a disappearance time, a real-time trajectory position, zoom information, and rotation information of the crash box.
As an alternative embodiment, the apparatus further comprises:
the modification unit is used for modifying the preset track file; and/or modifying the association relationship between the target collision box and the preset track file.
As an alternative embodiment, the modification unit is further configured to:
modifying the preset trajectory file based on a physics visualization debugging tool and the collision state.
It should be noted here that the above modules correspond to the examples and application scenarios implemented by the corresponding method steps, but are not limited to the disclosure of the above embodiments. The modules, as a part of the apparatus, may run in a hardware environment as shown in fig. 1 and may be implemented by software or by hardware, where the hardware environment includes a network environment.
According to a third aspect of the embodiments of the present application, there is also provided an electronic device for implementing the above collision detection method, where the electronic device may be a server, a terminal, or a combination thereof.
Fig. 4 is a block diagram of an alternative electronic device according to an embodiment of the present application. As shown in fig. 4, the electronic device includes a processor 402, a communication interface 404, a memory 406, and a communication bus 408, where the processor 402, the communication interface 404, and the memory 406 communicate with each other via the communication bus 408, wherein:
a memory 406 for storing a computer program;
the processor 402, when executing the computer program stored in the memory 406, performs the following steps:
S1: acquiring a target operation executed at a target client, where the target operation is used to trigger a target virtual character in a target game scene to release a target character skill;
S2: in response to the target operation, acquiring time shift information of the target collision boxes of the target character skill, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between the time and the position of a collision box;
S3: determining the collision state of the target collision box and a collision object collision box based on the time shift information, where the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
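Steps S1 through S3 can be sketched at a high level as follows; every name here (the skill table layout, the operation dictionary, the evaluation callback) is hypothetical and stands in for the patent's units rather than reproducing them.

```python
# Hypothetical sketch of the S1-S3 flow: a target operation triggers a
# skill release, the skill's collision boxes and their time-shift
# information are looked up, and each box's collision state is evaluated
# against the collision object's boxes.
def detect_collisions(operation, skill_table, evaluate_collision):
    # S1: the acquired target operation identifies the skill being released
    skill = skill_table[operation["skill_id"]]
    results = {}
    # S2: each skill is associated with at least one target collision box,
    # each carrying time-shift (preset trajectory + real-time release) info
    for box in skill["collision_boxes"]:
        time_shift = (box["trajectory"], operation["release_info"])
        # S3: determine the collision state from the time-shift information
        results[box["name"]] = evaluate_collision(time_shift, operation["colliders"])
    return results
```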
Alternatively, in this embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include RAM, and may also include non-volatile memory, such as at least one disk memory. Alternatively, the memory may be at least one storage device located remotely from the processor.
As an example, the memory 406 may store, but is not limited to, the obtaining unit 302, the responding unit 304, and the determining unit 306 of the collision detection apparatus. Other module units of the collision detection apparatus may also be stored, and are not described in detail in this example.
The processor may be a general-purpose processor, which may include but is not limited to: a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In addition, the electronic device further includes: and the display is used for displaying the display interface of the target client.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments, and this embodiment is not described herein again.
It can be understood by those skilled in the art that the structure shown in fig. 4 is only illustrative, and the device implementing the collision detection method may be a terminal device such as a smartphone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, and the like. Fig. 4 does not limit the structure of the electronic device. For example, the electronic device may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in fig. 4, or have a different configuration from that shown in fig. 4.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
According to still another aspect of an embodiment of the present application, there is also provided a storage medium. Alternatively, in this embodiment, the storage medium may be used to execute a program code of any one of the collision detection methods described in the embodiments of the present application.
Optionally, in this embodiment, the storage medium may be located on at least one of a plurality of network devices in a network shown in the above embodiment.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
S1: acquiring a target operation executed at a target client, where the target operation is used to trigger a target virtual character in a target game scene to release a target character skill;
S2: in response to the target operation, acquiring time shift information of the target collision boxes of the target character skill, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine the correspondence between the time and the position of a collision box;
S3: determining the collision state of the target collision box and a collision object collision box based on the time shift information, where the collision object collision box includes a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
Optionally, the specific example in this embodiment may refer to the example described in the above embodiment, which is not described again in this embodiment.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing program codes, such as a U disk, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disk.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
If the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, and may also be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (11)

1. A collision detection method, characterized by comprising:
acquiring target operation executed at a target client, wherein the target operation is used for triggering a target virtual character in a target game scene to release a target character skill;
responding to the target operation, acquiring time shift information of target collision boxes of the target character skills, wherein each target character skill is associated with at least one target collision box, and the time shift information is used for determining the corresponding relation between the time and the position of the collision box; and
determining a collision status of the target collision box with a collision object collision box based on the time-shift information, wherein the collision object collision box comprises a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
2. The method of claim 1, wherein the time-shift information comprises a first preset track file and first real-time release information, and the obtaining the time-shift information of the target collision box of the target character skill in response to the target operation comprises:
determining the first preset trajectory file associated with the target crash box; and
acquiring the first real-time release information based on the state of the target virtual character during the release of the target character skill, wherein the first real-time release information comprises a first real-time position, a first real-time direction, and a first release time of the target virtual character during the release of the target character skill.
3. The method of claim 2, wherein said determining a collision status of said target collision box with a collision object collision box based on said time-shift information comprises:
acquiring fixed position information of the collision object collision box in the case that the collision object collision box is associated with a static collision object; and
determining a collision status of the target collision box and the collision object collision box based on the first preset trajectory file, the first real-time release information, and the fixed location information.
4. The method of claim 2, wherein said determining a collision status of said target collision box with a collision object collision box based on said time-shift information comprises:
acquiring a second preset trajectory file and second real-time release information of the collision object collision box in the case that the collision object collision box is associated with the dynamic collision object; and
determining the collision state of the target collision box and the collision object collision box based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information.
5. The method of claim 4, wherein said determining a collision status of the target crash box with the collision object crash box based on the first preset trajectory file, first real-time release information, second preset trajectory file, and second real-time release information comprises:
calculating the position of the target collision box and the position of the collision object collision box respectively by interpolation based on the first preset trajectory file, the first real-time release information, the second preset trajectory file, and the second real-time release information, and determining the collision state of the target collision box and the collision object collision box according to the positional relationship between the target collision box and the collision object collision box.
6. The method of any of claims 2 to 5, wherein the preset trajectory file comprises at least one of an appearance time, a disappearance time, a trajectory real-time location, zoom information, and rotation information of the crash box.
7. The method according to any one of claims 2 to 5, further comprising:
modifying the preset trajectory file; and/or modifying the association relationship between the target collision box and the preset trajectory file.
8. The method of claim 1, further comprising:
acquiring collision characteristics of the collision box, wherein the collision characteristics comprise types of collidable objects and/or collidable times;
the determining a collision status of the target collision box with a collision object collision box based on the time-shift information comprises:
determining a collision status of the target collision box with a collision object collision box based on the time shift information and the collision characteristics.
9. A collision detecting apparatus, characterized by comprising:
the system comprises an acquisition unit, a display unit and a control unit, wherein the acquisition unit is used for acquiring target operation executed at a target client, and the target operation is used for triggering a target virtual character in a target game scene to release target character skills;
a response unit, configured to obtain, in response to the target operation, time shift information of target collision boxes of the target character skills, where each target character skill is associated with at least one target collision box, and the time shift information is used to determine a correspondence between time and a position of the collision box; and
a determination unit configured to determine a collision status of the target collision box and a collision object collision box based on the time shift information, wherein the collision object collision box comprises a collision box associated with a collision object and/or a collision box associated with an object skill of the collision object.
10. An electronic device comprising a processor and a memory, wherein,
the memory for storing a computer program; and
the processor for performing the method steps of any one of claims 1 to 8 by running the computer program stored on the memory.
11. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to carry out the method steps of any one of claims 1 to 8 when executed.
CN202111585025.9A 2021-12-22 2021-12-22 Collision detection method and device, electronic equipment and storage medium Pending CN114225413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111585025.9A CN114225413A (en) 2021-12-22 2021-12-22 Collision detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111585025.9A CN114225413A (en) 2021-12-22 2021-12-22 Collision detection method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114225413A true CN114225413A (en) 2022-03-25

Family

ID=80761627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111585025.9A Pending CN114225413A (en) 2021-12-22 2021-12-22 Collision detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114225413A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116206026A (en) * 2023-05-06 2023-06-02 腾讯科技(深圳)有限公司 Track information processing method, track information processing device, computer equipment and readable storage medium
CN116206026B (en) * 2023-05-06 2023-07-18 腾讯科技(深圳)有限公司 Track information processing method, track information processing device, computer equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN107589829B (en) System and method for providing interactive game experience
US7126607B2 (en) Electronic game and method for effecting game features
CN108211358B (en) Information display method and device, storage medium and electronic device
CN110465087B (en) Virtual article control method, device, terminal and storage medium
CN110548288B (en) Virtual object hit prompting method and device, terminal and storage medium
CN110538455B (en) Method, device, terminal and storage medium for controlling movement of virtual object
CN107913521B (en) The display methods and device of virtual environment picture
CN107998654B (en) Acceleration adjusting method and device, storage medium and electronic device
CN112807681B (en) Game control method, game control device, electronic equipment and storage medium
CN111589145B (en) Virtual article display method, device, terminal and storage medium
JP2023543519A (en) Virtual item input method, device, terminal, and program
CN111467804A (en) Hit processing method and device in game
CN116672712A (en) Prop control method and device, electronic equipment and storage medium
CN112156459A (en) Method and apparatus for controlling battle game, storage medium, and electronic apparatus
TWI803147B (en) Virtual object control method, device, apparatus, storage medium, and program product thereof
JP2024511796A (en) Virtual gun shooting display method and device, computer equipment and computer program
CN114225413A (en) Collision detection method and device, electronic equipment and storage medium
CN114377396A (en) Game data processing method and device, electronic equipment and storage medium
US7148894B1 (en) Image generation system and program
CN111135566A (en) Control method and device of virtual prop, storage medium and electronic device
CN111921195B (en) Three-dimensional scene generation method and device, storage medium and electronic device
CN111228806B (en) Control method and device of virtual operation object, storage medium and electronic device
CN116920374A (en) Virtual object display method and device, storage medium and electronic equipment
JP2024508682A (en) Virtual object control method, virtual object control device, computer equipment, and computer program
CN114344917A (en) Operation data verification method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination