CN110917619B - Interactive property control method, device, terminal and storage medium - Google Patents

Info

Publication number
CN110917619B
CN110917619B (application CN201911129284.3A)
Authority
CN
China
Prior art keywords
interactive prop
interactive
prop
target object
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911129284.3A
Other languages
Chinese (zh)
Other versions
CN110917619A (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911129284.3A priority Critical patent/CN110917619B/en
Publication of CN110917619A publication Critical patent/CN110917619A/en
Application granted granted Critical
Publication of CN110917619B publication Critical patent/CN110917619B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Abstract

The application discloses an interactive prop control method, an interactive prop control device, a terminal, and a storage medium, belonging to the field of multimedia technology. The technical solution provided by the embodiments of the application provides a novel interactive prop: after the interactive prop is launched, it can move based on its initial flight trajectory, and after locking onto a target object, it can automatically change its flight trajectory to carry out a pursuit-style attack on the target object. This greatly improves the accuracy of the interactive prop and also improves the user experience.

Description

Interactive property control method, device, terminal and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling an interactive prop.
Background
With the development of multimedia technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are among the more popular games: the terminal can display a virtual scene in the interface and display a virtual object in the virtual scene, and the virtual object can control interactive props to fight against other virtual objects.
The basic interactive props provided by current shooting games are usually common virtual firearms, which cannot meet users' entertainment needs, so many games also provide launched interactive props. Compared with common interactive props, these props are more destructive and have a wider blast radius, and can enrich the modes of interaction.
However, when a game user uses a launched interactive prop in a game, the precision of the operations the user can perform is limited, so the accuracy of the interactive prop is poor and the user experience suffers.
Disclosure of Invention
The embodiments of the application provide an interactive prop control method, an interactive prop control device, a terminal, and a storage medium, which can solve the problems of poor launch accuracy and poor user experience for interactive props with explosive power.
The technical scheme is as follows:
in one aspect, an interactive prop control method is provided, and the method includes:
when the launching operation of the interactive prop is detected, the interactive prop is controlled to move along a first flight track in the virtual scene, and the first flight track is determined based on the launching operation;
in the moving process of the interactive prop, if any target object is detected to be included in the sensing range of the interactive prop, determining a second flight track between the position of the interactive prop and the target object;
and controlling the interactive prop to move along a second flight track in the virtual scene.
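The three claimed operations, moving along a first trajectory, detecting a target within the sensing range, and switching to a second trajectory toward that target, can be sketched as a per-frame update loop. This is only an illustrative sketch of the described behavior, not the patented implementation; the class name, the speed constant, and the sensing radius below are assumptions.

```python
import math

class InteractiveProp:
    """Illustrative sketch of the claimed control flow (names and values are assumed)."""

    def __init__(self, position, direction, speed=30.0, sensing_radius=5.0):
        self.position = list(position)
        self.direction = list(direction)    # unit vector of the first flight trajectory
        self.speed = speed
        self.sensing_radius = sensing_radius
        self.locked_target = None

    def update(self, dt, objects):
        # While unlocked, check whether any target object falls in the sensing range.
        if self.locked_target is None:
            for obj in objects:
                if math.dist(self.position, obj.position) <= self.sensing_radius:
                    self.locked_target = obj    # lock on: switch to the second trajectory
                    break
        # The second flight trajectory always points at the locked target's position.
        if self.locked_target is not None:
            self.direction = self._unit_toward(self.locked_target.position)
        self.position = [p + d * self.speed * dt
                         for p, d in zip(self.position, self.direction)]

    def _unit_toward(self, target_position):
        dist = math.dist(self.position, target_position) or 1e-9
        return [(t - p) / dist for p, t in zip(self.position, target_position)]
```

Recomputing the direction every frame after locking also covers the later refinement in which a new trajectory is regenerated when the target's position changes.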
In one possible embodiment, the apparatus is a node device in a blockchain system, and when the node device executes an interactive prop control method, interactive data generated in a control process of the interactive prop is uploaded to the blockchain system.
In one aspect, an interactive prop control apparatus is provided, the apparatus comprising:
the control module is used for controlling the interactive prop to move along a first flight track in the virtual scene when the launching operation of the interactive prop is detected, and the first flight track is determined based on the launching operation;
the trajectory determination module is used for determining, during the movement of the interactive prop, a second flight trajectory between the position of the interactive prop and a target object if any target object is detected within the sensing range of the interactive prop;
the control module is further configured to control the interactive prop to move along a second flight trajectory in the virtual scene.
In a possible implementation manner, the control module is further configured to, if it is detected that the position of the target object has changed, control the interactive prop to move in the virtual scene toward the changed position of the target object.
In a possible implementation manner, the control module is configured to regenerate a third flight trajectory according to the changed position of the target object and the position of the interactive prop;
and controlling the interactive prop to move along the third flight track in the virtual scene.
In a possible implementation manner, the control module is further configured to control the interactive prop to trigger a deformation effect if the interactive prop falls into a collision detection range of any prop in a process of moving along the second flight trajectory.
In a possible implementation manner, the control module is further configured to control the interactive prop to trigger a deformation effect if the interactive prop falls into a collision detection range of the target object in a process of moving along the second flight trajectory.
In a possible implementation manner, the control module is further configured to determine an influence range of the interactive prop according to the position of the interactive prop, and control a target within the influence range.
In a possible implementation manner, the control module is configured to determine a loss value for the interactive attribute value of a prop according to the distance between the prop located within the influence range and the position of the interactive prop;
and when the loss value of the prop meets a first preset condition, cancel the display of the prop.
In one possible implementation manner, the control module is configured to determine a loss value of the activity value of the virtual object according to a distance between the virtual object located within the influence range and the position of the interactive prop, where the distance is negatively correlated with the loss value;
and when the loss value of the virtual object meets a second preset condition, setting the virtual object to be in a death state.
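The negative correlation between distance and loss value can be illustrated with a simple linear falloff. The linear shape, the maximum loss, and the influence radius below are assumptions; the embodiment only requires that the loss decrease as the distance grows.

```python
def activity_loss(distance, max_loss=100.0, influence_radius=8.0):
    # Loss value is negatively correlated with distance (linear falloff assumed).
    if distance >= influence_radius:
        return 0.0
    return max_loss * (1.0 - distance / influence_radius)

def apply_deformation_effect(virtual_object, distance):
    virtual_object.activity -= activity_loss(distance)
    if virtual_object.activity <= 0:        # the "second preset condition"
        virtual_object.state = "dead"       # set the virtual object to a death state
```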
In one possible implementation, the apparatus further includes:
the accumulation module is used for accumulating the attack scores of the virtual objects controlled by the current user;
and the activation module is used for activating the launching option of the interactive prop when the attack score accumulated value reaches a target threshold value.
In one possible implementation manner, the activation module is used for adding the launching option of the interactive prop in an activated state in the skill selection area of the virtual object controlled by the current user.
In one possible implementation manner, the sensing range of the interactive prop is a spherical three-dimensional space with the real-time position of the interactive prop as a center.
In a possible implementation manner, the apparatus is applied to a node device in a blockchain system, and the node device executes an interactive prop control method to upload interactive data generated in a control process of the interactive prop to the blockchain system.
In one aspect, a terminal is provided and includes one or more processors and one or more memories, where at least one program code is stored in the one or more memories, and the at least one program code is loaded by the one or more processors and executed to implement the operations performed by the interactive prop control method according to any of the above possible implementations.
In one aspect, a storage medium is provided, in which at least one program code is stored, and the at least one program code is loaded and executed by a processor to implement the operations performed by the interactive prop control method according to any one of the above possible implementations.
The technical solution provided by the embodiments of the application provides a novel interactive prop: after the interactive prop is launched, it can move based on its initial flight trajectory, and after locking onto a target object, it can automatically change its flight trajectory to carry out a pursuit-style attack on the target object, which greatly improves the accuracy of the interactive prop and also improves the user experience.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an interactive prop control method according to an embodiment of the present application;
fig. 2 is a flowchart of an interactive prop control method according to an embodiment of the present disclosure;
FIG. 3 is an interface schematic diagram of a prop selection panel provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of an interactive prop provided in an embodiment of the present application in an inactivated state;
FIG. 5 is a schematic diagram of an interactive prop provided in an embodiment of the present application in an activated state;
fig. 6 is a schematic diagram of switching of interactive props provided in the present application;
FIG. 7 is a schematic view of a flight trajectory of an interactive prop provided in an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating a sensing range of an interactive prop provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a locked target object and an interactive prop shown in an aerial perspective provided by an embodiment of the present application;
FIG. 10 is a schematic illustration of the explosive effect provided by embodiments of the present application;
FIG. 11 is a schematic diagram illustrating the range of influence of an interactive prop provided in an embodiment of the present application;
FIG. 12 is a flowchart of an example of interactive prop control provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of an interactive prop control apparatus according to an embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Hereinafter, terms related to the present application are explained.
Virtual scene: is a virtual scene that is displayed (or provided) by an application program when the application program runs on a terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
Virtual object: refers to a movable object in a virtual scene. The movable object can be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil drums, walls, stones, etc. displayed in the virtual scene. The virtual object may be an avatar in the virtual scene that is virtual to represent the user. The virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene and occupying a portion of the space in the virtual scene.
Alternatively, the virtual object may be a Player Character controlled by an operation on the client, an Artificial Intelligence (AI) set in the virtual scene fight by training, or a Non-Player Character (NPC) set in the virtual scene interaction. Alternatively, the virtual object may be a virtual character playing a game in a virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control a virtual object to fall freely, glide, open a parachute to descend, run, jump, crawl, bend over, and move on land, or control a virtual object to swim, float, or dive in the ocean. The user may also control a virtual object to move in the virtual scene by riding a virtual vehicle, for example a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples, and the present application is not limited to them. The user may also control the virtual object to interact with other virtual objects through a virtual weapon, for example by fighting. The virtual weapon may be a thrown virtual weapon such as a grenade, a cluster grenade, or a sticky grenade, or a shooting virtual weapon such as a machine gun, a pistol, or a rifle; the type of virtual weapon is not specifically limited in the present application.
Hereinafter, a system architecture according to the present application will be described.
Fig. 1 is a schematic diagram of an implementation environment of an interactive prop control method provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is installed with and runs an application program supporting a virtual scene. The application program may be any one of a First-Person Shooter game (FPS), a third-person shooter game, a Multiplayer Online Battle Arena game (MOBA), a virtual reality application program, a three-dimensional map program, a military simulation program, or a multiplayer gunfight survival game. The first terminal 120 may be a terminal used by a first user, who uses the first terminal 120 to operate a first virtual object located in the virtual scene to perform activities including, but not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, or throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first terminal 120 and the second terminal 160 are connected to the server 140 through a wireless network or a wired network.
The server 140 may include at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for applications that support virtual scenarios. Alternatively, the server 140 may undertake primary computational tasks and the first and second terminals 120, 160 may undertake secondary computational tasks; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual scene. The application program can be any one of an FPS, a third person named shooting game, an MOBA, a virtual reality application program, a three-dimensional map program, a military simulation program or a multi-person gunfight survival game. The second terminal 160 may be a terminal used by a second user, who uses the second terminal 160 to operate a second virtual object located in the virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
Optionally, the first virtual object controlled by the first terminal 120 and the second virtual object controlled by the second terminal 160 are in the same virtual scene, and the first virtual object may interact with the second virtual object in the virtual scene. In some embodiments, the first virtual object and the second virtual object may be in a hostile relationship, for example, the first virtual object and the second virtual object may belong to different teams and organizations, and the hostile virtual objects may interact with each other in a mutual shooting manner on land.
In an exemplary scenario, the first terminal 120 controls the first virtual object to launch an interactive prop, and the interactive prop is used to trigger a deformation effect, for example an explosion effect. When the first terminal 120 controls the first virtual object to launch the interactive prop in any direction, the flight trajectory of the interactive prop may be determined based on the launching operation. For example, when the first terminal 120 controls the first virtual object to launch the interactive prop toward the second virtual object, once the second virtual object is detected within the sensing range of the interactive prop, the interactive prop may be controlled to automatically lock onto the second virtual object, change its current flight trajectory, and move along the changed flight trajectory toward the position of the second virtual object. If the second virtual object is detected within the influence range of the interactive prop, the deformation effect may be triggered, achieving the purpose of damaging the second virtual object. Through the launching operation and the way the interactive prop automatically locks onto a target, such as a virtual object, and changes its flight trajectory, an intelligent tracking effect is achieved.
The interactive prop is any deformable virtual prop: when the interactive prop reaches a deformation position, it can present a deformation effect, that is, the interactive prop can be converted from a first form to a second form. For example, the interactive prop may be a thrown or launched virtual weapon such as a grenade, a cluster grenade, a sticky grenade, or a missile.
In other embodiments, the first virtual object and the second virtual object may be in a teammate relationship, for example, the first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals; this embodiment is only illustrated with the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, and a desktop computer. For example, the first terminal 120 and the second terminal 160 may be smartphones or other handheld portable gaming devices. The following embodiments are illustrated with the terminal being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
In an exemplary scenario, the implementation environment may be built on a blockchain system. A blockchain is a novel application of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks linked by cryptographic methods, where each data block contains information about a batch of network transactions, used to verify the validity (anti-counterfeiting) of the information and to generate the next block.
In some embodiments, the first terminal 120 and the second terminal 160 may be node devices in a blockchain system, so that whenever any node device performs a control operation on an interactive prop through the application program and generates interactive data, the interactive data may be uploaded to the blockchain system, thereby achieving persistent storage in the blockchain system. The interactive data may include the release time and deformation position of each interactive prop, the modification time of the interactive attribute value of each virtual object, the values before and after modification, and the like. This interactive data reflects the battle records of the virtual objects during the interaction, and because the blockchain system is tamper-resistant, storing the interactive data in it offers higher security.
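The per-battle interactive data described above (release time, deformation position, attribute value changes) can be pictured as hash-linked records. The sketch below shows only the hash linking that gives tamper-resistance, not a real consensus node, and all field names are illustrative assumptions.

```python
import hashlib
import json
import time

def make_block(interactive_data, prev_hash):
    """Append one record of interactive data to a hash-linked chain (illustrative only)."""
    block = {
        "timestamp": time.time(),
        "data": interactive_data,   # e.g. release time, deformation position, value changes
        "prev_hash": prev_hash,
    }
    # Any later modification of the block changes its hash and breaks the link to the
    # next block, which is the tamper-resistance property relied on above.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block
```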
Fig. 2 is a flowchart of an interactive prop control method according to an embodiment of the present application. Referring to fig. 2, the embodiment is exemplified by applying the method to a terminal, which may be the first terminal 120 shown in fig. 1, and includes the following steps:
201. The terminal adds the interactive prop to the first virtual object based on a user operation.
The first virtual object is the virtual object controlled by the terminal user. In the above process, the user may equip the interactive prop before the game starts; the interactive prop may be one exchanged by the user for virtual currency, one issued as a reward based on the user's level, or one selected from the interactive props provided at the current game stage.
For example, the terminal may display a prop selection panel in the virtual scene. The prop selection panel may include multiple types of interactive props; when the user wants to use a certain type of interactive prop, the user may add that interactive prop to the first virtual object through a selection operation on it. After the addition is completed, the selected interactive prop may be displayed in an added state in the prop selection panel shown in fig. 3 (for example, with a mark in the upper right corner of its graphic frame); the prop "bomb drone" circled in the figure is one of the interactive props involved in the embodiments of the present application. Of course, when a prop is added through the prop selection panel, description information of the interactive prop may be displayed based on a user operation. The description information may include a usage description, an influence range description, and the like of the interactive prop, as shown in the lower right corner of fig. 3. In the prop selection panel, each interactive prop may also be displayed in a graphic frame, and besides showing in the graphic frame whether the prop has been added, a control indicating whether the prop has been added may be displayed at the corresponding position of the description information, for example "equipped" in the lower right corner of fig. 3. This is not limited in the embodiments of the present application.
It should be noted that the interactive prop may be an interactive prop that can be used only under a certain specific scenario, for example, the continuous scoring skill shown in fig. 3, that is, the interactive prop can be used only when the attack score of the first virtual object is accumulated to a certain value, and this application embodiment only takes this kind of interactive prop as an example for description.
202. The terminal accumulates attack scores for the first virtual object.
The terminal can accumulate attack scores for the first virtual object based on antagonistic interaction in the virtual scene between the first virtual object and second virtual objects, that is, other virtual objects. Whenever any antagonistic behavior of the first virtual object hits any second virtual object, the attack score of the first virtual object can be accumulated based on that hit. In one possible implementation, an attack score corresponding to the hit location may be determined from the location hit on the second virtual object. Alternatively, the attack score may be determined according to the second virtual object's response to the hit: for example, a hit that reduces the activity value of the second virtual object to 0 and sets it to a dead state may correspond to a different attack score than a hit after which the activity value of the second virtual object remains greater than 0, so as to improve the tension of the attack and the user's enthusiasm.
203. When the accumulated attack score reaches a target threshold, the terminal displays the launch option of the interactive prop in an activated state in the skill selection area of the virtual object controlled by the current user.
It should be noted that steps 201 to 203 above are described by taking as an example determining whether to activate the launch option of the interactive prop based on the accumulated attack score; that is, for the terminal, the interactive prop is provided conditionally, so as to increase the difficulty and interest for the user. Activation may mean that the interactive prop becomes available: when the interactive prop is in an inactivated state, the launch option may be displayed in gray, as in the circled area in fig. 4, and to prompt the user visually, the launch option in the activated state may be highlighted, as in the circled area in fig. 5. To further improve the prompting effect, when the interactive prop switches from the inactivated state to the activated state, an activation prompt message may be displayed in the virtual scene, such as "continuous score reward activated" shown in fig. 5.
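The conditional activation in steps 202 and 203 amounts to a small score accumulator with a threshold. A minimal sketch follows; the class name, the default threshold, and the flag name are assumptions.

```python
class LaunchOptionGate:
    """Accumulates attack scores and activates the launch option at a target threshold."""

    def __init__(self, target_threshold=10):
        self.accumulated_score = 0
        self.target_threshold = target_threshold
        self.launch_option_active = False

    def add_attack_score(self, score):
        self.accumulated_score += score
        if self.accumulated_score >= self.target_threshold:
            self.launch_option_active = True   # highlight the option instead of graying it out
```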
The terminal can determine the interactive prop to be used in an attack through the user's prop switching operation. When the interactive prop is switched to, the display size of its launch option can be increased in the skill selection area to highlight its selected state, as shown in fig. 6. In the circled area in fig. 6, the multiple activated interactive props of the first virtual object can be displayed; when the user's prop switching operation is detected, the interactive prop currently in use can be switched to the next interactive prop in clockwise order, achieving convenient prop switching.
204. When the launching operation of the interactive prop is detected, the terminal controls the interactive prop to move along a first flight track in the virtual scene, and the first flight track is determined based on the launching operation.
The launching operation of the interactive prop can be a triggering operation of a launching option of the interactive prop. For example, the trigger operation may be a drag operation on the drop option, based on which a first flight trajectory of the interactive prop may be determined. The dragging direction of the dragging operation can be used as the flight direction of the interactive prop to determine the first flight track of the interactive prop.
The first flight trajectory may be a linear trajectory. That is, when a launching operation on the interactive prop is detected, the terminal may take the dragging direction of the dragging operation as the flight direction and the starting point of the dragging operation as the flight starting point, and control the interactive prop to move along a linear trajectory in the flight direction, for example, the linear trajectory 700 in fig. 7. Because the interactive prop is a launched prop, its speed is relatively high, so its flight trajectory can be determined without adopting a parabolic form, which simulates the launching effect in a real environment.
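A straight-line first flight trajectory can be parameterized by the drag start point (flight origin), the normalized drag direction, and the prop's flight speed; a minimal sketch, with hypothetical names and (x, y, z) tuples for vectors:

```python
import math

def position_on_linear_trajectory(start, drag_direction, speed, t):
    """Position of the prop t seconds after launch on a straight-line
    (non-parabolic) trajectory; the drag direction is normalized so that
    `speed` is the actual flight speed."""
    mag = math.sqrt(sum(c * c for c in drag_direction))
    unit = tuple(c / mag for c in drag_direction)
    return tuple(s + u * speed * t for s, u in zip(start, unit))
```

Sampling this function per frame yields the linear trajectory 700 shown in fig. 7.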
205. In the moving process of the interactive prop, if the terminal detects that any target object is included in the induction range of the interactive prop, a second flight track between the interactive prop and the target object is determined.
For the interactive prop, there may be a certain sensing range, which may be a three-dimensional spherical range with a preset sensing distance as the radius and the real-time position of the interactive prop as the center, as shown in fig. 8. The sensing range occupies a certain space in the three-dimensional virtual environment, so omnidirectional sensing centered on the interactive prop can be realized.
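The spherical sensing range reduces to a squared-distance test against the preset sensing radius, centered on the prop's real-time position; a sketch:

```python
def in_sensing_range(prop_pos, obj_pos, sensing_radius):
    """True if obj_pos lies inside the 3D spherical sensing range centered
    on the prop's real-time position; compares squared distances to avoid
    a square root per candidate object."""
    dist_sq = sum((p - o) ** 2 for p, o in zip(prop_pos, obj_pos))
    return dist_sq <= sensing_radius ** 2
```

Because the test is a full 3D distance, objects above or below the prop are sensed just like objects on the ground, which is what enables the aerial attacks discussed later.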
In the moving process of the interactive prop, the terminal may determine the real-time position of the interactive prop, so as to determine the sensing range based on that position. The target object may be any type of object; for example, it may be a preset type of interactive prop, such as a retrievable prop, or a virtual object, such as a player on the opposing side. It should be noted that the specific type of the target object may be a default type, that is, an object type defaulted by the system, or may be a to-be-attacked type preset by the user; that is, the user may set a preferred to-be-attacked type in advance, so as to ensure that the attack effect best meets the user's expectation.
In a possible implementation manner, when the terminal detects that the sensing range of the interactive prop includes multiple target objects, the target object of the attack may be determined according to the types of the target objects. For example, when the multiple target objects include both a virtual object and some type of interactive prop, the virtual object may be taken as the target object of the attack according to attack priority. Certainly, in a situation of team confrontation, when detecting a virtual object, the terminal may detect whether the group to which the virtual object belongs is the same as the group of the first virtual object, and if not, take the virtual object as the target object of the attack, so as to avoid accidental injury to teammates.
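The priority rule above, including the team check, can be sketched as follows; the dict-based candidate representation with `kind` and `group` keys is a hypothetical encoding chosen for illustration, not part of the embodiment.

```python
def pick_attack_target(candidates, my_group):
    """Choose the attack target from the objects inside the sensing range.
    Enemy virtual objects outrank props; virtual objects in the same
    group as the attacker are skipped to avoid friendly fire."""
    enemies = [c for c in candidates
               if c["kind"] == "virtual_object" and c.get("group") != my_group]
    if enemies:
        return enemies[0]  # virtual objects have attack priority
    props = [c for c in candidates if c["kind"] == "prop"]
    return props[0] if props else None
```

If only same-group virtual objects are sensed, no target is selected and the prop continues along its current trajectory.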
In one possible implementation, the possible actions of the attacked object can be predicted to realize a predictive attack. That is, determining the second flight trajectory between the interactive prop and the target object may include: if any target object is detected within the sensing range of the interactive prop, acquiring the real-time position of the target object; predicting a target position of the target object based on its real-time position; and determining a second flight trajectory between the real-time position of the interactive prop and the target position of the target object. Through this prediction, the attack success rate can be improved: even if the target object moves, an accurate attack can be achieved based on the prediction of its movement. For example, predicting the target position of the target object based on its real-time position may include: taking a preset duration as the target flight duration, and predicting the position of the target object after the target flight duration according to its real-time position, travel track, and travel speed. The preset duration can be the ratio between the preset sensing radius and the flying speed of the interactive prop. In another possible implementation manner, the terminal may model the flight trajectory of the interactive prop to obtain a dynamical model of the interactive prop, and input the start point coordinates and the end point coordinates into the dynamical model to output the second flight trajectory. The dynamical model may take into account gravitational acceleration, air resistance, impact force, and the like; the embodiment of the present application does not specifically limit the influence factors introduced into the dynamical model.
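The prediction described above, with the target flight duration taken as the ratio of the sensing radius to the prop's flying speed and the target extrapolated along its travel track, can be sketched as:

```python
def predict_target_position(target_pos, travel_direction, travel_speed,
                            sensing_radius, prop_speed):
    """Extrapolate the target object's position after the target flight
    duration. The preset duration is the ratio of the sensing radius to
    the prop's flying speed, as in the embodiment; `travel_direction` is
    assumed to be a unit vector, a simplification for illustration."""
    target_flight_duration = sensing_radius / prop_speed
    return tuple(p + d * travel_speed * target_flight_duration
                 for p, d in zip(target_pos, travel_direction))
```

The second flight trajectory then runs from the prop's real-time position to this predicted position rather than to where the target currently stands.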
206. And the terminal controls the interactive prop to move along a second flight track in the virtual scene.
This step is the same as the movement process described in the above steps, and is not described herein again.
FIG. 9 is a schematic diagram of a locked target object and an interactive prop shown from an aerial perspective. For the target object, even if it observes the interactive prop flying in the air, it can still be locked by tracking its moving state. Correspondingly, the method further comprises: if a change in the position of the target object is detected, controlling the interactive prop to move in the virtual scene toward the position of the target object after the change.
207. In the process of moving along the second flight track, if the interactive prop falls into the collision detection range of the target object, the terminal controls the interactive prop to trigger a deformation effect.
If the interactive prop does not fall into the collision detection range of any obstacle during flight, it can be controlled to keep flying until it hits the target object, at which point the interactive prop is controlled to deform and produce an explosion effect, as shown in fig. 10.
In a virtual scene, the arrangement of articles is complex. When the second flight trajectory is determined, in order to simulate a real tracking effect, the interactive prop can continue to fly along the second flight trajectory after the target object is locked. In the process of moving along the second flight trajectory, if the interactive prop falls into the collision detection range of any obstacle, the terminal controls the interactive prop to trigger the deformation effect; that is, if any obstacle exists or suddenly appears on the flight trajectory, the interactive prop may be detonated prematurely. For example, when a house lies on the second flight trajectory, the house serves as an obstacle to the current attack and provides a certain protection for the target object; when the terminal detects that the real-time position of the interactive prop falls into the collision detection range of any obstacle, it can control the interactive prop to deform and produce an explosion effect, as shown in fig. 10. The collision detection range of the obstacle can improve the accuracy of the attack, so that detonation occurs without the interactive prop having to overlap the edge of the obstacle, improving the user's game experience.
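Putting the tracking flight and the collision test together, each simulation tick might advance the prop toward the target and detonate it as soon as it enters any collision detection range; a sketch under hypothetical names, with the target's collision radius of 1.0 an assumed placeholder:

```python
import math

def fly_one_tick(prop_pos, target_pos, speed, dt, obstacles):
    """Advance the prop one tick toward target_pos. Returns the new
    position and 'detonated' if the prop entered the collision detection
    range of the target or of any obstacle, else 'flying'.
    `obstacles` is a list of (center, collision_radius) pairs."""
    direction = tuple(t - p for t, p in zip(target_pos, prop_pos))
    dist = math.sqrt(sum(c * c for c in direction))
    step = min(speed * dt, dist)
    new_pos = (tuple(p + c / dist * step for p, c in zip(prop_pos, direction))
               if dist else prop_pos)
    # The target itself has a collision detection range (1.0 is assumed).
    for center, radius in obstacles + [(target_pos, 1.0)]:
        gap = math.sqrt(sum((n - c) ** 2 for n, c in zip(new_pos, center)))
        if gap <= radius:
            return new_pos, "detonated"
    return new_pos, "flying"
```

An obstacle placed between the prop and the target intercepts the flight before the target is reached, which is the protective effect described above.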
The mode of detonation when encountering the obstacle can provide an avoiding mode for the target object, truly simulates an attack scene and an avoiding scene which may appear in the attack process, and greatly improves the reality sense of the game.
208. And the terminal determines the influence range of the interactive prop according to the real-time position of the interactive prop.
For the interactive prop, after the deformation effect is triggered, because the interactive prop belongs to explosive virtual weapons, its attack capability can be determined based on the range that the explosion can affect, which is referred to as the influence range in the embodiment of the present application. That is, the attack objects of the interactive prop include not only the target object but also other objects within the influence range. As shown in fig. 11, if the influence range of the interactive prop includes other virtual objects besides the target object, those virtual objects are also attacked by the interactive prop. If props with a loss attribute are included within the influence range, those props also receive an attack from the interactive prop.
209. The terminal determines the loss value of the interactive attribute value of each prop located within the influence range according to the distance between the prop and the real-time position of the interactive prop, and cancels the display of the prop when the loss value of the prop meets a first preset condition.
210. The terminal determines the loss value of the activity value of each virtual object located within the influence range according to the distance between the virtual object and the real-time position of the interactive prop, and sets the virtual object to a dead state when the loss value of the virtual object meets a second preset condition, the distance being negatively correlated with the loss value.
The method provided by the embodiment of the application provides a novel interactive prop: after the interactive prop is launched, it first moves along an initial flight trajectory, and once a target object is locked, the flight trajectory is automatically changed to achieve a tracking attack on the target object, which greatly improves the accuracy of the interactive prop as well as the user experience. Furthermore, because the sensing range is a three-dimensional spherical range, both ground and aerial targets can be attacked, which ensures the accuracy of the interactive prop, expands the attack range, and greatly improves the game experience. The above process can be illustrated by the implementation flow of fig. 12. Referring to fig. 12, before the game starts, the user may equip the interactive prop, which at this time may be in an inactivated state. When the attack scores of the virtual object accumulate to a certain degree, the interactive prop is activated and its launching option is highlighted. When the user clicks to use the interactive prop, the interactive prop is switched to; once the user clicks to launch, the interactive prop is controlled to start flying along a certain flight trajectory. Once a target object is sensed within the sensing range of the interactive prop, the target is locked, the flight trajectory is changed, and the prop flies toward the target. If the target object is touched during flight, an explosion effect is displayed, and if the activity value consumed by the explosion injury exceeds the activity value of the target object, the target object can be set as dead or removed, completing the attack.
In a possible implementation manner, the interactive prop control method may be applied to a node device of a blockchain system, that is, the terminal may be any node device in the blockchain system. After the node device executes the interactive prop control method, the interactive data generated in the control process of the interactive prop can be uploaded to the blockchain system, so that persistent storage of the interactive data is realized on the blockchain system.
The interactive data may include at least one of a release time and a release position of each interactive prop, a throwing time and a deformation position of each interactive prop, a launch time of each second target prop, or a change time of an interactive attribute value of each virtual object and a value before and after the change, and of course, the interactive data may further include at least one of a change time of an interactive attribute value of each interactive prop or a value before and after the change when the interactive prop has the interactive attribute value. The interactive data can reflect the fighting records of each virtual object in the interactive process, and the interactive data is stored in the block chain system due to the non-tamper property of the block chain system, so that the interactive data is stored with higher safety.
Optionally, the process of uploading the interactive data may include the following steps: the node device (i.e., the terminal) generates a block according to the interactive data, broadcasts the block in the blockchain system, and after receiving the block sent by the node device, other node devices (i.e., any device except the terminal) on the blockchain system perform consensus on the block, and when the block passes through the consensus of the blockchain system, the block is added to the blockchain, which is not described in detail herein.
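The block generation step can be illustrated with a minimal hash-chained block over the interactive data; the field names and the use of SHA-256 are illustrative assumptions, and the broadcast and consensus steps are omitted:

```python
import hashlib
import json

def make_block(prev_hash: str, interactive_data: dict, timestamp: int) -> dict:
    """Build a block over the interactive data. The block hash covers the
    previous block's hash, so recorded combat data cannot be altered
    without invalidating every later block in the chain."""
    payload = json.dumps(interactive_data, sort_keys=True)
    header = {
        "prev_hash": prev_hash,
        "timestamp": timestamp,
        "data_hash": hashlib.sha256(payload.encode()).hexdigest(),
    }
    block_hash = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {"header": header, "hash": block_hash, "data": interactive_data}
```

Each node device would generate such a block from its interactive data, broadcast it, and append it to the chain once the other nodes reach consensus on it.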
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
Fig. 13 is a schematic structural diagram of an interactive prop control device provided in an embodiment of the present application, and referring to fig. 13, the device includes:
the control module 1301 is configured to, when a launching operation on an interactive prop is detected, control the interactive prop to move along a first flight trajectory in the virtual scene, where the first flight trajectory is determined based on the launching operation;
a trajectory determination module 1302, configured to determine, during the moving process of the interactive prop, a second flight trajectory between the position of the interactive prop and a target object if it is detected that any target object is included in the sensing range of the interactive prop;
the control module 1301 is further configured to control the interactive prop to move along a second flight trajectory in the virtual scene.
In a possible implementation manner, the control module is further configured to control the interactive prop to move towards the target object after the position of the interactive prop is changed in the virtual scene if it is detected that the position of the target object is changed.
In a possible implementation manner, the control module is configured to regenerate a third flight trajectory according to the changed position of the target object and the position of the interactive prop;
and controlling the interactive prop to move along the third flight track in the virtual scene.
In a possible implementation manner, the control module is further configured to control the interactive prop to trigger a deformation effect if the interactive prop falls into a collision detection range of any prop in a process of moving along the second flight trajectory.
In a possible implementation manner, the control module is further configured to control the interactive prop to trigger a deformation effect if the interactive prop falls into a collision detection range of the target object in a process of moving along the second flight trajectory.
In a possible implementation manner, the control module is further configured to determine an influence range of the interactive prop according to the position of the interactive prop, and control a target within the influence range.
In a possible implementation manner, the control module is configured to determine a loss value of an interactive attribute value of the prop according to a distance between the prop located in the influence range and the position of the interactive prop;
and when the loss value of the prop meets a first preset condition, canceling the display of the prop.
In one possible implementation manner, the control module is configured to determine a loss value of the activity value of the virtual object according to a distance between the virtual object located within the influence range and the position of the interactive prop, where the distance is negatively correlated with the loss value;
and when the loss value of the virtual object meets a second preset condition, setting the virtual object to be in a death state.
In one possible implementation, the apparatus further includes:
the accumulation module is used for accumulating the attack scores of the virtual objects controlled by the current user;
and the activation module is used for activating the launching option of the interactive prop when the attack score accumulated value reaches a target threshold value.
In one possible implementation manner, the activation module is used for adding the launching option of the interactive prop in an activated state in the skill selection area of the virtual object controlled by the current user.
In one possible implementation manner, the sensing range of the interactive prop is a spherical three-dimensional space with the real-time position of the interactive prop as a center.
In a possible implementation manner, the apparatus is applied to a node device in a blockchain system, and the node device executes an interactive prop control method to upload interactive data generated in a control process of the interactive prop to the blockchain system.
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
It should be noted that: the interactive prop control device provided in the above embodiment is exemplified by only the division of the above functional modules when controlling the interactive prop, and in practical applications, the function allocation may be completed by different functional modules as needed, that is, the internal structure of the terminal is divided into different functional modules to complete all or part of the above described functions. In addition, the interactive prop control device and the interactive prop control method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the interactive prop control method embodiment, and are not described herein again.
Fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 1400 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1400 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1400 includes: a processor 1401, and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1402 is used to store at least one instruction for execution by processor 1401 to implement the interactive prop control methods provided by various embodiments herein.
In some embodiments, terminal 1400 may further optionally include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a touch display 1405, a camera assembly 1406, audio circuitry 1407, a positioning assembly 1408, and a power supply 1409.
The peripheral device interface 1403 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1404 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1404 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to capture touch signals at or above the surface of the display screen 1405. The touch signal may be input to the processor 1401 for processing as a control signal. At this point, the display 1405 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1405 may be one, providing the front panel of the terminal 1400; in other embodiments, display 1405 may be at least two, respectively disposed on different surfaces of terminal 1400 or in a folded design; in still other embodiments, display 1405 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1400. Even further, the display 1405 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1405 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1406 is used to capture images or video. Optionally, camera assembly 1406 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing or inputting the electric signals to the radio frequency circuit 1404 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1400. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is then used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic position of the terminal 1400 for navigation or LBS (Location Based Service). The positioning component 1408 may be based on the United States' GPS (Global Positioning System), the Chinese BeiDou system, the Russian GLONASS system, or the European Union's Galileo system.
Power supply 1409 is used to power the various components of terminal 1400. The power source 1409 may be alternating current, direct current, disposable or rechargeable. When the power source 1409 comprises a rechargeable battery, the rechargeable battery can support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1401 can control the touch display 1405 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1412 may detect a body direction and a rotation angle of the terminal 1400, and the gyro sensor 1412 and the acceleration sensor 1411 may cooperate to collect a 3D motion of the user on the terminal 1400. The processor 1401 can realize the following functions according to the data collected by the gyro sensor 1412: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 1413 may be disposed on the side bezel of terminal 1400 and/or underlying touch display 1405. When the pressure sensor 1413 is disposed on the side frame of the terminal 1400, the user's holding signal of the terminal 1400 can be detected, and the processor 1401 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed at the lower layer of the touch display 1405, the processor 1401 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1405. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1414 is used for collecting a fingerprint of a user, and the processor 1401 identifies the user according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 1401 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for, and changing settings, etc. Fingerprint sensor 1414 may be disposed on the front, back, or side of terminal 1400. When a physical button or vendor Logo is provided on terminal 1400, fingerprint sensor 1414 may be integrated with the physical button or vendor Logo.
The optical sensor 1415 is used to collect ambient light intensity. In one embodiment, processor 1401 can control the display brightness of touch display 1405 based on the ambient light intensity collected by optical sensor 1415. Specifically, when the ambient light intensity is high, the display luminance of the touch display 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display 1405 is turned down. In another embodiment, the processor 1401 can also dynamically adjust the shooting parameters of the camera assembly 1406 according to the intensity of the ambient light collected by the optical sensor 1415.
The proximity sensor 1416, also known as a distance sensor, is typically disposed on the front panel of the terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front surface of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front face of the terminal 1400 gradually decreases, the processor 1401 controls the touch display 1405 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1416 detects that the distance gradually increases, the processor 1401 controls the touch display 1405 to switch from the dark-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 14 does not limit the terminal 1400, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
In an exemplary embodiment, a computer-readable storage medium, such as a memory including at least one program code, is also provided; the program code is executable by a processor in a terminal to perform the interactive prop control method of the above embodiments. For example, the computer-readable storage medium may be a ROM (Read-Only Memory), a RAM (Random-Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The above description is only an exemplary embodiment of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (15)

1. An interactive prop control method, comprising:
when a launching operation on an interactive prop is detected, controlling the interactive prop to move along a first flight trajectory in a virtual scene, the first flight trajectory being determined based on the launching operation;
during the movement of the interactive prop, when a plurality of target objects are detected within a sensing range of the interactive prop, determining the target object to be attacked this time by the interactive prop according to the types of the target objects, the type of a target object comprising at least one of a default type and a preset to-be-attacked type;
determining a second flight trajectory between the interactive prop and the target object of the attack, the second flight trajectory being output by a dynamic model, the dynamic model being constructed according to flight trajectories of the interactive prop; and
controlling the interactive prop to move along the second flight trajectory in the virtual scene.
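Purely for illustration, the target-selection step recited in claim 1 may be sketched as follows; the class, field names, and the priority rule (preset to-be-attacked type before default type) are assumptions for illustration, not claim limitations.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    kind: str        # "default" or "preset" (preset to-be-attacked type)
    position: tuple  # (x, y, z) in the virtual scene

def pick_target(targets: list) -> Target:
    """Pick this attack's target from all targets within the sensing range.

    Assumed rule: a target of the preset to-be-attacked type takes
    priority over default-type targets; otherwise the first detected
    target is chosen.
    """
    preset = [t for t in targets if t.kind == "preset"]
    return (preset or targets)[0]
```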
2. The method according to claim 1, wherein determining the second flight trajectory between the interactive prop and the target object of the attack comprises:
acquiring a real-time position of the target object of the attack;
predicting a target position of the target object of the attack based on its real-time position; and
determining the second flight trajectory between the real-time position of the interactive prop and the target position of the target object of the attack.
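The prediction step in claim 2 can be sketched with a minimal linear extrapolation; the claim does not fix a prediction scheme, so the velocity-from-two-samples approach below is an illustrative assumption.

```python
def predict_position(p_prev: tuple, p_now: tuple, dt_ahead: float = 1.0) -> tuple:
    """Linearly extrapolate the attacked target's future position.

    Velocity is estimated from the last two sampled positions (assumed
    one time unit apart) and projected dt_ahead time units forward.
    """
    return tuple(now + (now - prev) * dt_ahead
                 for prev, now in zip(p_prev, p_now))
```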
3. The method according to claim 1, further comprising:
if a change in the position of the target object of the attack is detected, controlling the interactive prop in the virtual scene to move toward the changed position of the target object of the attack.
4. The method according to claim 3, wherein controlling the interactive prop in the virtual scene to move toward the changed position of the target object of the attack comprises:
regenerating a third flight trajectory according to the changed position of the target object of the attack and the real-time position of the interactive prop; and
controlling the interactive prop to move along the third flight trajectory in the virtual scene.
5. The method according to claim 1, wherein after controlling the interactive prop to move along the second flight trajectory in the virtual scene, the method further comprises:
during movement along the second flight trajectory, if the interactive prop falls within the collision detection range of any prop, controlling the interactive prop to trigger a deformation effect; or
during movement along the second flight trajectory, if the interactive prop falls within the collision detection range of the target object, controlling the interactive prop to trigger a deformation effect.
6. The method according to claim 5, wherein after controlling the interactive prop to trigger the deformation effect, the method further comprises:
determining an influence range of the interactive prop according to the real-time position of the interactive prop, and controlling target objects within the influence range.
7. The method according to claim 6, wherein controlling target objects within the influence range comprises:
determining a loss value of an interaction attribute value of a prop located within the influence range according to the distance between the prop and the real-time position of the interactive prop; and
canceling display of the prop when its loss value meets a first preset condition.
8. The method according to claim 6, wherein controlling target objects within the influence range comprises:
determining a loss value of an activity value of a virtual object located within the influence range according to the distance between the virtual object and the real-time position of the interactive prop, the distance being negatively correlated with the loss value; and
setting the virtual object to a death state when its loss value meets a second preset condition.
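Claim 8 requires only a negative correlation between distance and loss value; a linear falloff is one way to satisfy it. The linear form, the maximum loss, and the influence radius below are illustrative assumptions.

```python
def activity_loss(distance: float,
                  max_loss: float = 100.0,
                  radius: float = 10.0) -> float:
    """Loss of a virtual object's activity value vs. distance to the prop.

    The loss decreases linearly from max_loss at the prop's real-time
    position to zero at the edge of the influence range, giving the
    negative correlation required by claim 8.
    """
    if distance >= radius:
        return 0.0  # outside the influence range: no loss
    return max_loss * (1.0 - distance / radius)
```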
9. The method according to claim 1, wherein before controlling the interactive prop to move along the first flight trajectory in the virtual scene when the launching operation on the interactive prop is detected, the method further comprises:
accumulating an attack score of a virtual object controlled by a current user; and
activating a launch option of the interactive prop when the accumulated attack score reaches a target threshold.
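The score-gated activation of claim 9 can be sketched as follows; the threshold value and the class interface are illustrative assumptions.

```python
class LaunchGate:
    """Accumulate the current user's attack score and activate the
    interactive prop's launch option once a target threshold is reached."""

    def __init__(self, threshold: int = 100):
        self.threshold = threshold
        self.score = 0
        self.launch_option_active = False

    def add_score(self, points: int) -> bool:
        """Add points to the accumulated score; return whether the
        launch option is now active."""
        self.score += points
        if self.score >= self.threshold:
            self.launch_option_active = True
        return self.launch_option_active
```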
10. The method according to claim 9, wherein activating the launch option of the interactive prop comprises:
adding the launch option of the interactive prop, in an activated state, to a skill selection area of the virtual object controlled by the current user.
11. The method according to any one of claims 1 to 10, wherein the sensing range of the interactive prop is a spherical three-dimensional space centered on the real-time position of the interactive prop.
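Purely for illustration, the spherical sensing range of claim 11 reduces to a Euclidean distance test against the prop's real-time position; the radius value is left open by the claim and is a parameter here.

```python
import math

def in_sensing_range(prop_pos: tuple, obj_pos: tuple, radius: float) -> bool:
    """Return True if obj_pos lies within the spherical sensing range
    centered on the interactive prop's real-time position prop_pos."""
    return math.dist(prop_pos, obj_pos) <= radius
```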
12. The method according to claim 1, wherein the method is applied to a node device in a blockchain system, and when the node device executes the interactive prop control method, interaction data generated during control of the interactive prop is uploaded to the blockchain system.
13. An interactive prop control apparatus, comprising:
a control module, configured to control an interactive prop to move along a first flight trajectory in a virtual scene when a launching operation on the interactive prop is detected, the first flight trajectory being determined based on the launching operation; and
a trajectory determination module, configured to: during the movement of the interactive prop, when a plurality of target objects are detected within a sensing range of the interactive prop, determine the target object to be attacked this time by the interactive prop according to the types of the target objects, the type of a target object comprising at least one of a default type and a preset to-be-attacked type; and determine a second flight trajectory between the interactive prop and the target object of the attack, the second flight trajectory being output by a dynamic model, the dynamic model being constructed according to flight trajectories of the interactive prop;
wherein the control module is further configured to control the interactive prop to move along the second flight trajectory in the virtual scene.
14. A terminal, comprising one or more processors and one or more memories storing at least one program code, the at least one program code being loaded and executed by the one or more processors to implement the operations performed by the interactive prop control method according to any one of claims 1 to 12.
15. A storage medium storing at least one program code, the at least one program code being loaded and executed by a processor to perform the operations performed by the interactive prop control method according to any one of claims 1 to 12.
CN201911129284.3A 2019-11-18 2019-11-18 Interactive property control method, device, terminal and storage medium Active CN110917619B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911129284.3A CN110917619B (en) 2019-11-18 2019-11-18 Interactive property control method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911129284.3A CN110917619B (en) 2019-11-18 2019-11-18 Interactive property control method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110917619A CN110917619A (en) 2020-03-27
CN110917619B true CN110917619B (en) 2020-12-25

Family

ID=69853270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911129284.3A Active CN110917619B (en) 2019-11-18 2019-11-18 Interactive property control method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110917619B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111773658B (en) * 2020-07-03 2024-02-23 珠海金山数字网络科技有限公司 Game interaction method and device based on computer vision library
CN112402969B (en) * 2020-11-19 2022-08-09 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and storage medium in virtual scene
CN112619134B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Method, device, equipment and storage medium for determining flight distance of transmission target
CN112619164B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Method, device, equipment and storage medium for determining flying height of transmission target
CN112619163B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Flight path control method and device, electronic equipment and storage medium
CN113101648B (en) * 2021-04-14 2023-10-24 北京字跳网络技术有限公司 Interaction method, device and storage medium based on map
CN113509729B (en) * 2021-05-20 2023-10-03 腾讯科技(深圳)有限公司 Virtual prop control method and device, computer equipment and storage medium
CN113457151B (en) * 2021-07-16 2024-02-27 腾讯科技(深圳)有限公司 Virtual prop control method, device, equipment and computer readable storage medium
CN113680061B (en) * 2021-09-03 2023-07-25 腾讯科技(深圳)有限公司 Virtual prop control method, device, terminal and storage medium
CN113713382B (en) * 2021-09-10 2023-06-16 腾讯科技(深圳)有限公司 Virtual prop control method and device, computer equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109200582A (en) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 The method, apparatus and storage medium that control virtual objects are interacted with ammunition
CN109242886A (en) * 2018-09-06 2019-01-18 中国人民解放军63921部队 A kind of modeling of space cluster target trajectory and forecasting procedure
CN109939438A (en) * 2019-02-19 2019-06-28 腾讯数码(天津)有限公司 Track display method and device, storage medium and electronic device
CN110368691A (en) * 2019-07-19 2019-10-25 腾讯科技(深圳)有限公司 More people fight prompting message sending method, device and terminal in program online
CN110417772A (en) * 2019-07-25 2019-11-05 浙江大华技术股份有限公司 The analysis method and device of attack, storage medium, electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207722354U (en) * 2014-01-28 2018-08-14 马卡里 A kind of on-line off-line real-time interactive games system
CN106382937A (en) * 2015-08-25 2017-02-08 深圳视景文化科技有限公司 Navigation method and navigation terminal
CN108066981A (en) * 2016-11-12 2018-05-25 金德奎 A kind of AR or MR method for gaming identified based on position and image and system
CN207818006U (en) * 2017-08-03 2018-09-04 北京北方新视野数码科技有限公司 A kind of general VR tactical trainings platform of new equipment
CN109991961A (en) * 2017-12-29 2019-07-09 技嘉科技股份有限公司 Mobile device and its control method, tele-control system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40022647
Country of ref document: HK

GR01 Patent grant