CN112057857B - Interactive property processing method, device, terminal and storage medium - Google Patents


Info

Publication number
CN112057857B
CN112057857B (application CN202010951124.3A)
Authority
CN
China
Prior art keywords
target
virtual
prop
injury
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010951124.3A
Other languages
Chinese (zh)
Other versions
CN112057857A (en)
Inventor
周岷科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010951124.3A
Publication of CN112057857A
Application granted
Publication of CN112057857B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game
    • A63F 2300/8076: Shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interactive prop processing method, an interactive prop processing device, a terminal and a storage medium, and belongs to the technical field of multimedia. The method comprises the following steps: in response to a trigger operation on a target interactive prop, controlling a prop component of the target interactive prop to move along a target moving track in a virtual scene; in response to the prop component hitting any target object, displaying a first hit effect at a target position corresponding to the target object, wherein the target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object; and continuously displaying a second hit effect at the target position, wherein the second hit effect is used for indicating that a second injury is caused to the target object, and the first injury and the second injury belong to different injury types. By prompting the injuries caused by the first hit effect and the second hit effect, the technical scheme enables the user to control a virtual object to effectively confront other virtual objects and virtual props, improving human-computer interaction efficiency.

Description

Interactive property processing method, device, terminal and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for processing an interactive prop.
Background
With the development of multimedia technology and the diversification of terminal functions, more and more games can be played on terminals. Shooting games are among the most popular. After the terminal starts a game program, a virtual scene is displayed in the program's interface, and a controlled virtual object, operated by the current terminal user, is shown in that scene; the controlled virtual object can engage in antagonistic interaction with other virtual objects through interactive props.
At present, the interactive props provided by shooting games include virtual firearms. Virtual objects can shoot each other using these firearms, and a certain virtual life value is deducted from a virtual object that is shot, thereby realizing antagonistic interaction between virtual objects.
However, in addition to virtual objects, a virtual scene also contains virtual props that can interact with virtual objects, such as reconnaissance aircraft, helicopters, and vehicles. There is no effective way to handle the antagonistic interaction between a virtual object and these virtual props, so the user cannot control the virtual object to effectively resist them, which reduces human-computer interaction efficiency.
Disclosure of Invention
The embodiment of the application provides an interactive prop processing method, an interactive prop processing device, a terminal and a storage medium, so that a user can control a virtual object to effectively confront other virtual objects and virtual props, improving human-computer interaction efficiency. The technical scheme is as follows:
in one aspect, an interactive prop processing method is provided, where the method includes:
in response to a trigger operation on a target interactive prop, controlling a prop component of the target interactive prop to move along a target moving track in a virtual scene;
in response to the prop component hitting any target object, displaying a first hit effect at a target position corresponding to the target object, wherein the target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object;
continuously displaying a second hit effect at the target position, wherein the second hit effect is used for indicating that a second injury is caused to the target object, and the first injury and the second injury belong to different injury types.
In another aspect, an interactive prop processing apparatus is provided, the apparatus including:
the prop component control module is used for, in response to a trigger operation on the target interactive prop, controlling a prop component of the target interactive prop to move along a target moving track in a virtual scene;
the first display module is used for, in response to the prop component hitting any target object, displaying a first hit effect at a target position corresponding to the target object, wherein the target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object;
and the second display module is used for continuously displaying a second hit effect at the target position, wherein the second hit effect is used for indicating that a second injury is caused to the target object, and the first injury and the second injury belong to different injury types.
In an alternative implementation, the first display module includes:
the first determining unit is used for determining, according to the target moving track, the target object hit by the prop component and the target position corresponding to the target object;
and the first display unit is used for, in response to the prop component moving to the target position, displaying the first hit effect at the target position.
In an optional implementation manner, the target object is a virtual object existing on the target movement track;
the first determining unit is used for acquiring a plurality of collision boxes of the virtual object, and one collision box corresponds to one body part of the virtual object; and determining a target collision box from the plurality of collision boxes according to the target moving track, and taking the body part indicated by the target collision box as a target position corresponding to the virtual object, wherein the target collision box exists on the target moving track.
In an alternative implementation manner, the target object is a virtual prop, and the first hit effect is an explosion effect.
In an alternative implementation, the target object is a virtual object, and the first hit effect is a flame effect.
In an optional implementation, the apparatus further includes:
a first determination module, configured to determine first injury information, the first injury information indicating an injury value of the first injury to the target object;
a first information sending module, configured to send the first injury information to a server, so that the server determines, based on the first injury information, the currently remaining virtual life value of the target object;
and an information receiving module, configured to receive the currently remaining virtual life value of the target object.
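The terminal-to-server exchange performed by these three modules can be sketched as follows. This is a hypothetical Python sketch of the data flow only (function names and the life-value table are made up; the patent does not specify a protocol):

```python
# The terminal sends first injury information; the "server" applies it to
# its life-value state and returns the currently remaining virtual life
# value, which the information receiving module gets back.
def server_apply_injury(life_values, target_id, injury_value):
    remaining = max(0, life_values[target_id] - injury_value)
    life_values[target_id] = remaining
    return remaining  # value returned to the terminal

life_values = {"enemy_1": 100}          # server-side state (illustrative)
remaining = server_apply_injury(life_values, "enemy_1", 35)
# remaining == 65
```

Keeping the authoritative life value on the server, with the terminal merely reporting the injury value, matches the division of computing work between server 102 and terminal 101 described later in the implementation environment.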
In an optional implementation manner, the second display module is further configured to skip the step of continuously displaying the second hit effect at the target position in response to the currently remaining virtual life value of the target object being zero.
In an optional implementation, the apparatus further includes:
a second determination module, configured to determine second injury information according to the object type of the target object, where the second injury information indicates a per-unit-time injury value and a duration of the second injury to the target object;
and the second information sending module is used for sending the second damage information to a server, and the server forwards the second damage information to a terminal corresponding to the target object.
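A minimal sketch of how second injury information could be determined from the object type, as the second determination module describes. The table, keys, and numbers below are hypothetical, chosen only to illustrate a per-unit-time injury value plus a duration:

```python
# Hypothetical table keyed by object type: per-second injury value and
# how long the second (sustained) injury lasts.
SECOND_INJURY_TABLE = {
    "virtual_object": {"injury_per_second": 5,  "duration_s": 3},
    "virtual_prop":   {"injury_per_second": 20, "duration_s": 2},
}

def second_injury_info(object_type):
    """Determine second injury information from the target's object type."""
    return SECOND_INJURY_TABLE[object_type]

info = second_injury_info("virtual_prop")
total = info["injury_per_second"] * info["duration_s"]  # 40 over 2 seconds
```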
In an optional implementation, the apparatus further includes:
the first state setting module is used for setting the target object to be a first target state, and the first target state is used for increasing the damage degree of the target object.
In an optional implementation manner, the apparatus further includes:
and the second state setting module is used for setting the target object to be in a second target state, the second target state is used for enabling the protection prop of the target object to be invalid, and the protection prop is used for preventing the target object from being damaged.
In an optional implementation, the apparatus further includes:
the third display module is used for displaying a prop component configuration interface, and the prop component configuration interface is used for displaying at least one prop component of the target interaction prop;
and the prop component configuration module is used for responding to the configuration operation of any prop component and configuring the prop component for the target interactive prop.
In an optional implementation manner, the third display module is further configured to display, in response to a selection operation of any prop component, prop component introduction information on the prop component configuration interface, where the prop component introduction information is used to indicate an influence on the target interactive prop after the prop component is configured.
In another aspect, a terminal is provided, where the terminal includes a processor and a memory, and the memory is used to store at least one program code, where the at least one program code is loaded and executed by the processor to implement the operations performed in the interactive prop processing method in this embodiment.
In another aspect, a computer-readable storage medium is provided, and at least one program code is stored in the computer-readable storage medium, and is loaded and executed by a processor to implement the operations performed in the interactive prop processing method in the embodiment of the present application.
In another aspect, a computer program product or a computer program is provided, the computer program product or the computer program comprising computer program code, the computer program code being stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer-readable storage medium, and the processor executes the computer program code, so that the terminal performs the interactive item processing method provided in the above aspects or various alternative implementations of the aspects.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the embodiment of the application provides an interactive prop processing method, after a user triggers a target interactive prop, when a prop component of the target interactive prop hits a virtual object or a virtual prop, the virtual object or the virtual prop is subjected to first damage and second damage which belong to different damage types, and the damage caused by the first hitting effect and the second hitting effect is prompted, so that the user can control the virtual object to effectively confront the virtual object and the virtual prop, and the human-computer interaction efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an interactive prop processing method according to an embodiment of the present application;
Fig. 2 is a flowchart of an interactive prop processing method according to an embodiment of the present application;
Fig. 3 is a flowchart of another interactive prop processing method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a firearm configuration interface according to an embodiment of the present application;
Fig. 5 is a schematic diagram of an ammunition configuration interface according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a first hit effect according to an embodiment of the present application;
Fig. 7 is a schematic diagram of another first hit effect according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a ballistic trajectory according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a collision box of a virtual object according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a collision box of another virtual object according to an embodiment of the present application;
Fig. 11 is a schematic diagram of hit effects according to an embodiment of the present application;
Fig. 12 is a schematic diagram of the logic of a multi-type injury round according to an embodiment of the present application;
Fig. 13 is a flowchart of another interactive prop processing method according to an embodiment of the present application;
Fig. 14 is a block diagram of an interactive prop processing apparatus according to an embodiment of the present application;
Fig. 15 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Hereinafter, terms related to the present application are explained.
Virtual scene: the scene displayed (or provided) by an application program when it runs on a terminal. The virtual scene can be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be two-dimensional, 2.5-dimensional, or three-dimensional, and its dimensionality is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land, and ocean, and the land may include environmental elements such as deserts and cities; the terminal user can control virtual objects to move within the virtual scene. Optionally, the virtual scene may also be used for a battle between at least two virtual objects, in which virtual resources are available for their use.
Virtual object: a movable object in a virtual scene. The movable object can be a virtual character, a virtual animal, an animation character, or the like, such as a character, animal, plant, oil drum, wall, or stone displayed in the virtual scene. The virtual object may be an avatar in the virtual scene that represents the user. A virtual scene may include a plurality of virtual objects, each with its own shape and volume, occupying a portion of the space in the scene. Optionally, when the virtual scene is three-dimensional, the virtual object can be a three-dimensional model, for example a character constructed with three-dimensional human skeleton technology, and the same virtual object can present different appearances by wearing different skins. In some embodiments, the virtual object may also be implemented with a 2.5-dimensional or two-dimensional model, which is not limited in this application.
Alternatively, the virtual object may be a player character controlled through operations on the client, an Artificial Intelligence (AI) character set in a virtual-scene battle through training, or a Non-Player Character (NPC) set in virtual-scene interaction. Alternatively, the virtual object may be a virtual character competing adversarially in the virtual scene. Optionally, the number of virtual objects participating in the interaction may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control a virtual object to free-fall, glide, open a parachute, run, jump, climb, bend over, or otherwise move on land, or to swim, float, or dive in the sea. The user may also control the virtual object to ride a virtual vehicle, such as a virtual car, a virtual aircraft, or a virtual yacht, to move in the virtual scene; these scenes are merely examples, and the present application is not limited thereto. The user may further control the virtual object to engage in antagonistic interaction with other virtual objects through a virtual weapon. The virtual weapon may be a thrown weapon such as a grenade, a cluster grenade, a sticky grenade, or a flying axe, or a shooting weapon such as a machine gun, a pistol, or a rifle; the type of virtual weapon is not specifically limited in the present application.
Instant injury: a type of injury in a game caused at the moment of being hit by a virtual weapon, virtual bullet, virtual grenade, or skill, as distinguished from sustained and delayed injury.
Sustained injury: a type of injury in a game that persists for a period of time after being hit by a virtual weapon, virtual bullet, virtual grenade, or skill, as distinguished from instant and delayed injury. A sustained injury is generally expressed as causing X injury per second, where X is a positive number.
Delayed injury: a type of injury in a game in which, after being hit by a virtual weapon, virtual bullet, virtual grenade, or skill, a period of time passes before the injury is caused.
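The three injury types defined above can be captured in a small Python sketch (illustrative only; the enum names and the helper are hypothetical, not terms from the patent):

```python
from enum import Enum

class InjuryType(Enum):
    INSTANT = "instant"      # applied at the moment of the hit
    SUSTAINED = "sustained"  # X injury per second over a duration
    DELAYED = "delayed"      # applied once, after a waiting period

def sustained_total(x_per_second, duration_s):
    """Total injury of a sustained effect dealing X injury per second."""
    return x_per_second * duration_s

# e.g. a sustained injury of 4 per second lasting 5 seconds deals 20 total
```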
An implementation environment of the interactive prop processing method provided in the embodiment of the present application is described below. Fig. 1 is a schematic diagram of an implementation environment of an interactive prop processing method according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102.
The terminal 101 and the server 102 can be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
Optionally, the terminal 101 is a smartphone, tablet computer, notebook computer, desktop computer, smart speaker, smart watch, or the like, but is not limited thereto. The terminal 101 has installed and runs an application program supporting virtual scenes. The application program may be any one of a First-Person Shooting game (FPS), a third-person shooting game, a Multiplayer Online Battle Arena game (MOBA), a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game. Illustratively, the terminal 101 is used by a user to operate a virtual object located in the virtual scene to carry out activities, including but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the virtual object is a virtual character, such as a simulated character or an animated character.
Optionally, the server 102 is an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN (Content Delivery Network), big data, and artificial intelligence platforms. The server 102 provides background services for the application programs supporting virtual scenes. Optionally, the server 102 undertakes the primary computing work and the terminal 101 the secondary computing work; or the server 102 undertakes the secondary computing work and the terminal 101 the primary; or the server 102 and the terminal 101 compute cooperatively using a distributed computing architecture.
Optionally, the virtual object controlled by the terminal 101 (hereinafter, the controlled virtual object) and virtual objects controlled by other terminals (hereinafter, other virtual objects) are in the same virtual scene, in which the controlled virtual object can interact with the other virtual objects. In some embodiments, the controlled virtual object and the other virtual objects are in adversarial relationships; for example, they may belong to different teams or organizations, and adversarial virtual objects can interact antagonistically on land by shooting at each other.
Those skilled in the art will appreciate that the number of terminals may be greater or fewer. For example, there may be only one terminal, or dozens, hundreds, or more. The number of terminals and the device types are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of an interactive prop processing method according to an embodiment of the present application, and as shown in fig. 2, the application to a terminal is taken as an example in the embodiment of the present application for description. The interactive prop processing method comprises the following steps:
201. In response to a trigger operation on the target interactive prop, the terminal controls a prop component of the target interactive prop to move along a target moving track in the virtual scene.
In this embodiment of the application, the target interactive prop is a virtual firearm, the prop component is virtual ammunition fired by that firearm, and the target moving track is the ballistic trajectory of the virtual firearm; the virtual ammunition can cause multiple types of injury to a hit target object. When the user triggers the target interactive prop, the terminal controls the virtual ammunition to move along the ballistic trajectory in the virtual scene. Different virtual firearms have different ballistic trajectories; a trajectory may be a ray, a parabola, or the like.
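The two trajectory shapes mentioned, a ray and a parabola, can be sampled as follows. This is a hypothetical 2-D Python sketch using basic projectile motion, not an implementation from the patent:

```python
# A ray suits a flat-shooting firearm; a parabola suits one whose
# rounds drop under gravity.
def ray_point(origin, direction, t):
    """Point on a ray at parameter t (2-D, for brevity)."""
    return (origin[0] + direction[0] * t, origin[1] + direction[1] * t)

def parabola_point(origin, velocity, t, gravity=9.8):
    """Projectile motion: constant horizontal speed, gravity pulls y down."""
    x = origin[0] + velocity[0] * t
    y = origin[1] + velocity[1] * t - 0.5 * gravity * t * t
    return (x, y)

p = parabola_point((0.0, 1.5), (50.0, 0.0), 0.5)
# after 0.5 s the round is 25 m downrange and has dropped about 1.2 m
```

Sampling such a function at successive values of t yields the sequence of track points against which hit detection (e.g. the collision-box lookup) is performed.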
202. In response to the prop component hitting any target object, the terminal displays a first hit effect at a target position corresponding to the target object. The target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object.
In this embodiment of the application, the injury type of the first injury is instant injury; that is, the virtual ammunition causes instant injury to a virtual object or virtual prop at the moment it hits it, and accordingly the terminal displays the first hit effect to indicate the instant injury. Optionally, the virtual object is one having an adversarial relationship with the controlled virtual object of the current terminal. Optionally, the virtual prop is a vehicle such as a helicopter, a car, or a ship; or tactical equipment such as a shield turret, an unmanned reconnaissance plane, or a landmine; or a summoned object of a controlled virtual object, such as a mechanical soldier or a mechanical dog. The virtual prop is not limited in the embodiments of the application.
203. The terminal continuously displays a second hit effect at the target position. The second hit effect is used for indicating that a second injury is caused to the target object, and the first injury and the second injury belong to different injury types.
In the embodiment of the present application, the injury type of the second injury is sustained injury; that is, in addition to the instant injury, the virtual ammunition causes sustained injury to the target object, and accordingly the terminal continuously displays the second hit effect to indicate the sustained injury. Optionally, the terminal can further display a third hit effect indicating a delayed injury to the target object. The embodiments of the present application do not limit this.
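A per-frame tick loop is one way to realize "continuously displaying" the second hit effect; it also shows why the display naturally stops when the remaining virtual life value reaches zero, as the optional implementation above describes. The following is an illustrative Python sketch (state keys and values are hypothetical):

```python
# Keep the second hit effect (and its injury) active until its duration
# elapses or the target's remaining virtual life value reaches zero.
def tick_second_effect(state, dt):
    """state: {'life', 'injury_per_s', 'remaining_s'}; True while active."""
    if state["remaining_s"] <= 0 or state["life"] <= 0:
        return False  # stop displaying the second hit effect
    state["life"] = max(0, state["life"] - state["injury_per_s"] * dt)
    state["remaining_s"] -= dt
    return True

state = {"life": 10, "injury_per_s": 5, "remaining_s": 3.0}
ticks = 0
while tick_second_effect(state, 1.0):
    ticks += 1
# the effect ends after 2 ticks: life hits zero before the 3 s elapse
```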
The embodiment of the application provides an interactive prop processing method. After a user triggers a target interactive prop and a prop component of the target interactive prop hits a virtual object or a virtual prop, the hit object suffers a first injury and a second injury belonging to different injury types, and the injuries are prompted through the first hit effect and the second hit effect, so that the user can control a virtual object to effectively confront other virtual objects and virtual props, improving human-computer interaction efficiency.
Fig. 3 is a flowchart of another interactive prop processing method according to an embodiment of the present application, and as shown in fig. 3, the application to a terminal is taken as an example in the embodiment of the present application for description. The interactive prop processing method comprises the following steps:
301. and the terminal configures the prop component for the target interactive prop according to the operation of the user on a prop component configuration interface, wherein the prop component configuration interface is used for displaying at least one prop component of the target interactive prop.
In the embodiment of the present application, taking a shooting game as an example, if the target interactive prop is a virtual firearm, the prop component is a virtual ammunition. The user can configure the virtual firearm with accessories such as muzzle, grip, barrel, sight, ammunition, grip, and butt through the firearm configuration interface. When a user configures virtual ammunition for the virtual firearm, the terminal can display an ammunition configuration interface, the ammunition configuration interface is the prop component configuration interface, and in response to the configuration operation of the virtual ammunition, the terminal can configure the virtual ammunition for the target interactive prop, namely the virtual firearm. Wherein the configuration operation comprises a selection operation and a determination operation.
In an alternative implementation, in order to enable a user to know the influence of different ammunition on the virtual firearm, when the terminal displays the ammunition configuration interface, in response to a selection operation of any ammunition, ammunition introduction information which is used for indicating the influence on the virtual firearm after the ammunition is configured can be displayed on the ammunition configuration interface. By displaying the ammunition introduction information, the influence on the virtual firearm, such as the capacity of a cartridge clip, the ammunition changing speed, the aiming speed, the injury range and the like, caused when different virtual ammunitions are configured for the virtual firearm can be displayed for a user.
For example, referring to fig. 4, fig. 4 is a schematic diagram of a firearm configuration interface provided according to an embodiment of the present application. As shown in fig. 4, a virtual firearm that can be configured with six accessories, namely a muzzle, a grip, a barrel, a sight, ammunition, and a stock, is exemplarily shown. Each accessory corresponds to one trigger button, and the user can open the corresponding configuration interface by triggering any trigger button. In response to the user triggering the button corresponding to the ammunition accessory, the terminal displays the ammunition configuration interface. Referring to fig. 5, fig. 5 is a schematic view of an ammunition configuration interface provided in accordance with an embodiment of the present application. As shown in fig. 5, four ammunition accessories configurable for the virtual firearm are exemplarily shown: 10 rounds, 12 rounds, drag power rounds, and multi-type injury rounds. When the user selects the multi-type injury rounds, the terminal displays, in the ammunition configuration interface, the influence of configuring the multi-type injury rounds on the virtual firearm, where a positive influence is marked with "+" and a negative influence is marked with "-". The positive influences include: increased cartridge clip capacity, increased injury range, and added continuous injury. The negative influences include: reduced instant injury, reduced aiming speed, and reduced ammunition changing speed.
The firearm configuration interface and the ammunition configuration interface may be displayed during the game or before the game is started, which is not limited in the embodiments of the present application.
302. In response to a triggering operation on the target interactive prop, the terminal controls a prop component of the target interactive prop to move along a target moving track in a virtual scene.
In this embodiment, the virtual firearm is used to fire virtual ammunition. After the user configures multi-type injury rounds for the target interactive prop in the ammunition configuration interface, the multi-type injury rounds serve as the virtual ammunition, and the virtual ammunition can cause multiple types of injuries to a hit target object. Because different virtual firearms have different ballistic trajectories, the target moving track is the ballistic trajectory of the virtual firearm in use. When the user triggers the virtual firearm, the terminal controls the virtual ammunition to move along the ballistic trajectory in the virtual scene until the virtual ammunition hits an obstacle in the virtual scene, such as a wall, a wooden box, a virtual object, or a virtual prop. The trajectory may be a ray, a parabola, or the like, which is not limited in the embodiments of the present application.
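As an illustration only (not part of the claimed method; the function name and the simple physics model are assumptions), the position of the prop component along a ray-shaped or parabolic trajectory could be computed as:

```python
def trajectory_position(origin, direction, t, gravity=0.0):
    """Position of the prop component at time t along its moving track.

    With gravity=0.0 the track is a straight ray; with gravity > 0 it is a
    parabola. origin and direction are (x, y, z) tuples; y is "up".
    """
    x = origin[0] + direction[0] * t
    y = origin[1] + direction[1] * t - 0.5 * gravity * t * t
    z = origin[2] + direction[2] * t
    return (x, y, z)
```

Setting gravity to zero degenerates to the ray case mentioned above, while a positive value yields the parabolic case; the terminal would sample such positions each frame until an obstacle is reached.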
303. In response to the prop component hitting any target object, the terminal displays a first hit effect at a target position corresponding to the target object, the target object is any one of a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object.
In the embodiment of the present application, the injury type to which the first injury belongs is instant injury, that is, when the virtual ammunition hits a virtual object or a virtual prop, the hit virtual object or virtual prop is instantaneously injured, and the instant injury reduces the virtual life value of the target object. Accordingly, the terminal can display the first hit effect to indicate the instant injury, enabling the user to intuitively determine whether the target object is hit. Optionally, the terminal can also display a firing action, where the firing action indicates an action after the virtual firearm fires the virtual ammunition, such as a muzzle shifting upward or a bolt-pulling and ammunition-changing action, which is not limited in the embodiments of the present application. It should be noted that the firing action is related to the influence of the virtual ammunition on the virtual firearm.
For example, taking the target object as a virtual object, the current remaining virtual life value of the virtual object is 90 points. The controlled virtual object fires virtual ammunition through the virtual firearm, and the virtual ammunition can cause 20 points of instant injury. At the moment the virtual ammunition hits the virtual object, the virtual object loses 20 points of its virtual life value, leaving a current remaining virtual life value of 70 points.
Optionally, the virtual object is a virtual object having an enemy relationship with the controlled virtual object of the current terminal, and the virtual object can be controlled by other users and also can be controlled by the AI. Optionally, the virtual prop is a carrier such as a helicopter, a vehicle, a ship and the like; or the virtual prop is tactical equipment such as a shield gun tower, an unmanned reconnaissance plane, a landmine and the like; or the virtual prop is a summoning object of a controlled virtual object, such as a mechanical soldier, a mechanical dog and the like; or the virtual prop is a destructible building, such as a wall, a door, a floor, etc. The target object is not limited in the embodiments of the present application.
In an alternative implementation manner, the target object is a virtual prop, and the first hit effect is an explosion effect.
For example, referring to fig. 6, fig. 6 is a schematic diagram of a first hit effect provided according to an embodiment of the present application. As shown in fig. 6, the target interactive prop is a virtual firearm, the virtual prop is a metal plate, and a bullet fired by the virtual firearm is exploded after hitting the metal plate.
In an alternative implementation, the target object is a virtual object, and the first hit effect is a flame effect.
For example, referring to fig. 7, fig. 7 is a schematic view of another first hit effect provided according to an embodiment of the present application. As shown in fig. 7, the target interactive prop is a virtual firearm, the virtual object is a virtual soldier, and after a bullet fired by the virtual firearm hits the shoulder of the virtual soldier, the terminal displays a flame effect at the shoulder of the virtual soldier. After the virtual firearm is fired, its muzzle shifts to the upper right, indicating that the virtual firearm is affected by the configured virtual ammunition.
In an alternative implementation, the terminal can determine whether the target object is hit, and the corresponding target position, based on the target moving track. Correspondingly, in response to the prop component hitting any target object, the step of displaying the first hit effect at the target position corresponding to the target object is as follows: the terminal determines the target object hit by the prop component and the target position corresponding to the target object according to the target moving track; then, in response to the prop component moving to the target position, the terminal displays the first hit effect at the target position. Optionally, the target moving track is the ballistic trajectory of the virtual firearm. The hit target object and the target position can be accurately determined through the target moving track, so that the first hit effect can be displayed at the target position and the user can intuitively determine that the target object is hit.
For example, referring to fig. 8, fig. 8 is a schematic diagram of a ballistic trajectory provided by an embodiment of the present application, and as shown in fig. 8, the ballistic trajectory is a ray starting from a muzzle of a virtual firearm. And the terminal performs ray detection based on the ray, acquires information of the detected first object, and determines the first object as a target object. Optionally, the trajectory can also be a parabola or a line segment, which is not described herein again.
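The ray detection described above — casting a ray from the muzzle and taking the first object detected — can be sketched with the standard slab method for ray/axis-aligned-box intersection (an illustrative sketch; function and object names are assumptions, not the actual implementation):

```python
import math

def ray_aabb(origin, direction, box_min, box_max):
    """Distance along the ray to an axis-aligned collision box, or None."""
    t_near, t_far = -math.inf, math.inf
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:           # ray parallel to this slab
            if o < lo or o > hi:
                return None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    if t_near > t_far or t_far < 0:  # missed, or box behind the muzzle
        return None
    return max(t_near, 0.0)

def first_hit(origin, direction, boxes):
    """Name of the nearest object whose collision box the ray intersects."""
    best = None
    for name, (box_min, box_max) in boxes:
        t = ray_aabb(origin, direction, box_min, box_max)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None
```

The nearest intersection corresponds to "the detected first object" on the ray, which is then taken as the target object.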
It should be noted that the virtual object and the virtual item both correspond to a collision box, and the terminal can determine a target position corresponding to the target object according to the target movement trajectory and the collision box.
In an optional implementation manner, the target object is a virtual item existing on the target movement track, the virtual item corresponds to a collision box covering the virtual item, and the terminal can determine a position where the target movement track and the collision box intersect for the first time as a target position corresponding to the virtual item.
In an alternative implementation manner, the target object is a virtual object existing on the target movement trajectory, the virtual object corresponds to a collision box as a whole, and each body part of the virtual object also corresponds to a collision box. Correspondingly, the step that the terminal determines the target position of the hit target object according to the target moving track comprises the following steps: the terminal acquires a plurality of collision boxes of the virtual object, and one collision box corresponds to one body part of the virtual object. Then, the terminal determines a target collision box from the plurality of collision boxes according to the target moving track, and takes the body part indicated by the target collision box as a target position corresponding to the virtual object, namely the target position where the virtual object is hit, wherein the target collision box exists on the target moving track.
For example, referring to fig. 9, fig. 9 is a schematic diagram of a crash box of a virtual object provided according to an embodiment of the present application. As shown in fig. 9, the virtual object is a virtual soldier, which is covered with a cylindrical crash box. Each body part of the virtual soldier corresponds to a collision box. Referring to fig. 10, fig. 10 is a schematic diagram of a crash box of another virtual object provided according to an embodiment of the present application. As shown in FIG. 10, two arms of the virtual soldier are exemplarily shown to be respectively corresponding to a cubic crash box.
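Mapping a hit position to the body part indicated by the containing collision box, as in figs. 9 and 10, could be sketched as follows (illustrative only; box coordinates and part names are assumptions):

```python
def hit_body_part(position, part_boxes):
    """Return the body part whose collision box contains the hit position.

    part_boxes maps a part name to an axis-aligned box given as
    (min_corner, max_corner) tuples of (x, y, z) coordinates.
    """
    for part, (lo, hi) in part_boxes.items():
        if all(l <= p <= h for p, l, h in zip(position, lo, hi)):
            return part
    return None
```

The part returned here is the body part taken as the target position corresponding to the virtual object.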
In an optional implementation manner, after the terminal displays the first hit effect at the target position where the target object is hit, the terminal can further determine first injury information, where the first injury information is used for indicating an injury value of the first injury caused to the target object. The terminal then sends the first injury information to the server, and the server determines the current remaining virtual life value of the target object based on the first injury information. Finally, the terminal receives the current remaining virtual life value of the target object.
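The server-side bookkeeping implied here — apply the reported injury value and return the current remaining virtual life value — might look like the following (a sketch under assumed data structures; not the actual server implementation):

```python
def server_update_life(life_table, target_id, injury_value):
    """Server side: apply the first injury information reported by the
    terminal and return the target's current remaining virtual life value,
    clamped at zero."""
    life_table[target_id] = max(0, life_table[target_id] - injury_value)
    return life_table[target_id]
```

The returned value is what the terminal receives back; a value of zero triggers the elimination handling described in the following steps.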
It should be noted that, in response to that the currently remaining virtual life value of the target object is zero, the terminal can no longer perform the step of continuously displaying the second hit effect at the target position.
It should be noted that the terminal can also determine the first injury information before displaying the first hit effect, and then send the first injury information to the server.
304. And the terminal continuously displays a second hitting effect at the target position, wherein the second hitting effect is used for indicating that a second injury is caused to the target object.
In the embodiment of the present application, the damage type to which the second damage belongs is a sustained damage, that is, the virtual ammunition can cause a sustained damage to the target object in addition to a transient damage to the target object, and the sustained damage can continuously reduce the virtual life value of the target object. Accordingly, the terminal can display a second hit effect to indicate the sustained injury. Optionally, after the virtual ammunition continuously damages the target object, different states can be added to the target object, such as poisoning, armor failure, increase in damage degree, real-time position display and the like, which is not limited in the embodiment of the present application.
For example, continuing with the target object as a virtual object, the virtual ammunition can also cause a sustained injury of 3 points per second, for example for 10 seconds. The current remaining virtual life value of the virtual object after the instant injury is 70 points, and the virtual object then loses 3 points per second: the current remaining virtual life value is 67 points after 1 second, 55 points after 5 seconds, and 40 points after 10 seconds. After the sustained injury ends, if the virtual object suffers no other injury, its current remaining virtual life value no longer decreases.
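The arithmetic of this sustained-injury example can be expressed as a small helper (an illustrative sketch; the per-second tick model is an assumption):

```python
def remaining_life_after(life, dps, duration, elapsed):
    """Virtual life value after `elapsed` seconds of a sustained injury of
    `dps` points per second that lasts at most `duration` seconds."""
    ticks = min(elapsed, duration)
    return max(0, life - dps * ticks)
```

With life = 70, dps = 3, and duration = 10, this reproduces the 67 / 55 / 40 values of the example, and the value stays at 40 once the duration has elapsed.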
In an alternative implementation, the terminal can set the target object to a first target state, and the first target state is used for increasing the damage degree of the target object. Namely, the injury value of the subsequent injury of the target object is improved, so that the controlled virtual object can effectively strike the enemy virtual object and the virtual prop.
In an optional implementation manner, the terminal may set the target object to a second target state, where the second target state is used to disable a protection prop of the target object, and the protection prop is used to prevent the target object from being injured. For example, the second target state can disable the body armor of a virtual object or the armor of a virtual prop, so that the controlled virtual object can effectively attack enemy virtual objects and virtual props.
In an alternative implementation, the terminal is capable of setting the target object to a third target state for poisoning the target object. Such as making the target object move at a reduced speed, blurring the field of view, slowly decreasing the virtual life value, etc.
In an alternative implementation, the terminal can set the target object to a fourth target state, and the fourth target state is used for exposing the real-time position of the target object. The real-time location of the target object can be displayed on a small map of virtual objects having an adversarial relationship with the target object.
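The four optional target states above could be modeled as simple flags on the target object; as one example, the first target state raises subsequent injury values (state names and the amplification factor are hypothetical):

```python
# Illustrative mapping of the four optional target states to their effects.
TARGET_STATES = {
    "amplify": "increase the degree of injury suffered",   # first target state
    "disarm": "disable the protection prop",               # second target state
    "poison": "slow movement, blur view, drain life",      # third target state
    "reveal": "expose the real-time position",             # fourth target state
}

def injury_with_states(base_injury, states, amplify_factor=1.5):
    """Apply the first target state: raise subsequent injury values."""
    injury = base_injury
    if "amplify" in states:
        injury = injury * amplify_factor
    return injury
```

Each state would similarly hook into its own subsystem (protection props, movement, the mini-map), which this sketch does not model.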
In an optional implementation manner, the terminal may send the target position to the server, the server sends the target position to the terminal corresponding to the target object, and the terminal corresponding to the target object displays the first hit effect and the second hit effect in real time according to the target position. Through the synchronous target position, the terminal can synchronously display the first hitting effect and the second hitting effect, the display effect is improved, and the game experience of a user is improved.
In an optional implementation manner, when displaying the second hit effect, the terminal can replace the conventional hit effect displayed after the target object is hit with the second hit effect, without synchronizing the target position with the server; that is, the terminal corresponding to the target object does not need to display the second hit effect. In this way, the terminal can accurately display the second hit effect while reducing the data interaction amount between the terminal and the server, meeting the requirement of performance optimization. The conventional hit effect is the effect produced by regular ammunition, such as a bullet hole displayed when a steel plate is hit, or wood chips generated when a wooden plank is hit.
For example, referring to fig. 11, fig. 11 is a schematic diagram illustrating a hit effect according to an embodiment of the present application. As shown in fig. 11, a bullet hitting the steel plate displays a bullet hole, and a bullet hitting the stone displays dust.
It should be noted that the terminal can also display a third hit effect, which is used to indicate delayed damage to the target object. The embodiment of the present application does not limit this.
It should be noted that, in order to make the effect generated after the multi-type injury bullet is disposed easier to understand, referring to fig. 12, fig. 12 is a logic diagram of a multi-type injury bullet according to an embodiment of the present application. As shown in fig. 12, the multi-type injury round can cause both transient injuries and sustained injuries. Wherein the transient injury can cause a Y point injury to the target object, Y being a positive number, configurable by a technician, optionally with a lower injury value than conventional ammunition. The injury rule of the sustained injury is to cause different injuries to different target objects, for example, the injury to the virtual object is less than the injury to the virtual prop. The form of sustained injury is Z points/second, with Z being a positive number. The target object is a virtual object or a virtual prop, the virtual object is a human-shaped object or a non-mechanical object, and the virtual prop is a mechanical prop, an obstacle and the like. The mechanical prop includes: vehicles such as helicopters, vehicles, and ships; shield gun towers, unmanned reconnaissance planes, landmines and other tactical equipment; summoning objects such as mechanical soldiers, mechanical dogs, etc.
In an alternative implementation manner, the terminal can determine second injury information according to the type of the object to which the target object belongs, wherein the second injury information is used for indicating a unit-time injury value and duration of second injury caused to the target object. And then the terminal sends the second damage information to the server, and the server forwards the second damage information to the terminal corresponding to the target object. And synchronizing the unit time injury value and the duration of the second injury so that the information of the current terminal and the terminal corresponding to the target object can be synchronized.
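Determining the second injury information from the type of the target object — for example, a smaller unit-time injury for virtual objects than for virtual props, per the injury rule of fig. 12 — might be sketched as follows (all numeric values are assumptions):

```python
# Illustrative unit-time injury values per object type (fig. 12's rule:
# less injury to virtual objects than to virtual props).
DPS_BY_TYPE = {"virtual_object": 3, "virtual_prop": 5}

def second_injury_info(object_type, duration=10):
    """Second injury information: unit-time injury value and duration,
    chosen according to the type of the object the target belongs to."""
    return {"dps": DPS_BY_TYPE[object_type], "duration": duration}
```

The resulting dictionary stands in for the second injury information the terminal sends to the server for forwarding.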
It should be noted that the foregoing steps 301 to 304 are an optional implementation manner of the interactive prop processing method provided in the embodiment of the present application; correspondingly, the method has other optional implementation manners. Referring to fig. 13, fig. 13 is a flowchart of another interactive prop processing method according to an embodiment of the present application. As shown in fig. 13, the method comprises the following steps: 1301. configure multi-type injury rounds; 1302. judge whether the target object is hit; if yes, execute step 1303: display a first hit effect, such as flame or explosion; 1304. judge whether the target object is a virtual object, such as a humanoid object; if not, execute step 1305: display an explosion effect; if yes, execute step 1306: continuously display a second hit effect, such as continuous combustion or slow dissolution of armor; 1307. judge whether the duration of the second hit effect has ended; if it has ended, execute step 1308: no longer display the second hit effect, where, if the virtual life value of the target object is zero, the target object is eliminated.
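The fig. 13 flow (steps 1301 to 1308) can be summarized as a small decision function (a sketch; the effect names are placeholders, not the actual implementation):

```python
def process_hit(is_humanoid, hit=True):
    """Decide which effects to display, following the fig. 13 flow:
    a miss shows nothing; a hit shows the first hit effect; a non-humanoid
    target (virtual prop) shows an explosion, while a humanoid virtual
    object additionally receives the continuously displayed second effect."""
    if not hit:                               # step 1302: no hit
        return []
    effects = ["first_hit_effect"]            # step 1303
    if not is_humanoid:
        effects.append("explosion")           # step 1305
    else:
        effects.append("second_hit_effect")   # step 1306
    return effects
```

Steps 1307 and 1308 (stopping the second effect after its duration, and elimination at zero life) would be driven by the timers sketched earlier and are omitted here.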
The embodiment of the application provides an interactive prop processing method. After a user triggers a target interactive prop, when a prop component of the target interactive prop hits a virtual object or a virtual prop, the virtual object or the virtual prop suffers a first injury and a second injury that belong to different injury types, and the first hit effect and the second hit effect prompt the injuries caused, so that the user can control the controlled virtual object to effectively confront enemy virtual objects and virtual props, improving human-computer interaction efficiency.
Fig. 14 is a block diagram of an interactive prop processing apparatus according to an embodiment of the present application. The apparatus is used for executing the steps of the above interactive prop processing method. Referring to fig. 14, the apparatus comprises: a prop component control module 1401, a first display module 1402, and a second display module 1403.
a prop component control module 1401, configured to, in response to a triggering operation on the target interactive prop, control a prop component of the target interactive prop to move along a target moving track in a virtual scene;
a first display module 1402, configured to, in response to the prop component hitting any target object, display a first hit effect at a target position corresponding to the target object, where the target object is any one of a virtual object and a virtual prop, and the first hit effect is used to indicate that a first injury is caused to the target object;
a second display module 1403, configured to continuously display a second hit effect at the target location, where the second hit effect is used to indicate that a second injury is caused to the target object, and the first injury and the second injury belong to different injury types.
In an alternative implementation manner, the first display module 1402 includes:
the determining unit is used for determining a target object hit by the prop component and a target position corresponding to the target object according to the target moving track;
and the first display unit is used for responding to the movement of the prop component to the target position and displaying the first hitting effect at the target position.
In an alternative implementation, the target object is a virtual object existing on the target movement track;
the first determining unit is used for acquiring a plurality of collision boxes of the virtual object, and one collision box corresponds to one body part of the virtual object; and determining a target collision box from the plurality of collision boxes according to the target movement track, and taking the body part indicated by the target collision box as the target position corresponding to the virtual object, wherein the target collision box exists on the target movement track.
In an alternative implementation manner, the target object is a virtual prop, and the first hit effect is an explosion effect.
In an alternative implementation, the target object is a virtual object, and the first hit effect is a flame effect.
In an optional implementation, the apparatus further includes:
a first determination module 1404 configured to determine first injury information indicating an injury value of a first injury to the target object;
a first information sending module 1405, configured to send the first injury information to a server, where the server determines, based on the first injury information, a current remaining virtual life value of the target object;
an information receiving module 1406, configured to receive the currently remaining virtual life value of the target object.
In an alternative implementation manner, the second display module 1403 is further configured to, in response to that the currently remaining virtual life value of the target object is zero, not perform the step of continuously displaying the second hit effect at the target position.
In an optional implementation, the apparatus further includes:
a second determining module 1407, configured to determine second injury information according to the type of the object to which the target object belongs, where the second injury information is used to indicate a unit-time injury value and duration of a second injury to the target object;
a second information sending module 1408, configured to send the second injury information to a server, where the server forwards the second injury information to a terminal corresponding to the target object.
In an optional implementation, the apparatus further includes:
a first state setting module 1409 is configured to set the target object to a first target state, which is configured to increase a degree of injury suffered by the target object.
In an optional implementation, the apparatus further includes:
a second state setting module 1410, configured to set the target object to a second target state, where the second target state is used to disable a protection prop of the target object, and the protection prop is used to prevent the target object from being damaged.
In an optional implementation, the apparatus further includes:
a third display module 1411, configured to display a prop component configuration interface, where the prop component configuration interface is used to display at least one prop component of the target interactive prop;
and a prop element configuration module 1412 for configuring any prop element for the target interactive prop.
In an alternative implementation manner, the third display module 1411 is further configured to display, in response to a selection operation of any prop component, prop component introduction information on the prop component configuration interface, where the prop component introduction information is used to indicate an influence on the target interactive prop after configuring the prop component.
The embodiment of the application provides an interactive prop processing apparatus. After a user triggers a target interactive prop, when a prop component of the target interactive prop hits a virtual object or a virtual prop, the virtual object or the virtual prop suffers a first injury and a second injury that belong to different injury types, and the first hit effect and the second hit effect prompt the injuries caused, so that the user can control the controlled virtual object to effectively confront enemy virtual objects and virtual props, improving human-computer interaction efficiency.
It should be noted that: when the interactive prop processing device provided in the above embodiment processes an interactive prop, only the division of the above functional modules is used for illustration, and in practical applications, the above function distribution can be completed by different functional modules as needed, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the interactive prop processing device and the interactive prop processing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
Fig. 15 is a block diagram of a terminal 1500 according to an embodiment of the present application. The terminal 1500 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, processor 1501 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1502 is used to store at least one program code for execution by processor 1501 to implement the interactive prop processing methods provided by method embodiments herein.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera assembly 1506, an audio circuit 1507, and a power supply 1509.
The peripheral device interface 1503 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral device interface 1503 may be implemented on separate chips or circuit boards, which is not limited by the present embodiment.
The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over the surface of the display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1505 may be one, provided on the front panel of terminal 1500; in other embodiments, display 1505 may be at least two, each disposed on a different surface of terminal 1500 or in a folded design; in other embodiments, display 1505 may be a flexible display disposed on a curved surface or a folded surface of terminal 1500. Even further, the display 1505 may be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1505 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, a VR (Virtual Reality) shooting function, or other fused shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a monochrome-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 1507 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input the electrical signals to the processor 1501 for processing, or to the radio frequency circuit 1504 for voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location on the terminal 1500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional membrane speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The power supply 1509 is used to supply power to the various components in the terminal 1500. The power supply 1509 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect the components of gravitational acceleration along the three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape or portrait view based on the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used to collect motion data for a game or the user.
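The landscape/portrait decision described above can be sketched as follows. This is an illustrative sketch only, not part of the patent disclosure; the function name and the threshold-free comparison of gravity components are assumptions.

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick a UI orientation from gravity components along the device's
    x axis (short edge) and y axis (long edge), e.g. in m/s^2."""
    # Upright device: gravity acts mostly along the y axis -> portrait.
    # Device turned on its side: gravity shifts to the x axis -> landscape.
    return "landscape" if abs(gx) > abs(gy) else "portrait"
```

A production implementation would typically add a dead zone around the 45-degree boundary so the UI does not flip back and forth when the device is held diagonally.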
The gyroscope sensor 1512 can detect the body orientation and rotation angle of the terminal 1500, and can cooperate with the acceleration sensor 1511 to capture the user's 3D motion of the terminal 1500. Based on the data collected by the gyroscope sensor 1512, the processor 1501 may implement the following functions: motion sensing (such as changing the UI in response to a user's tilting operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the terminal 1500 and/or beneath the display screen 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, it can detect the user's grip signal on the terminal 1500, and the processor 1501 performs left/right-hand recognition or shortcut operations based on the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed beneath the display screen 1505, the processor 1501 controls the operability controls on the UI according to the user's pressure operation on the display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
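One way the left/right-hand recognition mentioned above could work is sketched below. The heuristic, the function name, and the threshold value are purely illustrative assumptions and not part of the patent disclosure.

```python
def detect_holding_hand(left_pressure: float, right_pressure: float,
                        threshold: float = 0.2) -> str:
    """Infer the holding hand from normalized (0..1) side-frame pressure
    readings. Assumes the edge pressed by the palm registers higher
    sustained pressure than the edge touched by the fingertips."""
    if right_pressure - left_pressure > threshold:
        return "right"   # palm resting on the right edge
    if left_pressure - right_pressure > threshold:
        return "left"    # palm resting on the left edge
    return "unknown"     # grip too symmetric to classify
```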
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the display screen 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness of the display screen 1505 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
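The ambient-light brightness adjustment above amounts to a monotonic mapping from lux to brightness; a minimal sketch follows. The logarithmic curve and the 1–10,000 lux range are assumptions for illustration, not part of the patent disclosure.

```python
import math

def display_brightness(ambient_lux: float,
                       min_b: float = 0.05, max_b: float = 1.0) -> float:
    """Map ambient light intensity (lux) to a brightness in [min_b, max_b].
    Perceived brightness is roughly logarithmic, so a log10 mapping over
    a typical 1..10,000 lux range keeps the steps perceptually even."""
    lux = max(1.0, min(ambient_lux, 10_000.0))  # clamp to the mapped range
    frac = math.log10(lux) / 4.0                # 0.0 at 1 lux, 1.0 at 10,000 lux
    return min_b + frac * (max_b - min_b)
```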
The proximity sensor 1516, also known as a distance sensor, is typically disposed on the front panel of the terminal 1500. The proximity sensor 1516 is used to measure the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the display screen 1505 to switch from the screen-on state to the screen-off state; when the proximity sensor 1516 detects that the distance gradually increases, the processor 1501 controls the display screen 1505 to switch from the screen-off state back to the screen-on state.
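The proximity-driven switching above is a two-state machine; a sketch is given below. The separate near/far thresholds (hysteresis) and all names are illustrative assumptions, not part of the patent disclosure.

```python
def next_screen_state(state: str, distance_cm: float,
                      near_cm: float = 3.0, far_cm: float = 5.0) -> str:
    """Return the next display state given the proximity distance.
    Distinct near/far thresholds add hysteresis so the screen does not
    flicker when the distance hovers around a single cut-off."""
    if state == "on" and distance_cm <= near_cm:
        return "off"  # user approaching the front surface
    if state == "off" and distance_cm >= far_cm:
        return "on"   # user moving away
    return state      # within the hysteresis band: keep current state
```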
Those skilled in the art will appreciate that the configuration shown in FIG. 15 does not constitute a limitation on the terminal 1500, which may include more or fewer components than shown, combine some components, or employ a different arrangement of components.
An embodiment of the present application further provides a computer-readable storage medium applied to a terminal. The computer-readable storage medium stores at least one piece of program code, which is loaded and executed by a processor to implement the operations performed by the terminal in the interactive prop processing method of the above embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer program code stored in a computer-readable storage medium. A processor of the terminal reads the computer program code from the computer-readable storage medium and executes it, causing the terminal to perform the interactive prop processing method provided in the above optional implementations.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An interactive prop processing method, characterized in that the method comprises:
in response to a trigger operation on a target interactive prop, controlling a prop component of the target interactive prop to move along a target moving track in a virtual scene;
in response to the prop component hitting any target object, displaying a first hit effect at a target position corresponding to the target object, wherein the target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object;
continuously displaying a second hit effect at the target position, wherein the second hit effect is used for indicating that a second injury is caused to the target object, the first injury and the second injury belong to different injury types, and both the first injury and the second injury are used for reducing a virtual life value of the target object;
and determining second injury information according to the object type to which the target object belongs, wherein the second injury information is used for indicating the injury value and duration of the second injury caused to the target object, and the second injury caused to the virtual object is less than the second injury caused to the virtual prop.
2. The method according to claim 1, wherein the displaying, in response to the prop component hitting any target object, a first hit effect at a target position corresponding to the target object comprises:
determining a target object hit by the prop component and a target position corresponding to the target object according to the target moving track;
in response to the prop assembly moving to the target location, displaying the first hit effect at the target location.
3. The method of claim 2, wherein the target object is a virtual object located on the target moving track;
the determining, according to the target moving track, a target object hit by the prop component and a target position corresponding to the target object comprises:
obtaining a plurality of collision boxes of the virtual object, each collision box corresponding to one body part of the virtual object;
and determining, according to the target moving track, a target collision box from the plurality of collision boxes, and taking the body part indicated by the target collision box as the target position corresponding to the virtual object, wherein the target collision box is located on the target moving track.
4. The method of claim 2, wherein the target object is a virtual prop, and the first hit effect is an explosion effect.
5. The method of claim 2, wherein the target object is a virtual object and the first hit effect is a flame effect.
6. The method according to claim 1, wherein after the displaying, in response to the prop component hitting any target object, a first hit effect at the target position corresponding to the target object, the method further comprises:
determining first injury information, the first injury information being used for indicating an injury value of the first injury caused to the target object;
sending the first injury information to a server, the server determining a currently remaining virtual life value of the target object based on the first injury information;
and receiving the currently remaining virtual life value of the target object.
7. The method of claim 6, wherein after the receiving the currently remaining virtual life value of the target object, the method further comprises:
and in response to the currently remaining virtual life value of the target object being zero, skipping the step of continuously displaying the second hit effect at the target position.
8. The method of claim 1, wherein after the determining second injury information according to the object type to which the target object belongs, the method further comprises:
and sending the second injury information to a server, the server forwarding the second injury information to a terminal corresponding to the target object.
9. The method according to claim 1, wherein after the continuously displaying the second hit effect at the target position, the method further comprises:
setting the target object to a first target state, the first target state being used for increasing the degree of injury caused to the target object.
10. The method according to claim 1, wherein after the continuously displaying the second hit effect at the target position, the method further comprises:
and setting the target object to a second target state, the second target state being used for disabling a protective prop of the target object, the protective prop being used for resisting injury to the target object.
11. The method according to claim 1, wherein before the controlling, in response to the trigger operation on the target interactive prop, a prop component of the target interactive prop to move along the target moving track in the virtual scene, the method further comprises:
displaying a prop component configuration interface, the prop component configuration interface being used for displaying at least one prop component of the target interactive prop;
and in response to a configuration operation on any prop component, configuring the prop component for the target interactive prop.
12. The method of claim 11, further comprising:
and in response to a selection operation on any prop component, displaying prop component introduction information on the prop component configuration interface, the prop component introduction information being used for indicating the influence on the target interactive prop after the prop component is configured.
13. An interactive prop processing apparatus, characterized in that the apparatus comprises:
a prop component control module, configured to, in response to a trigger operation on a target interactive prop, control a prop component of the target interactive prop to move along a target moving track in a virtual scene;
a first display module, configured to display, in response to the prop component hitting any target object, a first hit effect at a target position corresponding to the target object, wherein the target object is either a virtual object or a virtual prop, and the first hit effect is used for indicating that a first injury is caused to the target object;
a second display module, configured to continuously display a second hit effect at the target position, wherein the second hit effect is used for indicating that a second injury is caused to the target object, the first injury and the second injury belong to different injury types, and both the first injury and the second injury are used for reducing a virtual life value of the target object;
and determining second injury information according to the object type to which the target object belongs, wherein the second injury information is used for indicating the injury value per unit time and the duration of the second injury caused to the target object, and the second injury caused to the virtual object is less than the second injury caused to the virtual prop.
14. A terminal, characterized in that the terminal comprises a processor and a memory, the memory being used for storing at least one piece of program code, the program code being loaded and executed by the processor to implement the interactive prop processing method according to any one of claims 1 to 12.
15. A storage medium, used for storing at least one piece of program code, the program code being used for executing the interactive prop processing method according to any one of claims 1 to 12.
CN202010951124.3A 2020-09-11 2020-09-11 Interactive property processing method, device, terminal and storage medium Active CN112057857B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010951124.3A CN112057857B (en) 2020-09-11 2020-09-11 Interactive property processing method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN112057857A CN112057857A (en) 2020-12-11
CN112057857B true CN112057857B (en) 2022-05-31

Family

ID=73695350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010951124.3A Active CN112057857B (en) 2020-09-11 2020-09-11 Interactive property processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112057857B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112642163A (en) * 2020-12-22 2021-04-13 上海米哈游天命科技有限公司 Motion trajectory prediction method and device, electronic equipment and storage medium
CN112915543B (en) * 2021-02-22 2024-07-16 网易(杭州)网络有限公司 Method and device for displaying virtual object in game and electronic terminal
CN113220405B (en) * 2021-06-11 2022-07-08 腾讯科技(深圳)有限公司 Message interaction method and related device
CN113633985B (en) * 2021-08-18 2024-05-28 腾讯科技(深圳)有限公司 Virtual accessory using method, related device, equipment and storage medium
CN113680061B (en) * 2021-09-03 2023-07-25 腾讯科技(深圳)有限公司 Virtual prop control method, device, terminal and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102046746B1 (en) * 2019-03-29 2019-11-19 서정란 System for progressing survival game
CN111228812B (en) * 2020-01-08 2021-08-10 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN111298437A (en) * 2020-02-11 2020-06-19 腾讯科技(深圳)有限公司 Control method and device for virtual attack prop

Also Published As

Publication number Publication date
CN112057857A (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CN108434736B (en) Equipment display method, device, equipment and storage medium in virtual environment battle
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN111589149B (en) Using method, device, equipment and storage medium of virtual prop
CN111744186B (en) Virtual object control method, device, equipment and storage medium
CN111001159B (en) Virtual item control method, device, equipment and storage medium in virtual scene
CN111265873A (en) Using method, device, equipment and storage medium of virtual prop
CN112870709B (en) Display method and device of virtual prop, electronic equipment and storage medium
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
CN112933601B (en) Virtual throwing object operation method, device, equipment and medium
US11786817B2 (en) Method and apparatus for operating virtual prop in virtual environment, device and readable medium
CN112717410B (en) Virtual object control method and device, computer equipment and storage medium
CN113713383B (en) Throwing prop control method, throwing prop control device, computer equipment and storage medium
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN111330274A (en) Virtual object control method, device, equipment and storage medium
CN111921190A (en) Method, device, terminal and storage medium for equipping props of virtual objects
CN112717394B (en) Aiming mark display method, device, equipment and storage medium
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN110960849B (en) Interactive property control method, device, terminal and storage medium
CN114100128B (en) Prop special effect display method, device, computer equipment and storage medium
CN112402969B (en) Virtual object control method, device, equipment and storage medium in virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant