WO2022057624A1 - Method, apparatus, terminal and medium for controlling a virtual object to use virtual props - Google Patents

Method, apparatus, terminal and medium for controlling a virtual object to use virtual props

Info

Publication number
WO2022057624A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
throwing
route
target
virtual environment
Prior art date
Application number
PCT/CN2021/116014
Other languages
English (en)
French (fr)
Inventor
姚丽
刘智洪
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Publication of WO2022057624A1
Priority to US17/984,114 (published as US20230068653A1)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53 Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals using indicators for displaying an additional top view, e.g. radar screens or maps
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the embodiments of the present application relate to the technical field of virtual scenes, and in particular, to a method, device, terminal and medium for controlling virtual objects to use virtual props.
  • A first-person shooting (FPS) game is an application based on a three-dimensional virtual environment. Users can control virtual objects in the virtual environment to perform actions such as walking, running, climbing, and shooting, and multiple users can team up online to complete a task together in the same virtual environment.
  • In the related art, the virtual object can be pre-equipped with a throwing virtual prop (for example, a grenade) before the battle starts, and accordingly, the user can control the virtual object to use the throwing virtual prop against a target.
  • The process by which the user controls the virtual object to initiate damage is as follows: click the virtual prop control, determine the throwing position, and control the virtual object to throw the virtual prop to the throwing position.
  • However, the throwing virtual props provided in the related art all need to be thrown by the controlled virtual object, and each throw can only reach a single fixed-point position; there is also a certain time interval from the moment the virtual prop is thrown to the moment it lands, and because throwing virtual props are used at a fixed point, their scope of action is small. That scope is therefore easy to discover and avoid, resulting in a low hit rate of the throwing virtual props.
  • The embodiments of the present application provide a method, device, terminal and medium for controlling a virtual object to use a virtual prop, which can enrich the types of virtual props and use the virtual prop to change the attribute values of the virtual objects on the target throwing route, so as to improve the hit rate of virtual props.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for controlling a virtual object to use a virtual prop, the method is applied to a terminal, and the method includes:
  • in response to a trigger operation on a target prop control, a throwing route setting control is displayed, where the target prop control is the use control corresponding to an airdrop virtual prop, and the throwing route setting control displays a virtual environment map;
  • in response to a gesture operation on the throwing route setting control, a target throwing route corresponding to the airdrop virtual prop is displayed in the virtual environment map, where the gesture operation includes a first operation position and a second operation position, and the target throwing route passes through the first operation position and the second operation position; and
  • the air-drop virtual props thrown along the target throwing route are displayed in the virtual environment, and the air-drop virtual props are used to change the attribute value of the virtual object.
  • an embodiment of the present application provides an apparatus for controlling a virtual object to use a virtual prop, the apparatus comprising:
  • a first display module configured to display a throwing route setting control in response to a triggering operation on a target item control, the target item control is a usage control corresponding to an airdrop virtual item, and the throwing route setting control displays a virtual environment map;
  • the second display module is configured to display the target throwing route corresponding to the air-drop virtual prop in the virtual environment map in response to the gesture operation on the throwing route setting control, the gesture operation includes the first operation position and the a second operating position, the target throwing route passing through the first operating position and the second operating position;
  • the third display module is configured to display the air-drop virtual props thrown along the target throwing route in a virtual environment, and the air-drop virtual props are used to change the attribute value of the virtual object.
  • an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, the memory stores at least one program, and the at least one program is loaded and executed by the processor to implement the method for controlling a virtual object to use a virtual prop described in the above aspect.
  • an embodiment of the present application provides a computer-readable storage medium, where at least one program is stored in the computer-readable storage medium, and the at least one program is loaded and executed by a processor to implement the method for controlling a virtual object to use a virtual prop described in the above aspect.
  • an embodiment of the present application provides a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the terminal reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the terminal executes the method for controlling the virtual object to use the virtual prop provided in various optional implementation manners of the above aspect.
  • With the technical solutions provided by the embodiments of the present application, the air-dropped virtual props can be thrown along a target throwing route. Compared with the related art, in which virtual props can only be thrown at a fixed point, the air-dropped virtual props provided by the embodiments of the present application can be thrown along a specified route, which enlarges their scope of action and improves the hit rate of the virtual props.
  • On the other hand, when some virtual objects adopt camping or long-range attack strategies, using this airdrop virtual prop can carry out long-range, large-scale attacks on such virtual objects, improving the hit rate against such objects, accelerating the game process, and effectively controlling the duration of a single game, thereby reducing the processing pressure on the server.
  • FIG. 1 shows a schematic diagram of the architecture of a computer system provided by an embodiment of the present application
  • FIG. 2 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by an exemplary embodiment of the present application
  • FIG. 3 shows a schematic diagram of a process of controlling virtual objects to use virtual props according to an exemplary embodiment of the present application
  • FIG. 4 shows a schematic diagram of the process of determining the target throwing route according to the first operation position and the second operation position
  • FIG. 5 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application
  • FIG. 6 shows a schematic diagram of a props equipment interface of an airdrop virtual prop shown in an exemplary embodiment of the present application
  • FIG. 7 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application
  • FIG. 8 is a schematic diagram showing a process of displaying the position of a virtual object according to an exemplary embodiment of the present application.
  • FIG. 9 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application.
  • FIG. 10 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throwing distance and a target throwing route according to an exemplary embodiment of the present application
  • Fig. 11 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throw quantity and a target throw route according to an exemplary embodiment of the present application
  • Fig. 12 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application
  • FIG. 13 shows a schematic diagram of a throwing process of an airdrop virtual prop shown in an exemplary embodiment of the present application
  • Fig. 14 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application
  • FIG. 15 is a structural block diagram of an apparatus for controlling virtual objects to use virtual props provided by an exemplary embodiment of the present application.
  • FIG. 16 shows a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment is the virtual environment displayed (or provided) by the application when it is run on the terminal.
  • the virtual environment may be a simulated environment of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are described by taking an example that the virtual environment is a three-dimensional virtual environment.
  • Virtual object refers to the movable object in the virtual environment.
  • the movable object may be a virtual character, a virtual animal, an animation character, etc., for example, a character or an animal displayed in a three-dimensional virtual environment.
  • the virtual object is a three-dimensional solid model created based on animation skeleton technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
  • the first-person shooting game refers to a shooting game that the user can play from a first-person perspective
  • the picture of the virtual environment in the game is a picture of observing the virtual environment from the perspective of the first virtual object.
  • a third-person shooting game refers to a shooting game played from a third-person perspective
  • the picture of the virtual environment in the game is a picture of observing the virtual environment from a third-person perspective (eg, behind the head of the first virtual object).
  • At least two virtual objects are in a single-game battle mode in the virtual environment, and each virtual object survives in the virtual environment by avoiding damage initiated by other virtual objects and dangers in the virtual environment (such as gas circles, swamps, etc.).
  • When the life value of a virtual object in the virtual environment reaches zero, the life of the virtual object in the virtual environment ends, and the last virtual object surviving in the virtual environment is the winner.
  • the battle starts at the moment when the first client joins the battle, and ends at the moment when the last client quits the battle.
  • Each client can control one or more virtual objects in the virtual environment.
  • the competitive mode of the battle may include a single-player battle mode, a two-person group battle mode, or a multi-person group battle mode, and the embodiment of the present application does not limit the battle mode.
  • Virtual props refer to props that virtual objects can use in the virtual environment, including virtual weapons that can change the attribute values of other virtual objects, supply props such as bullets, defense props such as shields, armor, and armored vehicles, and props such as virtual beams and virtual shock waves.
  • Virtual props that can change the attribute values of other virtual objects include long-distance virtual props such as pistols, rifles, and sniper rifles; short-range virtual props such as daggers, knives, swords, and ropes; and throwing virtual props such as flying axes, flying knives, grenades, flash bombs, and smoke bombs.
  • FIG. 1 shows a schematic structural diagram of a computer system provided by an embodiment of the present application.
  • the computer system may include: a first terminal 110 , a server 120 and a second terminal 130 .
  • the first terminal 110 runs an application program 111 supporting a virtual environment, and the application program 111 may be a multiplayer online battle program.
  • the user interface of the application 111 is displayed on the screen of the first terminal 110 .
  • the application 111 can be any one of a military simulation program, a multiplayer online battle arena (MOBA) game, a battle royale shooting game, and a simulation strategy game (Simulation Game, SLG).
  • In this embodiment, the application 111 being an FPS game is used as an example for description.
  • the first terminal 110 is a terminal used by the first user 112.
  • the first user 112 uses the first terminal 110 to control the first virtual object located in the virtual environment to perform activities.
  • the first virtual object may be referred to as the master virtual object of the first user 112.
  • the activities of the first virtual object include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills.
  • the first virtual object is a first virtual character, such as a simulated character or an anime character.
  • the second terminal 130 runs an application program 131 supporting a virtual environment, and the application program 131 may be a multiplayer online battle program.
  • the user interface of the application 131 is displayed on the screen of the second terminal 130 .
  • the client can be any one of a military simulation program, a MOBA game, a battle royale shooting game, and an SLG game.
  • In this embodiment, the application 131 being an FPS game is used as an example for description.
  • the second terminal 130 is a terminal used by the second user 132.
  • the second user 132 uses the second terminal 130 to control a second virtual object located in the virtual environment to perform activities.
  • the second virtual object may be referred to as the master virtual object of the second user 132.
  • the second virtual object is a second virtual character, such as a simulated character or an anime character.
  • the first virtual object and the second virtual object are in the same virtual world.
  • the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, have a friend relationship or have temporary communication rights.
  • the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have an adversarial relationship.
  • the applications running on the first terminal 110 and the second terminal 130 are the same, or the applications running on the two terminals are the same type of application on different operating system platforms (Android or iOS).
  • the first terminal 110 may generally refer to one of the multiple terminals, and the second terminal 130 may generally refer to another one of the multiple terminals. In this embodiment, only the first terminal 110 and the second terminal 130 are used as examples for illustration.
  • the device types of the first terminal 110 and the second terminal 130 are the same or different, and the device types include at least one of: a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, and a desktop computer.
  • Only two terminals are shown in FIG. 1, but multiple other terminals can access the server 120 in different embodiments.
  • There are also one or more terminals corresponding to a developer, on which a development and editing platform supporting the application program of the virtual environment is installed. The developer can edit and update the application program on the terminal and transmit the updated application installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 can download the application installation package from the server 120 to update the application.
  • the first terminal 110, the second terminal 130 and other terminals are connected to the server 120 through a wireless network or a wired network.
  • the server 120 includes at least one of a server, a server cluster composed of multiple servers, a cloud computing platform and a virtualization center.
  • the server 120 is used to provide background services for applications supporting a three-dimensional virtual environment.
  • the server 120 undertakes the main computing work and the terminal undertakes the secondary computing work; or the server 120 undertakes the secondary computing work and the terminal undertakes the main computing work; or a distributed computing architecture is used between the server 120 and the terminal for collaborative computing.
  • the server 120 includes a memory 121, a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (Input/Output Interface, I/O interface) 125.
  • the processor 122 is used for loading the instructions stored in the server 120, and processing the data in the user account database 123 and the battle service module 124;
  • the user account database 123 is used for storing data of the user accounts used by the first terminal 110, the second terminal 130 and other terminals, such as the avatar of the user account, the nickname of the user account, the combat effectiveness index of the user account, and the service area where the user account is located;
  • the battle service module 124 is used to provide multiple battle rooms for users to battle in, such as 1V1, 3V3 and 5V5 battles;
  • the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network.
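  • As a rough illustration of the server-side modules just listed, the following Python sketch models a user account record (as might be kept in the user account database 123) and a battle service (module 124) that opens 1V1, 3V3 or 5V5 rooms. All class and field names are hypothetical, not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class UserAccount:
    # Data kept in the user account database 123: avatar, nickname,
    # combat effectiveness index and service area of the account.
    account_id: str
    avatar_url: str
    nickname: str
    combat_effectiveness: int
    service_area: str

@dataclass
class BattleRoom:
    # A battle room provided by the battle service module 124 (e.g. 1V1, 3V3, 5V5).
    room_id: int
    team_size: int
    players: list = field(default_factory=list)

class BattleService:
    """Minimal stand-in for the battle service module 124."""

    def __init__(self):
        self._rooms = {}
        self._next_id = 1

    def open_room(self, team_size: int) -> BattleRoom:
        room = BattleRoom(room_id=self._next_id, team_size=team_size)
        self._rooms[room.room_id] = room
        self._next_id += 1
        return room

    def join(self, room_id: int, account: UserAccount) -> bool:
        room = self._rooms[room_id]
        if len(room.players) < 2 * room.team_size:   # both camps together
            room.players.append(account.account_id)
            return True
        return False

# Example: open a 5V5 room and let one user account join it.
service = BattleService()
room = service.open_room(team_size=5)
service.join(room.room_id, UserAccount("u1", "a.png", "player_one", 1200, "cn-south"))
```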
  • FIG. 2 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by an exemplary embodiment of the present application.
  • This embodiment is described by taking the method used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example.
  • the method includes the following steps:
  • Step 201 in response to the triggering operation on the target prop control, display the throwing route setting control, the target prop control is the use control corresponding to the airdrop virtual prop, and the throwing route setting control displays the virtual environment map.
  • The air-drop virtual prop refers to a virtual prop that can attack along a preset throwing route, where the preset throwing route is determined by the user through a gesture operation.
  • When the virtual object is equipped with the airdrop virtual prop, the target prop control corresponding to the airdrop virtual prop is displayed on the user interface, and the user can control the virtual object to use the airdrop virtual prop by triggering the target prop control.
  • the method of the embodiment of the present application is applied in a virtual environment, where the virtual environment includes a first virtual object and a second virtual object, and the first virtual object and the second virtual object belong to different camps.
  • the terminal displays the virtual environment through a virtual environment screen.
  • the virtual environment picture is a picture for observing the virtual environment from the perspective of a virtual object.
  • the viewing angle refers to the observation angle when observing in the virtual environment from the first-person perspective or the third-person perspective of the virtual object.
  • the viewing angle is the angle when the virtual object is observed through the camera model in the virtual environment.
  • the camera model automatically follows the virtual object in the virtual environment, that is, when the position of the virtual object in the virtual environment changes, the position of the camera model changes simultaneously, and the camera model is always within a preset distance of the virtual object in the virtual environment.
  • the relative position of the camera model and the virtual object does not change.
  • the camera model refers to the three-dimensional model located around the virtual object in the virtual environment.
  • When a first-person perspective is adopted, the camera model is located near the head of the virtual object or at the head of the virtual object.
  • When a third-person perspective is adopted, the camera model can be located behind the virtual object and bound to the virtual object, or it can be located at any position at a preset distance from the virtual object, and the virtual object in the virtual environment can be observed from different angles through the camera model.
  • Optionally, when the third-person perspective is an over-the-shoulder perspective, the camera model is located behind the virtual object (for example, behind the head and shoulders of the virtual character).
  • Optionally, the perspective also includes other perspectives, such as a top-down perspective; when the top-down perspective is adopted, the camera model can be located above the head of the virtual object, and the top-down perspective observes the virtual environment from the sky looking down.
  • Optionally, the camera model is not actually displayed in the virtual environment, that is, the camera model is not displayed in the virtual environment displayed by the user interface.
  • The following description takes as an example the case in which the camera model is located at any position at a preset distance from the virtual object.
  • Optionally, one virtual object corresponds to one camera model, and the camera model can rotate around the virtual object as a rotation center, for example, around any point of the virtual object as the rotation center.
  • During rotation, the camera model not only rotates in angle but is also offset in displacement, and the distance between the camera model and the rotation center remains unchanged. That is, the camera model rotates on the surface of a sphere whose center is the rotation center, where the rotation center may be any point of the virtual object, such as the head, the torso, or any point around the virtual object.
  • When the camera model observes the virtual object, the center of the viewing angle of the camera model points from the point on the spherical surface where the camera model is located toward the center of the sphere.
  • the camera model can also observe the virtual object at preset angles in different directions of the virtual object.
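  • A minimal sketch of the spherical camera behaviour described above: the camera sits on a sphere of fixed radius around the rotation center, and its viewing-angle center points from its position on the sphere toward the sphere's center. The yaw/pitch parametrization and all names are illustrative assumptions, not part of the patent.

```python
import math

def camera_position(center, radius, yaw_deg, pitch_deg):
    """Return the camera position on a sphere of the given radius around
    the rotation center, for the given yaw/pitch angles (in degrees)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.cos(yaw)
    y = cy + radius * math.sin(pitch)            # vertical offset above/below the center
    z = cz + radius * math.cos(pitch) * math.sin(yaw)
    return (x, y, z)

def view_direction(camera_pos, center):
    """The viewing-angle center points from the camera toward the sphere center."""
    dx, dy, dz = (c - p for c, p in zip(center, camera_pos))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Example: orbit behind and slightly above the virtual object's head.
head = (10.0, 1.7, 5.0)
pos = camera_position(head, radius=3.0, yaw_deg=180.0, pitch_deg=20.0)
print(pos, view_direction(pos, head))
```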
  • the first virtual object is a virtual object controlled by a user through a terminal
  • the second virtual object includes at least one of a virtual object controlled by other users and a virtual object controlled by a background server
  • the first virtual object and the second virtual object belong to different camps.
  • Different from throwing virtual props in the related art, the air-drop virtual props provided by the embodiments of the present application can be thrown continuously along a specified route. Therefore, in a possible implementation, when the terminal receives a trigger operation on the target prop control, the throwing route setting control is displayed in the current user interface, and the virtual environment map is displayed through the throwing route setting control, so that the user can set the target throwing route of the airdrop virtual prop in the virtual environment map.
  • the triggering operation of the user on the target prop control may be a click operation, a long-press operation, a double-click operation, etc., which is not limited in this embodiment of the present application.
  • FIG. 3 shows a schematic diagram of a process of controlling virtual objects to use virtual props according to an exemplary embodiment of the present application.
  • a virtual environment screen 301 and target prop controls 302 are displayed in the user interface.
  • the terminal receives the trigger operation on the target prop control 302, and displays the throwing route setting control 303 on the upper layer of the current user interface, where the throwing route setting control 303 is used to display the virtual environment map.
  • a virtual object identifier is displayed in the virtual environment map.
  • Step 202 in response to the gesture operation on the throwing route setting control, display the target throwing route corresponding to the air-drop virtual prop in the virtual environment map, where the gesture operation includes a first operation position and a second operation position, and the target throwing route passes through the first operation position and the second operation position.
  • Optionally, the gesture operation may be a one-finger sliding operation, a two-finger sliding operation, a two-finger tap operation, a two-finger long-press operation, or the like; it is only necessary that two operation positions can be determined according to the gesture operation, and the embodiment of the present application does not limit the type of the gesture operation.
  • In a possible implementation, the user performs a gesture operation in the throwing route setting control that displays the virtual environment map, and correspondingly, the terminal receives the gesture operation on the throwing route setting control and determines the first operation position and the second operation position of the gesture operation, so that the target throwing route can be determined according to the first operation position and the second operation position.
  • the target throwing route indicated by the gesture operation may be displayed in the virtual environment map.
  • the display form of the target throwing route may be in the form of a line segment, in the form of a single arrow or in the form of a double arrow.
  • the terminal determines the first operation position 304 and the second operation position 305, and then the line segment 306 can be determined as the target throwing route.
  • Optionally, the route between the first operation position and the second operation position can be directly determined as the target throwing route, or the straight line in the virtual environment map that passes through the first operation position and the second operation position can be determined as the target throwing route, which is not limited in this embodiment of the present application.
  • FIG. 4 shows a schematic diagram of the process of determining the target throwing route according to the first operating position and the second operating position.
  • a virtual environment map 402 is displayed on the throwing route setting control 401.
  • When the terminal receives the gesture operation on the virtual environment map and determines the first operation position 403 and the second operation position 404, the route between the first operation position 403 and the second operation position 404 may be determined as the target throwing route; or the route between the position 405 and the position 406 may be determined as the target throwing route (where the position 405 and the position 406 are the intersections of the straight line passing through the first operation position 403 and the second operation position 404 with the boundary of the virtual environment map); or the route between the first operation position 403 and the position 406 may be determined as the target throwing route; or the route between the second operation position 404 and the position 405 may be determined as the target throwing route. The embodiment of the present application does not limit how the target throwing route is determined.
  • the terminal may determine at least one candidate throwing route based on the first operation position and the second operation position operated by the user's gesture, and then the user selects the target throwing route from the multiple candidate throwing routes.
  • As shown in FIG. 4, the candidate throwing routes determined by the terminal include: the first operation position 403 to the second operation position 404, the first operation position 403 to the position 406, the position 405 to the position 406, and the position 405 to the second operation position 404.
  • Optionally, prompt information can be displayed on the upper layer of the current user interface to prompt the user to select a target throwing route from the multiple candidate throwing routes. Optionally, the terminal can display, in a prompt box, a selection control corresponding to each candidate throwing route; when the terminal receives a trigger operation on a target selection control, the candidate throwing route corresponding to the target selection control is determined as the target throwing route.
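  • As an illustration of how the candidate throwing routes of FIG. 4 could be computed, the following minimal Python sketch extends the straight line through the two operation positions to the boundary of a rectangular map and enumerates the four segments described above (403 to 404, 403 to 406, 405 to 406, 405 to 404). The rectangular map, the clipping method, and all function names are assumptions for illustration, not the patent's implementation.

```python
def extend_to_map_boundary(p1, p2, width, height):
    """Return the two intersections of the line through p1 and p2 with the
    rectangular map boundary [0, width] x [0, height] (p1 and p2 distinct)."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    ts = []
    if dx != 0:
        ts += [(0 - x1) / dx, (width - x1) / dx]
    if dy != 0:
        ts += [(0 - y1) / dy, (height - y1) / dy]
    hits = []
    for t in ts:
        x, y = x1 + t * dx, y1 + t * dy
        if -1e-6 <= x <= width + 1e-6 and -1e-6 <= y <= height + 1e-6:
            hits.append((x, y, t))
    hits.sort(key=lambda h: h[2])                 # order along the line direction
    (xa, ya, _), (xb, yb, _) = hits[0], hits[-1]
    return (xa, ya), (xb, yb)                     # positions 405 and 406

def candidate_routes(p1, p2, width, height):
    """Candidate throwing routes as in FIG. 4: 403-404, 403-406, 405-406, 405-404."""
    b_before, b_after = extend_to_map_boundary(p1, p2, width, height)
    return [(p1, p2), (p1, b_after), (b_before, b_after), (b_before, p2)]

# Example: two touch positions (403 and 404) on a 100 x 100 minimap.
for start, end in candidate_routes((30.0, 40.0), (60.0, 70.0), 100.0, 100.0):
    print(start, "->", end)
```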
  • Step 203 display the air-drop virtual props thrown along the target throwing route in the virtual environment, where the air-drop virtual props are used to change the attribute value of the virtual object.
  • In a possible implementation, after determining the target throwing route, the terminal determines the position of the target throwing route in the virtual environment map, that is, throws the airdrop virtual props in the virtual environment according to the target throwing route based on the mapping relationship between the virtual environment map and positions in the virtual environment; correspondingly, the airdrop virtual props thrown along the target throwing route are displayed in the virtual environment.
  • It should be noted that the target throwing route itself may not be displayed in the virtual environment. When an airdrop virtual prop is located on the actual throwing route corresponding to the target throwing route in the virtual environment, it means that the airdrop virtual props thrown along the target throwing route are displayed in the virtual environment; when the connection line between at least two throwing positions of the airdrop virtual props is located on the actual throwing route corresponding to the target throwing route in the virtual environment, it likewise means that the airdrop virtual props thrown along the target throwing route are displayed in the virtual environment.
  • In other possible implementations, the terminal can also report the target throwing route to the server, and the server throws the airdrop virtual props in the virtual environment based on the target throwing route and feeds back the throwing information of the airdrop virtual props to the terminal, so that the terminal can display the airdrop virtual props thrown along the target throwing route in the virtual environment.
  • Alternatively, the terminal and the server can cooperate to complete the throwing of the air-dropped virtual props: the terminal reports the target throwing route to the server, and the server verifies the target throwing route; after the verification succeeds, the terminal throws the airdrop virtual props in the virtual environment based on the target throwing route and feeds back the throwing information of the airdrop virtual props to the server, and the server forwards the throwing information to other terminals.
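  • A rough sketch of the terminal-server cooperation described above, under assumed message names and a simple in-bounds verification rule: the terminal reports the target throwing route, the server verifies it, and the throwing information is forwarded to other terminals. None of these names or rules come from the patent.

```python
def verify_route(route, map_width, map_height):
    """Server-side check: every point of the reported route must lie inside the map."""
    return all(0 <= x <= map_width and 0 <= y <= map_height for x, y in route)

def handle_throw_request(route, map_size=(100.0, 100.0)):
    """Server entry point: verify the target throwing route reported by the terminal."""
    ok = verify_route(route, *map_size)
    return {"type": "throw_ack", "accepted": ok, "route": route}

def broadcast_throw_info(throw_info, other_terminals):
    """Forward the reported throwing information to the other terminals."""
    for inbox in other_terminals:
        inbox.append({"type": "throw_info", **throw_info})

# Example exchange: terminal reports a route, server accepts, info is forwarded.
ack = handle_throw_request([(30.0, 40.0), (60.0, 70.0)])
if ack["accepted"]:
    peers = [[], []]                  # stand-ins for two other terminals' message inboxes
    broadcast_throw_info({"route": ack["route"], "prop": "airdrop"}, peers)
    print(peers[0])
```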
  • the attribute value may be the life value, defense value, attack power, speed, etc. of the virtual object.
  • the air-dropped virtual prop can be loaded by a virtual carrier and thrown according to the target throwing route.
  • the virtual vehicle props may be airplanes, hot air balloons, and the like.
  • As shown in FIG. 3, after the target throwing route is determined, the throwing route setting control is retracted or disappears, and the virtual object 307 appears at the starting point of the throwing route indicated by the target throwing route in the virtual environment, moves along the target throwing route, and throws the air-drop virtual props 308 along the way.
  • To sum up, with the solution provided by the embodiments of the present application, air-drop virtual props can be thrown along the target throwing route. Compared with the related art, in which virtual props can only be thrown at a fixed point, the air-dropped virtual props provided by the embodiments of the present application can be thrown along a specified route. When some virtual objects adopt camping or long-range attack strategies, the airdrop virtual props can be used to carry out long-range, large-scale attacks on such virtual objects, improving the hit rate against them. This speeds up the game process and effectively controls the duration of a single game, thereby reducing the processing pressure on the server.
  • In a possible implementation, the airdrop virtual prop belongs to a kill-streak skill, that is, the airdrop virtual prop can only be used after the consecutive kill score (or quantity) of the virtual object reaches a preset score threshold (or quantity threshold). Therefore, after the user equips the virtual object with the airdrop virtual prop and enters the game, although the target prop control corresponding to the airdrop virtual prop is displayed in the user interface, the target prop control is set to a non-triggerable state; only after the consecutive kill score of the virtual object meets the preset score threshold is the target prop control set to a triggerable state, that is, only then can the virtual object use the airdrop virtual prop.
  • FIG. 5 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application.
  • This embodiment is described by taking the method used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example.
  • the method includes the following steps:
  • Step 501 Acquire the number of second virtual objects defeated by the first virtual object within the target time period, where the first virtual object and the second virtual object belong to different camps.
  • Since airdrop virtual props can cause damage to virtual objects over a large range and have greater attack power, in a possible implementation, airdrop virtual props are set as skill weapons, skill props, or scoring props.
  • the airdrop virtual props have usage conditions, that is, the controlled virtual object needs to meet the usage conditions before the airdrop virtual props can be used.
  • Optionally, the use condition of the airdrop virtual prop may be that the number of second virtual objects defeated by the first virtual object (the controlled virtual object) within the target time period satisfies a number threshold, or that the defeat score obtained by the first virtual object by defeating second virtual objects within the target time period satisfies a score threshold.
  • The target time period can be set by the developer. Illustratively, the target time period can be 10 minutes; that is, the number of second virtual objects defeated by the first virtual object within any continuous 10 minutes is obtained, and it is then determined whether that number meets the usage condition of the airdrop virtual prop.
  • the terminal pre-displays a props equipment interface.
  • the user can select the virtual props to be carried in this game.
  • a continuous scoring item interface is provided in the item equipment interface, and at least one continuous scoring item is displayed on the continuous scoring item interface.
  • the user can select an airdrop virtual item in the continuous scoring item interface, and click the equipment control.
  • the user interface will display the target item controls corresponding to the airdrop virtual item.
  • Referring to FIG. 6, it shows a schematic diagram of a props equipment interface of an airdrop virtual prop according to an exemplary embodiment of the present application.
  • the item selection column includes at least one item selection control 602 corresponding to a continuous scoring item, such as an unmanned vanguard, a thunderbolt (that is, the airdrop virtual item provided in the embodiment of this application), an attack helicopter, etc.
  • The usage condition of each item, that is, its preset score, is also displayed. Illustratively, the unmanned vanguard corresponds to a preset score of 750; that is, if the virtual object is equipped with this skill, the skill can only be used after the kill score reaches the preset score.
  • When the thunderbolt item is selected, the prop introduction corresponding to the thunderbolt item is displayed in the continuous scoring item interface, that is, the required kill score (950) and the function (for the specified …).
  • When the user clicks the equipment control 603, the virtual object is equipped with the thunderbolt prop.
  • As described above, airdrop virtual props have certain usage conditions: within a preset time period after entering the game, the user needs to control the first virtual object to defeat a certain number of second virtual objects, or the score obtained by defeating second virtual objects needs to reach a certain value, before the target prop control becomes triggerable, that is, before the airdrop virtual prop can be used. Therefore, in a possible implementation, after the first virtual object equipped with the airdrop virtual prop enters the game, the terminal acquires in real time the number of second virtual objects defeated by the first virtual object, or the score obtained by defeating them, to determine the setting state of the target prop control.
  • Step 502 in response to the quantity being higher than the quantity threshold, set the target prop control to a triggerable state.
  • Optionally, each continuous scoring item has its own usage condition, which can be a number of defeats or a defeat score. The usage condition of the airdrop virtual prop can be continuously defeating a certain number of virtual objects (reaching the number threshold), or obtaining a certain score by continuously defeating virtual objects (reaching the score threshold). Therefore, after obtaining the number of second virtual objects defeated by the first virtual object, the terminal compares the number with the number threshold, determines from the comparison result whether the usage condition of the airdrop virtual prop is met, and then determines the setting state of the target prop control based on the judgment result.
  • When the number is lower than the number threshold, the terminal sets the target prop control to a non-triggerable state, until the number meets the number threshold.
  • the number threshold corresponding to the airdrop virtual props may be 10, that is, the virtual object can use the airdrop virtual props only after the first virtual object defeats 10 second virtual objects.
  • When the number is higher than the number threshold, the target prop control is set to a triggerable state.
  • the non-triggerable state may be that the icon corresponding to the target item control is gray or black, and the corresponding triggerable state may be that the icon corresponding to the target item control is highlighted.
  • In addition, a certain score may be obtained each time the first virtual object defeats a second virtual object, and a score threshold can also be set accordingly; the usage condition is then that the obtained score is higher than the score threshold. Illustratively, the score threshold may be 900 points.
  • In a possible implementation, when the score obtained by the first virtual object by defeating second virtual objects in the game reaches or exceeds the score threshold, the usage condition of the airdrop virtual prop is satisfied, indicating that the first virtual object can use the airdrop virtual prop in the game. That is, when the score obtained by the first virtual object by defeating second virtual objects is lower than the score threshold, the terminal sets the target prop control corresponding to the airdrop virtual prop to a non-triggerable state; when the score obtained by the first virtual object by defeating second virtual objects is higher than the score threshold, the terminal sets the target prop control corresponding to the airdrop virtual prop to a triggerable state.
  • Optionally, a kill-streak concept is set, that is, when the first virtual object continuously defeats second virtual objects within a prescribed time, the obtained defeat score is multiplied, and the longer the kill streak, the larger the multiplier. Therefore, the first virtual object can more easily reach the preset score threshold, thereby improving the activation rate of the airdrop virtual prop.
  • Illustratively, the prescribed time may be 20 minutes.
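  • A minimal sketch of the usage-condition check of steps 501 and 502, using the illustrative values above (a 10-minute target time period, a number threshold of 10, a score threshold of 900) and a hypothetical kill-streak multiplier; none of the class or function names come from the patent.

```python
TARGET_PERIOD = 10 * 60        # target time period: 10 minutes (illustrative)
NUMBER_THRESHOLD = 10          # defeats needed (illustrative)
SCORE_THRESHOLD = 900          # score needed (illustrative)

class KillStreakTracker:
    """Tracks defeats of second virtual objects by the first virtual object."""

    def __init__(self):
        self.defeats = []      # list of (timestamp, awarded score) tuples

    def record_defeat(self, timestamp, base_score=100):
        # Hypothetical multiplier: each consecutive defeat within the period
        # increases the awarded score, so streaks reach the threshold faster.
        recent = [t for t, _ in self.defeats if timestamp - t <= TARGET_PERIOD]
        multiplier = 1 + 0.5 * len(recent)
        self.defeats.append((timestamp, base_score * multiplier))

    def stats_in_period(self, now):
        window = [(t, s) for t, s in self.defeats if now - t <= TARGET_PERIOD]
        return len(window), sum(s for _, s in window)

def target_prop_control_state(tracker, now):
    """Return 'triggerable' once either usage condition is met, else 'non-triggerable'."""
    count, score = tracker.stats_in_period(now)
    if count >= NUMBER_THRESHOLD or score >= SCORE_THRESHOLD:
        return "triggerable"
    return "non-triggerable"

# Example: five quick defeats push the streak score past the score threshold.
tracker = KillStreakTracker()
for i in range(5):
    tracker.record_defeat(timestamp=i * 30)
print(target_prop_control_state(tracker, now=150))
```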
  • Step 503 in response to the triggering operation on the target prop control, display the throwing route setting control, the target prop control is the use control corresponding to the airdrop virtual prop, and the throwing route setting control displays the virtual environment map.
  • For the implementation of step 503, reference may be made to the above embodiments, and details are not described herein again in this embodiment.
  • Optionally, when the target prop control corresponding to the airdrop virtual prop is triggered, the target prop control changes from the triggerable state to the non-triggerable state; to use the airdrop virtual prop again, the usage condition corresponding to the airdrop virtual prop needs to be satisfied again.
  • Step 504 Acquire a first operation position corresponding to the first operation signal and a second operation position corresponding to the second operation signal in response to the first operation signal and the second operation signal in the virtual environment map.
  • the embodiment of the present application provides a double-contact operation method, that is, two operation signals can be simultaneously received in the virtual environment map, and the target throwing route can be adjusted by rotation or displacement.
  • When the gesture operation is a two-touch operation, it may be a two-finger operation, or another gesture operation that can generate two operation signals at the same time.
  • In a possible implementation, the user operates the throwing route setting control through a gesture operation, and correspondingly, the terminal receives the first operation signal and the second operation signal in the virtual environment map, obtains the first operation position corresponding to the first operation signal and the second operation position corresponding to the second operation signal, and follows the first operation signal and the second operation signal in real time to update the first operation position and the second operation position.
  • Step 505 based on the first operation position and the second operation position, display the candidate throwing routes in the virtual environment map.
  • During the gesture operation, the user may adjust the gesture in real time so as to determine the most suitable target throwing route. Therefore, in a possible implementation, after the terminal receives the gesture operation, it determines the candidate throwing route according to the first operation position and the second operation position corresponding to the gesture operation, and displays the candidate throwing route determined from the current gesture operation in the virtual environment map in real time, so that the user can judge from the displayed candidate throwing route whether the current route meets the user's throwing needs.
  • Optionally, any line segment passing through the first operation position and the second operation position can be determined as a candidate throwing route.
  • the candidate throwing route may take the first operating position as the starting point of the route, and take the second operating position as the ending point of the route.
  • the process of determining the candidate throwing route according to the first operation position and the second operation position may refer to the above embodiment, which will not be repeated in this embodiment.
  • Step 506 in response to the disappearance of the first operation signal and the second operation signal, determine the target throwing route according to the first operation position and the second operation position at the moment when the signals disappear, and display the target throwing route in the virtual environment map.
  • The target throwing route is determined according to the first operation position (that is, the final operation position corresponding to the first operation signal in the gesture operation) and the second operation position (that is, the final operation position corresponding to the second operation signal in the gesture operation) at the moment when the operation signals disappear.
  • In order to enable the user to see the candidate throwing route indicated by the gesture operation in real time, the terminal follows the change of the user's gesture operation and updates, in real time, the candidate throwing route indicated by the gesture operation in the virtual environment map, until the terminal detects that the touch of the gesture operation ends, that is, the first operation signal and the second operation signal disappear; the candidate throwing route corresponding to the first operation position and the second operation position at the moment the signals disappear is then determined as the target throwing route and displayed in the virtual environment map.
  • Optionally, if the user needs to modify the target throwing route, the user can perform the gesture operation again in the virtual environment map; if the user does not need to modify the target throwing route, the user can close the throwing route setting control, and the airdrop virtual props are thrown based on the target throwing route planned by the user.
  • Optionally, after the target throwing route is determined, the throwing route setting control is also put away, until the throwing route setting control is evoked again by triggering the target prop control.
  • Optionally, when a gesture operation with dual operation signals is being performed, if the terminal detects that only the first operation signal or only the second operation signal disappears, the remaining signal cannot be used to determine the target throwing route; the throwing route setting control is not put away, and the user can continue the gesture operation in the throwing route setting control.
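  • The behaviour of steps 504 to 506, together with the single-signal case just described, can be sketched as a small gesture tracker: the candidate route follows both operation signals in real time, the target throwing route is confirmed only once both signals have disappeared, and a single disappearing signal confirms nothing. Event and method names are hypothetical.

```python
class ThrowRouteGesture:
    """Tracks the two operation signals on the throwing route setting control."""

    def __init__(self):
        self.active = {}        # signal id -> current operation position
        self.last_seen = {}     # signal id -> position at the moment the signal disappeared
        self.target_route = None

    def on_signal_down(self, signal_id, pos):
        self.active[signal_id] = pos

    def on_signal_move(self, signal_id, pos):
        if signal_id in self.active:
            self.active[signal_id] = pos

    def candidate_route(self):
        """Candidate throwing route displayed in real time while both signals exist."""
        if len(self.active) == 2:
            first_pos, second_pos = self.active.values()
            return (first_pos, second_pos)
        return None

    def on_signal_up(self, signal_id):
        """Remember the position at which each signal disappeared; once both
        signals have disappeared, confirm the target throwing route."""
        if signal_id in self.active:
            self.last_seen[signal_id] = self.active.pop(signal_id)
        if not self.active and len(self.last_seen) == 2:
            first_pos, second_pos = self.last_seen.values()
            self.target_route = (first_pos, second_pos)
        return self.target_route

# Example: two fingers down, one adjusted, then both lifted.
g = ThrowRouteGesture()
g.on_signal_down(1, (30.0, 40.0))
g.on_signal_down(2, (60.0, 70.0))
g.on_signal_move(2, (65.0, 72.0))
g.on_signal_up(1)                  # one signal alone does not confirm the route
print(g.on_signal_up(2))           # -> ((30.0, 40.0), (65.0, 72.0))
```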
  • Step 507 Determine the actual throwing route corresponding to the target throwing route in the virtual environment based on the position mapping relationship between the virtual environment map and the virtual environment.
  • The target throwing route indicates a route on the virtual environment map in the throwing route setting control, while the air-dropped virtual props need to be thrown in the actual virtual environment, so the determined target throwing route needs to be mapped into the actual virtual environment.
  • In a possible implementation, the position mapping relationship between positions in the virtual environment map and positions in the virtual environment is preset, so that after the target throwing route is determined in the throwing route setting control, the actual throwing route indicated by the target throwing route in the virtual environment can be determined based on the position mapping relationship, thereby mapping the target throwing route into the virtual environment.
  • the method for determining the actual throwing route may include the following steps:
  • First, the actual position coordinates in the virtual environment corresponding to the route start point and the route end point of the target throwing route in the virtual environment map are obtained, and then the actual throwing route corresponding to the target throwing route in the virtual environment is determined from these coordinates.
  • Illustratively, three points may be pre-calibrated in the virtual environment, and the coordinate positions corresponding to the three pre-calibrated points may be determined in the virtual environment map, thereby establishing the position mapping relationship between the virtual environment map and the virtual environment.
  • In one way, the route start point can be connected with the three calibrated points in the virtual environment map to determine three directional line segments; three points in the virtual environment are then determined according to the three directional line segments, and the average of these three points is taken to obtain the first position coordinates of the route start point in the virtual environment. Similarly, the second position coordinates of the route end point in the virtual environment can be obtained.
  • In another way, the linear or nonlinear relationship (position mapping relationship) between positions in the virtual environment map and positions in the virtual environment can be determined according to the three pre-calibrated points, and the coordinates of the route start point can be substituted directly into this position mapping relationship to obtain the first position coordinates corresponding to the route start point; similarly, the second position coordinates corresponding to the route end point can be obtained.
  • After the position of the route start point in the virtual environment and the position of the route end point in the virtual environment are determined, the direction and length of the actual throwing route in the virtual environment are determined, so that the actual throwing route of the airdrop virtual prop in the virtual environment is determined.
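  • One way to realize the position mapping relationship from three pre-calibrated points is an affine (barycentric) mapping from minimap coordinates to virtual-environment coordinates, sketched below; the calibration values, coordinate conventions and names are assumptions for illustration, not the patent's exact computation.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def map_to_world(map_point, calib_map, calib_world):
    """Map a minimap point into the virtual environment using the position
    mapping relationship defined by three pre-calibrated point pairs."""
    u, v, w = barycentric(map_point, *calib_map)
    wx = u * calib_world[0][0] + v * calib_world[1][0] + w * calib_world[2][0]
    wz = u * calib_world[0][1] + v * calib_world[1][1] + w * calib_world[2][1]
    return wx, wz

# Three calibrated points: minimap coordinates vs. virtual-environment coordinates.
calib_map   = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
calib_world = [(-500.0, -500.0), (500.0, -500.0), (-500.0, 500.0)]

# Map the route start and end points of the target throwing route into the world.
route_start_map, route_end_map = (30.0, 40.0), (60.0, 70.0)
actual_start = map_to_world(route_start_map, calib_map, calib_world)
actual_end   = map_to_world(route_end_map, calib_map, calib_world)
print(actual_start, actual_end)    # first and second position coordinates
```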
  • Step 508 throw the air-drop virtual props in the virtual environment according to the actual throwing route.
  • In a possible implementation, after the actual throwing route is determined, the terminal can control the virtual vehicle to appear at the actual throwing start point in the virtual environment and throw airdrop virtual props along the actual throwing route until the actual throwing end point is reached.
  • Step 509 displaying the air-drop virtual props thrown along the actual throwing route in the virtual environment.
  • The terminal controls the virtual carrier prop to throw the air-drop virtual props along the actual throwing route in the virtual environment, and correspondingly the air-drop virtual props thrown along the actual throwing route are displayed in the virtual environment.
  • Optionally, the terminal may also upload the throwing information of the air-drop virtual props to the server, and the server forwards the throwing information to other terminals.
  • In another possible scenario, the server throws the air-drop virtual props: after the terminal determines the target throwing route, it determines the actual throwing route in the virtual environment based on the position mapping relationship between the virtual environment map and the virtual environment, and reports the actual throwing route to the server. The server then controls the virtual carrier prop to throw the air-drop virtual props in the virtual environment along the actual throwing route and feeds the throwing information back to each terminal, so that the terminal can display the air-drop virtual props thrown along the actual throwing route in the virtual environment based on that throwing information. A minimal sketch of this exchange follows below.
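  • A minimal sketch of the terminal side of this terminal-server exchange; the message fields and the `server` and `renderer` interfaces are purely illustrative assumptions, not an API from the original:

```python
def report_route_and_render(actual_route, server, renderer):
    """Terminal side: report the actual throwing route to the server, then render
    the air-drop props according to the throwing information the server feeds back."""
    server.send({"type": "airdrop_route", "route": actual_route})
    throwing_info = server.receive()   # e.g. {"positions": [...], "prop_id": ...}
    for pos in throwing_info["positions"]:
        renderer.show_airdrop_prop(pos)
```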
  • In this embodiment, whether the air-drop virtual prop can be used is determined by acquiring the number of second virtual objects defeated by the first virtual object, which in turn determines the setting state of the target prop control; in addition, through the position mapping relationship, the target throwing route determined by the user in the throwing route setting control is mapped into the virtual environment to obtain the actual throwing route of the air-drop virtual props, so that the air-drop virtual props can be thrown in the virtual environment.
  • Since the user selects the target throwing route in the route setting control, and in order to avoid accidentally injuring virtual objects of the same camp while causing damage to as many virtual objects of other camps as possible, in one possible implementation the throwing route setting control can obtain the positions of all virtual objects in the virtual environment (both the same camp and other camps), so that the user can perform the gesture operation according to the position of each virtual object in the virtual environment map and determine a suitable target throwing route.
  • FIG. 7 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application.
  • This embodiment is described by taking the method used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example.
  • the method includes the following steps:
  • Step 701 in response to the triggering operation on the target prop control, display the throwing route setting control, the target prop control is the use control corresponding to the airdrop virtual prop, and the throwing route setting control displays the virtual environment map.
  • Step 702 Obtain the geographic location of each virtual object in the virtual environment, where the virtual environment includes a first virtual object and a second virtual object, and the first virtual object and the second virtual object belong to different camps.
  • In one possible implementation, the throwing route setting control can scan the geographic position of each virtual object in the virtual environment and map the position of each virtual object onto the virtual environment map.
  • the second virtual object may be a virtual object controlled by other users, or a virtual object (human-machine) controlled by a computer.
  • Step 703 based on the geographic location, display virtual object identifiers in the virtual environment map, wherein virtual objects belonging to different camps correspond to different virtual object identifiers.
  • Optionally, virtual objects belonging to the same camp may be represented by the same virtual object identifier, while virtual objects belonging to different camps are represented by different virtual object identifiers, and these identifiers are displayed in the virtual environment map according to the geographic positions of the objects in the virtual environment.
  • Optionally, different virtual object identifiers may use graphics of different shapes, such as squares, circles and triangles; or graphics of different colors, for example the virtual objects of the first camp use red circles and the virtual objects of the second camp use circles of another color; or the avatar corresponding to each virtual object may be used as its identifier. The embodiment of the present application does not limit the form of the virtual object identifier. A minimal rendering sketch follows below.
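  • A minimal rendering sketch under the description above (the camp numbers, marker names, and the `world_to_map` callback are illustrative assumptions, not from the original):

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    camp: int            # e.g. 1 = first camp, 2 = second camp
    world_pos: tuple     # (x, y) in the virtual environment

# Illustrative marker table: different camps get different identifiers.
CAMP_MARKERS = {1: "red_circle", 2: "blue_circle"}

def build_minimap_markers(objects, world_to_map):
    """Scan every virtual object and place its camp-specific identifier on the
    virtual environment map at the mapped geographic location."""
    markers = []
    for obj in objects:
        map_pos = world_to_map(obj.world_pos)
        markers.append((map_pos, CAMP_MARKERS[obj.camp]))
    return markers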
  • FIG. 8 shows a schematic diagram of a process of displaying the position of a virtual object shown in an exemplary embodiment of the present application.
  • As shown in FIG. 8, after the target prop control is triggered, the throwing route setting control 803 is first displayed in the user interface.
  • At this time, the throwing route setting control 803 may display only the virtual environment map 804 (that is, the position of each virtual obstacle in the virtual environment); after that, the control 803 scans and obtains the position of each virtual object in the virtual environment, and displays virtual object identifier 805 and virtual object identifier 806 in the virtual environment map 804 based on those positions, where different virtual object identifiers represent virtual objects belonging to different camps.
  • In the example above, the virtual object identifiers and the virtual environment map are not displayed in the throwing route setting control at the same time: the virtual environment map is displayed first, the terminal then acquires the geographic position corresponding to each virtual object in the virtual environment, and the corresponding virtual object identifiers are displayed in the virtual environment map based on those positions; that is, the virtual object identifiers are displayed after the virtual environment map is displayed.
  • Optionally, an object identifier display control may be added around the throwing route setting control, the object identifier display control being used to trigger the display of the virtual object identifiers in the virtual environment map. When the terminal displays the throwing route setting control, only the virtual environment map is shown in it at first.
  • If the user needs to see the virtual object identifiers, the object identifier display control can be triggered; on receiving the triggering operation, the terminal obtains the geographic position corresponding to each virtual object in the virtual environment and then displays the virtual object identifiers in the virtual environment map based on those positions. Conversely, if the user does not need the identifiers, the control is simply not triggered, which spares the terminal the computation of obtaining each virtual object's geographic position and further reduces the terminal's power consumption. A minimal on-demand sketch follows below.
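  • A minimal sketch of this on-demand variant (the class name and the two callbacks are illustrative assumptions):

```python
class ThrowRouteSettingControl:
    """Identifiers are fetched and drawn only when the (hypothetical)
    object-identifier display control is tapped."""

    def __init__(self, scan_positions, draw_marker):
        self.scan_positions = scan_positions   # callback returning {object: world_pos}
        self.draw_marker = draw_marker         # callback drawing one identifier
        self.identifiers_shown = False

    def on_show_identifiers_tapped(self):
        if self.identifiers_shown:
            return                             # avoid recomputing positions
        for obj, pos in self.scan_positions().items():
            self.draw_marker(obj, pos)
        self.identifiers_shown = True
```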
  • Optionally, the virtual object identifiers may instead be displayed in the throwing route setting control at the same time as the virtual environment map: when the terminal receives the triggering operation on the target prop control, it obtains the geographic position corresponding to each virtual object in the virtual environment, so that while the virtual environment map is being displayed in the throwing route setting control, the virtual object identifiers are displayed in the map based on those positions, and the map and the identifiers appear together without any visible display delay.
  • Step 704 in response to the gesture operation on the throwing route setting control, display the target throwing route corresponding to the air-drop virtual prop in the virtual environment map, the gesture operation includes a first operation position and a second operation position, and the target throwing route passes through the first operation. position and second operating position.
  • Step 705 Obtain the distribution quantity of the second virtual object on the target throwing route.
  • The distribution quantity indicates the number of second virtual objects located in the preset area corresponding to each throwing point on the target throwing route, and the preset area may be the prop action range of the air-drop virtual props.
  • In one possible implementation, the air-drop virtual props may be thrown based on the distribution quantity of the second virtual objects.
  • Step 706 Throw the air-dropped virtual props in the virtual environment according to the distributed quantity, wherein the thrown quantity of the air-dropped virtual props is positively correlated with the distributed quantity.
  • Based on the distribution quantity of the second virtual objects at each position on the target throwing route, the throwing quantity of air-drop virtual props at that position is determined, and the throwing quantity is set to be positively correlated with the distribution quantity.
  • That is, for areas on the route where more second virtual objects are distributed, a larger number of air-drop virtual props can be thrown; conversely, for areas with few or no second virtual objects, a smaller number of air-drop virtual props can be thrown, or none at all.
  • For example, if more second virtual objects are distributed around a first throwing point than around a second throwing point, 5 air-drop virtual props may be thrown at the first point and only 3 at the second point. A minimal planning sketch follows below.
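  • A minimal sketch of distribution-based planning (the per-enemy factor and the cap are illustrative tuning values, not from the original):

```python
import math

def count_enemies_near(point, enemies, radius):
    """Number of second virtual objects inside the preset area (prop action
    range) around one throwing point."""
    return sum(1 for e in enemies if math.dist(point, e) <= radius)

def throws_per_point(throw_points, enemies, radius, per_enemy=1, cap=5):
    """Throw quantity at each point is positively correlated with the
    distribution quantity of second virtual objects around that point."""
    plan = []
    for p in throw_points:
        n = count_enemies_near(p, enemies, radius)
        plan.append((p, min(cap, n * per_enemy)))   # 0 enemies -> 0 props here
    return plan
```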
  • Step 707 displaying the air-drop virtual props thrown along the target throwing route in the virtual environment.
  • In this embodiment, the virtual object identifiers of the virtual objects are scanned and displayed in the virtual environment map through the throwing route setting control, so that the user can determine the target throwing route based on the position of each virtual object in the virtual environment and avoid accidentally injuring teammates with the throw.
  • In one possible implementation, the air-drop virtual props have certain throwing attributes, for example a preset throwing distance, meaning the air-drop virtual props can only be thrown in the virtual environment at intervals of that preset distance, or a preset throwing quantity, meaning the air-drop virtual props cannot be thrown without limit and the number of drops is capped. Therefore, when throwing the air-drop virtual props in the virtual environment along the target throwing route, the throwing attribute information corresponding to the air-drop virtual props also needs to be taken into account.
  • step 203 may include steps 203A to 203C.
  • Step 203A Obtain throwing attribute information corresponding to the air-drop virtual item, where the throwing attribute information includes at least one of a preset throwing distance and a preset throwing quantity.
  • The preset throwing distance indicates that the air-drop virtual props need to be thrown at fixed intervals, that is, the distance between the throwing positions of two adjacent air-drop virtual props equals the preset throwing distance.
  • Optionally, the preset throwing distance can be set according to the prop action range of the air-drop virtual props, so as to avoid wasting props through overlapping action ranges. Illustratively, the preset throwing distance may be 10 m.
  • Optionally, the preset throwing distance may be a distance in the actual virtual environment, in which case it may be, for example, 10 m; or it may be a distance in the virtual environment map, in which case it may be, for example, 1 cm.
  • the preset number of throws indicates the total number of air-drop virtual items that can be thrown by a single trigger of the air-drop virtual items.
  • the preset number of throws may be 40.
  • Optionally, an air-drop virtual prop can have two pieces of throwing attribute information, the preset throwing distance and the preset throwing quantity, and be constrained by both at the same time; that is, when the terminal throws the air-drop virtual props along the target throwing route, the preset throwing distance and the preset throwing quantity both need to be considered.
  • Optionally, the user can switch which throwing attribute information is used according to the actual situation: only the preset throwing distance, only the preset throwing quantity, or both the preset throwing distance and the preset throwing quantity.
  • Optionally, an air-drop virtual prop may have only a single piece of throwing attribute information. If it only has the preset throwing distance, the props are thrown according to the preset throwing distance during use; if it only has the preset throwing quantity, the props are thrown according to the preset throwing quantity during use.
  • In one possible implementation, after determining the target throwing route, the terminal acquires the throwing attribute information corresponding to the air-drop virtual props, so as to throw the air-drop virtual props in the virtual environment based on that information.
  • Step 203B according to the throwing attribute information and the target throwing route, throw the air-drop virtual props in the virtual environment.
  • In one possible implementation, after acquiring the throwing attribute information corresponding to the air-drop virtual props, the terminal can throw the air-drop virtual props in the virtual environment according to the throwing attribute information and the target throwing route.
  • the process of throwing the air-drop virtual props in the virtual environment according to the throwing attribute information and the target throwing route may include the following steps 1 to 2:
  • First, the number of target throws corresponding to the air-drop virtual props is determined according to the preset throwing distance and the route length corresponding to the target throwing route.
  • the number of target throws is positively correlated with the route length of the target throwing route.
  • the preset throwing distance may be the distance in the virtual environment map, or may be the corresponding actual distance in the virtual environment.
  • the value of the preset throwing distance can be 1cm
  • the value of the preset throwing distance can be 10m.
  • The number of throws corresponding to each throwing position is the same, and the number of throws for a single throwing position can be preset by the developer, for example 3.
  • The longer the route length corresponding to the target throwing route, the more air-drop virtual props the throw requires; the shorter the route length, the fewer props are required. That is, the number of target throws is positively correlated with the route length of the target throwing route.
  • The relationship between the number of target throws, the target throwing route and the preset throwing distance can be expressed as:
  • N1 = (L / d1) × n1
  • where N1 indicates the number of target throws, L indicates the route length of the target throwing route, d1 indicates the preset throwing distance, and n1 indicates the number of throws corresponding to a single throwing position.
  • Note that L and d1 must be values in the same coordinate system: if d1 is the preset throwing distance in the actual virtual environment, L should be the route length of the target throwing route in the actual virtual environment; if d1 is the preset throwing distance in the virtual environment map, L should be the route length of the target throwing route in the virtual environment map.
  • Illustratively, if the length of the target throwing route is 10 cm and an air-drop virtual prop is thrown every 1 cm, 10 throws are required; with 3 props thrown at each position, the number of target throws is 30. If the length of the target throwing route is 5 cm and props are thrown every 1 cm, 5 throws are required, and with 3 props each time the number of target throws is 15.
  • Then, the air-drop virtual props are thrown once every preset throwing distance along the target throwing route until the target throwing quantity of air-drop virtual props has been thrown in total. A minimal computation sketch follows below.
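  • A minimal sketch of the computation in steps 1 to 2 above, using the worked example from the text (the function name is illustrative):

```python
def plan_by_preset_distance(route_len, preset_dist, per_position):
    """Target throw quantity N1 = (L / d1) * n1, with one throwing position every
    preset throwing distance along the target throwing route.
    L and d1 must be in the same coordinate system (both map or both world)."""
    positions = int(route_len // preset_dist)          # throwing positions on the route
    total = positions * per_position                   # N1
    offsets = [i * preset_dist for i in range(1, positions + 1)]
    return total, offsets

# Worked example from the text: L = 10 cm, d1 = 1 cm, n1 = 3 -> 10 positions, N1 = 30.
total, offsets = plan_by_preset_distance(10.0, 1.0, 3)
assert total == 30 and len(offsets) == 10
```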
  • FIG. 10 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throwing distance and a target throwing route according to an exemplary embodiment of the present application.
  • In FIG. 10, the spacing between adjacent throwing positions on the target throwing route is the preset throwing distance 1004.
  • the process of throwing the air-drop virtual props in the virtual environment according to the throwing attribute information and the target throwing route may include the following steps 3 to 4:
  • First, the target throwing distance corresponding to the air-drop virtual props is determined according to the preset throwing quantity and the route length corresponding to the target throwing route, where the target throwing distance is positively correlated with the route length corresponding to the target throwing route.
  • the preset number of throws may be 40.
  • When the target throwing route is long, the throwing distance of the air-drop virtual props needs to be larger so that the props can cover the whole route; when the target throwing route is short, the throwing distance can be reduced, which increases the hit rate against virtual objects on the target throwing route. That is, the target throwing distance is positively correlated with the route length corresponding to the target throwing route.
  • The relationship between the preset number of throws, the target throwing distance and the target throwing route can be expressed as:
  • d2 = L / (N2 / n2)
  • where d2 represents the target throwing distance, L represents the route length of the target throwing route, N2 represents the preset number of throws, and n2 represents the number of throws corresponding to a single throwing position.
  • Note that L and d2 must be values in the same coordinate system: if L is the route length of the target throwing route in the actual virtual environment, the calculated d2 is the target throwing distance in the actual virtual environment; if L is the route length of the target throwing route in the virtual environment map, the calculated d2 is the target throwing distance in the virtual environment map.
  • If the target throwing distance determined by the terminal is a distance in the virtual environment map, it needs to be converted into the corresponding target throwing distance in the actual virtual environment before the air-drop virtual props are thrown in the actual virtual environment according to that distance.
  • Illustratively, if the route length of the target throwing route in the actual virtual environment is 200 m and the number of throws corresponding to a single throwing position is 4, the target throwing distance is 20 m; if the route length in the actual virtual environment is 400 m, the target throwing distance is 40 m.
  • The air-drop virtual props are then thrown once every target throwing distance along the target throwing route until the end point of the target throwing route is reached.
  • Illustratively, if the target throwing distance is 15 m, an air-drop virtual prop is thrown every 15 m. A minimal computation sketch follows below.
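  • A minimal sketch of the computation in steps 3 to 4 above, assuming the reconstructed relation d2 = L / (N2 / n2) and using the worked examples from the text:

```python
def target_throw_distance(route_len, preset_total, per_position):
    """Target throwing distance d2 = L / (N2 / n2): the fixed total N2 is spread
    over the route, so d2 grows with the route length (positive correlation)."""
    positions = preset_total / per_position            # number of throwing positions
    return route_len / positions

# Worked examples from the text (N2 = 40, n2 = 4):
assert target_throw_distance(200.0, 40, 4) == 20.0     # 200 m route -> 20 m spacing
assert target_throw_distance(400.0, 40, 4) == 40.0     # 400 m route -> 40 m spacing
```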
  • FIG. 11 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throwing quantity and a target throwing route according to an exemplary embodiment of the present application.
  • In FIG. 11, the preset number of throws is 8 × 5 (that is, 8 throwing positions with props thrown at each position). It can be seen that when the air-drop virtual props are thrown on the actual throwing route according to the preset number of throws (that is, the number of throwing positions 1106 is the same on both routes), because the route length of route 1103 is longer than that of route 1102, the corresponding throw distance 1105 on route 1103 is greater than the throw distance 1104 on route 1102.
  • When the air-drop virtual props have both throwing attributes, the preset throwing distance and the preset throwing quantity, in an exemplary example the process of throwing the air-drop virtual props in the virtual environment according to the throwing attribute information and the target throwing route may include the following steps 5 to 6.
  • In this case, because the throwing positions are spaced by the preset throwing distance and the total quantity is fixed, the route length corresponding to the target throwing route only affects the unit throwing quantity at each throwing position: the longer the route length, the smaller the unit throwing quantity at each throwing position.
  • The relationship between the target unit throwing quantity, the preset throwing quantity, the preset throwing distance and the target throwing route can be expressed as:
  • n3 = N3 / (L / d3)
  • where n3 denotes the target unit throwing quantity, that is, the number of throws corresponding to a single throwing position, N3 represents the preset throwing quantity, L represents the route length corresponding to the target throwing route, and d3 represents the preset throwing distance.
  • Note that L and d3 must be values in the same coordinate system: if d3 is the preset throwing distance in the actual virtual environment, L should be the route length of the target throwing route in the actual virtual environment; if d3 is the preset throwing distance in the virtual environment map, L should be the route length of the target throwing route in the virtual environment map.
  • Illustratively, if the route length corresponding to the target throwing route is 400 m, the preset throwing distance is 40 m and the preset throwing quantity is 40, the corresponding target unit throwing quantity is 4; if the route length corresponding to the target throwing route is 200 m, the target unit throwing quantity is 8.
  • The target unit throwing quantity of air-drop virtual props is then thrown at every preset throwing distance in the virtual environment until the preset throwing quantity of air-drop virtual props has been thrown.
  • In other words, when the air-drop virtual props have only a single piece of throwing attribute information, or only a single piece is used for throwing, the props have a fixed unit throwing quantity preset by the developer; when the air-drop virtual props have both pieces of throwing attribute information and both are used at the same time, the props have a dynamic unit throwing quantity that is determined by the route length of the target throwing route. A minimal computation sketch follows below.
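  • A minimal sketch of the computation in steps 5 to 6 above, using the relation n3 = N3 / (L / d3) and the worked examples from the text:

```python
def unit_throws_per_position(preset_total, route_len, preset_dist):
    """Target unit throwing quantity n3 = N3 / (L / d3): with both attributes
    fixed, a longer route leaves fewer props for each throwing position
    (negative correlation with route length)."""
    positions = route_len / preset_dist
    return preset_total / positions

# Worked examples from the text (N3 = 40, d3 = 40 m):
assert unit_throws_per_position(40, 400.0, 40.0) == 4.0   # 400 m route -> 4 per position
assert unit_throws_per_position(40, 200.0, 40.0) == 8.0   # 200 m route -> 8 per position
```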
  • Step 203C displaying the air-drop virtual props thrown along the target throwing route in the virtual environment.
  • In this embodiment, the throwing attributes corresponding to the air-drop virtual props are added as an additional throwing basis, so that the air-drop virtual props can be thrown more accurately in the virtual environment, waste of air-drop virtual props is avoided, and the hit rate of the air-drop virtual props is improved.
  • FIG. 12 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of the present application.
  • This embodiment is described by taking the method used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example.
  • the method includes the following steps:
  • Step 1201 in response to the triggering operation on the target prop control, display the throwing route setting control, the target props control is the use control corresponding to the airdrop virtual prop, and the throwing route setting control displays the virtual environment map.
  • Step 1202 in response to the gesture operation on the throwing route setting control, display the target throwing route corresponding to the air-drop virtual prop in the virtual environment map, the gesture operation includes a first operation position and a second operation position, and the target throwing route passes through the first operation. position and second operating position.
  • Step 1203 Display the air-drop virtual props thrown along the target throwing route in the virtual environment, and the air-drop virtual props are used to change the attribute value of the virtual object.
  • For the implementation of steps 1201 to 1203, reference may be made to the above embodiments; details are not repeated in this embodiment.
  • Step 1204 in response to the collision between the airdrop virtual prop and the virtual obstacle, display the prop action range, and the prop action range is a circular area with the collision point of the airdrop virtual prop as the center and the preset distance as the radius.
  • Since the air-drop virtual props are thrown from the sky above the virtual environment, they may collide with virtual objects or with virtual obstacles during their fall, and the two kinds of collision correspond to different trigger mechanisms.
  • If an air-drop virtual prop collides directly with a virtual object during its fall, the attribute value (life value) of that virtual object is reduced directly to 0, but the air-drop virtual prop is not triggered (does not explode) and continues to fall until it touches a virtual obstacle.
  • If the air-drop virtual prop collides with a virtual obstacle during its fall, it is triggered (that is, the air-drop virtual prop explodes) and a burning area, i.e. the prop action range, is generated around the collision point.
  • the scope of action of the prop is a circular area with a preset radius centered on the collision point.
  • the virtual obstacle may be a virtual building, ground, etc., which is not limited in this embodiment of the present application.
  • trailing smoke is also generated, which is used to block the line of sight of the virtual object.
  • Step 1205 in response to the virtual object being within the scope of the prop, change the attribute value of the virtual object.
  • In one possible implementation, the terminal detects in real time the relationship between nearby virtual objects and the prop action range, and when a virtual object is determined to be within the prop action range, the life value of that virtual object is reduced.
  • Whether a virtual object is within the prop action range may be judged by the distance between the virtual object and the collision point: if the distance is smaller than the preset distance corresponding to the prop action range, the virtual object is determined to be within the range, and its life value is reduced.
  • Optionally, the amount by which the attribute value is reduced is negatively correlated with the distance between the virtual object and the collision point, that is, the closer the virtual object is to the collision point, the larger the reduction, and vice versa. A minimal damage sketch follows below.
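  • A minimal damage sketch under these rules (the action radius, maximum damage and the linear falloff are illustrative assumptions, not values from the original):

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    position: tuple
    health: float

def apply_action_range_damage(collision_point, objects, action_radius, max_damage):
    """When the air-drop prop is triggered on a virtual obstacle, every virtual
    object inside the circular prop action range loses health, and the reduction
    is larger the closer the object is to the collision point."""
    for obj in objects:
        d = math.dist(obj.position, collision_point)
        if d <= action_radius:                     # object is inside the action range
            falloff = 1.0 - d / action_radius      # closer -> larger reduction
            obj.health = max(0.0, obj.health - max_damage * falloff)
```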
  • FIG. 13 shows a schematic diagram of a throwing process of an air-drop virtual prop shown in an exemplary embodiment of the present application.
  • As shown in FIG. 13, the virtual carrier prop is displayed on the virtual environment picture 1301 and throws the air-drop virtual props 1303 along the target throwing route 1302.
  • When an air-drop virtual prop 1303 encounters a virtual obstacle during the throwing process, the air-drop virtual prop 1303 is triggered, a burning area 1304 is generated, and smoke is produced.
  • the virtual object 1305 enters the burning area 1304, the health value of the virtual object 1305 will be reduced.
  • In this embodiment, through the trigger mechanism of the air-drop virtual props, the prop action range is triggered and displayed when an air-drop virtual prop collides with a virtual obstacle, so that the attribute values of virtual objects located within the prop action range are reduced.
  • The process of controlling a virtual object to use a virtual prop is shown in FIG. 14.
  • Step 1401 Equip the virtual object with airdrop virtual props.
  • By equipping the virtual object with an air-drop virtual prop, the virtual object can use the air-drop virtual prop in the game.
  • Step 1402 whether the target prop control corresponding to the airdrop virtual prop meets the activation condition.
  • the activation condition may be the number of consecutively defeated virtual objects, or the score obtained by defeating virtual objects.
  • the activation condition is the use condition of the airdrop virtual prop in the above embodiment.
  • Step 1403 the target prop control is highlighted.
  • When the air-drop virtual prop meets the activation condition (use condition), its corresponding target prop control is highlighted, and the highlighted control indicates that the target prop control is in a triggerable state.
  • Step 1404 whether to receive a trigger operation on the target prop control.
  • Step 1405 call out the notebook, scan and display the position of each virtual object in the virtual environment.
  • the notebook is the throwing route setting control in the above embodiment.
  • a virtual environment map is displayed in the notebook, and at the same time, the virtual object identifiers are displayed in the virtual environment map based on the scanned geographic location of each virtual object.
  • Step 1406 determine whether the target throwing route is determined.
  • Step 1407 Throw the airdrop virtual prop from the starting point of the target throwing route.
  • Step 1408 whether the airdrop virtual prop collides with the virtual object during the landing process.
  • During the landing of the air-drop virtual prop, if it collides directly with a virtual object, that virtual object's life value drops to 0, but the air-drop virtual prop is not triggered and continues to fall; only when it collides with a virtual obstacle is it triggered, generating a burning area and smoke. The life value of any virtual object entering the burning area is reduced, and the smoke can block the line of sight of virtual objects.
  • the airdrop virtual props will also produce trailing smoke during the falling process.
  • Step 1409 the life value of the virtual object is reduced to 0.
  • step 1410 the airdrop virtual props continue to fall.
  • Step 1411 whether the airdrop virtual prop collides with a virtual obstacle during the falling process.
  • step 1412 the airdrop virtual prop is triggered and generates a burning area and smoke.
  • The burning area is the prop action range described in the above embodiments.
  • Step 1413 whether the virtual object enters the burning area.
  • Step 1414 reducing the life value of the virtual object.
  • FIG. 15 is a structural block diagram of an apparatus for controlling virtual objects to use virtual props provided by an exemplary embodiment of the present application.
  • the apparatus includes:
  • the first display module 1501 is configured to display a throwing route setting control in response to a triggering operation on a target item control, the target item control is a usage control corresponding to an airdrop virtual item, and the throwing route setting control displays a virtual environment map ;
  • the second display module 1502 is configured to display the target throwing route corresponding to the air-drop virtual prop in the virtual environment map in response to a gesture operation on the throwing route setting control, where the gesture operation includes a first operation position and a second operating position, the target throwing route passes through the first operating position and the second operating position;
  • the third display module 1503 is configured to display the air-drop virtual props thrown along the target throwing route in the virtual environment, and the air-drop virtual props are used to change the attribute value of the virtual object.
  • the third display module 1503 includes:
  • mapping unit configured to determine the actual throwing route corresponding to the target throwing route in the virtual environment based on the position mapping relationship between the virtual environment map and the virtual environment;
  • a first throwing unit configured to throw the air-drop virtual prop in the virtual environment according to the actual throwing route
  • a first display unit configured to display the air-dropped virtual props thrown along the actual throwing route in the virtual environment.
  • Optionally, the mapping unit is further configured to: obtain the route start point and the route end point of the target throwing route in the virtual environment map; based on the position mapping relationship, determine the first position coordinates of the route start point in the virtual environment and the second position coordinates of the route end point in the virtual environment; and determine the actual throwing route in the virtual environment according to the first position coordinates and the second position coordinates.
  • the device further includes:
  • a determining module configured to determine the actual throwing route corresponding to the target throwing route in the virtual environment based on the position mapping relationship between the virtual environment map and the virtual environment;
  • a sending module configured to report the actual throwing route to a server, and the server is configured to throw the air-drop virtual item in the virtual environment according to the actual throwing route.
  • the second display module 1502 includes:
  • a first obtaining unit, configured to, in response to a first operation signal and a second operation signal in the virtual environment map, obtain the first operation position corresponding to the first operation signal and the second operation position corresponding to the second operation signal;
  • a first determining unit, configured to determine a candidate throwing route based on the first operation position and the second operation position, and display the candidate throwing route in the virtual environment map;
  • a second determining unit configured to, in response to the disappearance of the first operation signal and the second operation signal, determine the target throwing route according to the first operation position and the second operation position at the moment when the signals disappear, and The target throwing route is displayed in the virtual environment map.
  • the device further includes:
  • a first obtaining module configured to obtain the geographic location of each of the virtual objects in the virtual environment, where the virtual environment includes a first virtual object and a second virtual object, the first virtual object and the second virtual object Virtual objects belong to different camps;
  • the fourth display module is configured to display virtual object identifiers in the virtual environment map based on the geographic location, wherein the virtual objects belonging to different camps correspond to different virtual object identifiers.
  • the third display module 1503 includes:
  • a second obtaining unit configured to obtain the distribution quantity of the second virtual object on the target throwing route
  • a second throwing unit configured to throw the air-dropped virtual props in the virtual environment according to the distributed quantity, wherein the thrown quantity of the air-dropped virtual props is positively correlated with the distributed quantity
  • the second display unit is configured to display the air-drop virtual props thrown along the target throwing route in the virtual environment.
  • the device further includes:
  • a second acquiring module configured to acquire the number of second virtual objects defeated by the first virtual object within the target time period, where the first virtual object and the second virtual object belong to different camps;
  • a first setting module configured to set the target prop control to a non-triggerable state in response to the quantity being lower than the quantity threshold
  • a second setting module configured to set the target prop control to a triggerable state in response to the quantity being higher than the quantity threshold.
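  • A minimal sketch of this activation rule (the threshold of 10 defeats and the 10-minute window are illustrative values drawn from examples elsewhere in this description, not a normative choice):

```python
import time

class TargetPropControl:
    """The control becomes triggerable only when the first virtual object has
    defeated enough second virtual objects within the target time period."""

    def __init__(self, quantity_threshold=10, window_seconds=600):
        self.quantity_threshold = quantity_threshold
        self.window_seconds = window_seconds
        self.defeat_times = []         # timestamps of defeats of second virtual objects
        self.triggerable = False

    def record_defeat(self, now=None):
        now = now if now is not None else time.time()
        self.defeat_times.append(now)
        # keep only defeats inside the target time period
        self.defeat_times = [t for t in self.defeat_times
                             if now - t <= self.window_seconds]
        self.triggerable = len(self.defeat_times) >= self.quantity_threshold
```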
  • the third display module 1503 further includes:
  • a third acquiring unit configured to acquire throwing attribute information corresponding to the air-drop virtual item, where the throwing attribute information includes at least one of a preset throwing distance and a preset throwing quantity;
  • a third throwing unit configured to throw the air-drop virtual prop in the virtual environment according to the throwing attribute information and the target throwing route
  • a third display unit configured to display the air-drop virtual props thrown along the target throwing route in the virtual environment.
  • the throwing attribute information is the preset throwing distance
  • the third throwing unit is also used for:
  • determine, according to the preset throwing distance and the route length corresponding to the target throwing route, the target throwing quantity corresponding to the air-drop virtual props, where the target throwing quantity is positively correlated with the route length of the target throwing route;
  • the air-dropped virtual props of the target throwing quantity are thrown in the virtual environment according to the target throwing route.
  • the throwing attribute information is the preset throwing quantity
  • the third throwing unit is also used for:
  • determine, according to the preset throwing quantity and the route length corresponding to the target throwing route, the target throwing distance corresponding to the air-drop virtual props, where the target throwing distance is positively correlated with the route length corresponding to the target throwing route;
  • the air-drop virtual props are thrown in the virtual environment according to the target throwing distance.
  • the throwing attribute information is the preset throwing quantity and the preset throwing distance
  • the third throwing unit is also used for:
  • determine, according to the route length corresponding to the target throwing route, the preset throwing quantity and the preset throwing distance, the target unit throwing quantity corresponding to the air-drop virtual props, where the target unit throwing quantity is negatively correlated with the route length corresponding to the target throwing route;
  • the air-drop virtual props are thrown in the virtual environment according to the throwing quantity of the target unit and the preset throwing distance.
  • the device further includes:
  • a fifth display module, configured to display the prop action range in response to a collision between the air-drop virtual prop and a virtual obstacle, the prop action range being a circular area centered on the collision point of the air-drop virtual prop with a preset distance as its radius;
  • a control module configured to change the attribute value of the virtual object in response to the virtual object being located within the scope of action of the prop.
  • To sum up, by introducing air-drop virtual props and letting the user plan a throwing route on the virtual environment map through a gesture operation, the air-drop virtual props can be thrown along the target throwing route.
  • Compared with the related art in which virtual props can only be thrown at a fixed point, the air-drop virtual props provided by the embodiments of the present application can be thrown along a specified route, which enlarges the throwing range, makes it harder for other virtual objects to dodge, and thereby improves the hit rate of the virtual props.
  • In addition, when some virtual objects adopt camping or long-range attack strategies, the air-drop virtual props can be used to carry out long-range, large-scale attacks on such virtual objects and improve the hit rate against them; this speeds up the progress of a match, effectively controls the duration of a single match, and thereby reduces the processing pressure on the server.
  • FIG. 16 shows a structural block diagram of a terminal 1600 provided by an exemplary embodiment of the present application.
  • the terminal 1600 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 player, and an MP4 player.
  • Terminal 1600 may also be referred to as user equipment, portable terminal, or other names.
  • the terminal 1600 includes: a processor 1601 and a memory 1602 .
  • the processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • The processor 1601 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
  • the processor 1601 may also include a main processor and a coprocessor.
  • The main processor is a processor used to process data in the awake state, also called a central processing unit (CPU); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1601 may be integrated with a graphics processor (Graphics Processing Unit, GPU), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1601 may further include an artificial intelligence (Artificial Intelligence, AI) processor for processing computing operations related to machine learning.
  • Memory 1602 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more disk storage devices, flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, where the at least one instruction is used to be executed by the processor 1601 to implement the methods provided by the embodiments of the present application.
  • the terminal 1600 may also optionally include: a peripheral device interface 1603 and at least one peripheral device.
  • the peripheral device includes: at least one of a radio frequency circuit 1604 , a touch display screen 1605 , a camera assembly 1606 , an audio circuit 1607 , a positioning assembly 1608 and a power supply 1609 .
  • terminal 1600 also includes one or more sensors 1610 .
  • the one or more sensors 1610 include, but are not limited to, an acceleration sensor 1611 , a gyro sensor 1612 , a pressure sensor 1613 , a fingerprint sensor 1614 , an optical sensor 1615 , and a proximity sensor 1616 .
  • The structure shown in FIG. 16 does not constitute a limitation on the terminal 1600; the terminal may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
  • Embodiments of the present application further provide a computer-readable storage medium storing at least one instruction, where the at least one instruction is loaded and executed by a processor to implement the method for controlling a virtual object to use a virtual prop described in the above embodiments.
  • Embodiments of the present application further provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium.
  • the processor of the terminal reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the terminal executes the method for controlling the virtual object to use the virtual prop provided in various optional implementation manners of the above aspect.
  • Computer-readable storage media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.

Abstract

A method, apparatus, terminal and medium for controlling a virtual object to use a virtual prop, belonging to the technical field of virtual scenes. The method includes: in response to a triggering operation on a target prop control, displaying a throwing route setting control, the target prop control being the use control corresponding to an air-drop virtual prop, and the throwing route setting control showing a virtual environment map; in response to a gesture operation on the throwing route setting control, displaying in the virtual environment map the target throwing route corresponding to the air-drop virtual prop, the gesture operation including a first operation position and a second operation position, and the target throwing route passing through the first operation position and the second operation position; and displaying, in the virtual environment, the air-drop virtual props thrown along the target throwing route. The method enlarges the throwing range of the virtual prop, making the throwing range difficult for other virtual objects to dodge, thereby improving the hit rate of the virtual prop.

Description

Method, apparatus, terminal and medium for controlling a virtual object to use a virtual prop
This application claims priority to Chinese Patent Application No. 202010983118.6, entitled "Method, apparatus, terminal and medium for controlling a virtual object to use a virtual prop", filed on September 17, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the technical field of virtual scenes, and in particular to a method, apparatus, terminal and medium for controlling a virtual object to use a virtual prop.
Background
A first-person shooting game (FPS) is an application based on a three-dimensional virtual environment. A user can control a virtual object in the virtual environment to walk, run, climb, shoot and perform other actions, and multiple users can team up online to cooperatively complete a task in the same virtual environment.
In the related art, a virtual object can be equipped with a throwable virtual prop (for example, a grenade) before a match begins. Accordingly, the user can control the virtual object to use the throwable virtual prop against a target, and the process of initiating damage is as follows: tap the virtual prop control, determine a throwing position, and control the virtual object to throw the virtual prop to that position.
However, the throwable virtual props provided in the related art all require the virtual object to perform the throw, and each throw can only be made to a single fixed position, with a certain time interval between the throw and the prop's landing. Because of this fixed-point usage, the action range of a throwable virtual prop is small, making it easy to spot and dodge, so the hit rate of throwable virtual props is low.
发明内容
本申请实施例提供了一种控制虚拟对象使用虚拟道具的方法、装置、终端及介质,能够丰富虚拟道具的种类,且使用该虚拟道具可以改变目标投掷路线上各个虚拟对象的属性值,提高虚拟道具的命中率。所述技术方案如下:
一方面,本申请实施例提供了一种控制虚拟对象使用虚拟道具的方法,所述方法应用于终端,所述方法包括:
响应于对目标道具控件的触发操作,显示投掷路线设置控件,所述目标道具控件为空投类虚拟道具对应的使用控件,所述投掷路线设置控件展示有虚拟环境地图;
响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,所述手势操作包括第一操作位置和第二操作位置,所述目标投掷路线经过所述第一操作位置和所述第二操作位置;
在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,所述空投类虚拟道具用于改变虚拟对象的属性值。
另一方面,本申请实施例提供了一种控制虚拟对象使用虚拟道具的装置,所述装置包括:
第一显示模块,用于响应于对目标道具控件的触发操作,显示投掷路线设置控件,所述目标道具控件为空投类虚拟道具对应的使用控件,所述投掷路线设置控件展示有虚拟环境地图;
第二显示模块,用于响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,所述手势操作包括第一操作位置和第二操作位置,所述目标投掷路线经过所述第一操作位置和所述第二操作位置;
第三显示模块,用于在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,所述空投类虚拟道具用于改变虚拟对象的属性值。
另一方面,本申请实施例提供了一种终端,所述终端包括处理器和存储器,所述存储器 中存储有至少一段程序,所述至少一段程序由所述处理器加载并执行以实现如上述方面所述的控制虚拟对象使用虚拟道具的方法。
另一方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一段程序,所述至少一段程序由处理器加载并执行以实现如上述方面所述的控制虚拟对象使用虚拟道具的方法。
另一方面,本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。终端的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该终端执行上述方面的各种可选实现方式中提供的控制虚拟对象使用虚拟道具的方法。
本申请实施例提供的技术方案的有益效果至少包括:
本申请实施例中,通过在虚拟道具中引入空投类虚拟道具,且用户可以通过手势操作在虚拟环境地图中规划出一条投掷路线,使得空投类虚拟道具可以沿目标投掷路线进行投掷,相比于相关技术中仅能定点投掷虚拟道具,本申请实施例提供的空投类虚拟道具可以沿指定路线投掷,一方面扩大了虚拟道具的投掷范围,使得虚拟道具的投掷范围不易被其他虚拟对象躲避,从而提高了虚拟道具的命中率;另一方面,当某些虚拟对象采取蹲守或远程攻击策略时,使用该空投类虚拟道具可以对这类虚拟对象进行远程且大范围的攻击,提高对这类虚拟对象的命中率,从而加快对局进程,并有效控制单局时长,进而降低服务器的处理压力。
Brief Description of the Drawings
FIG. 1 shows a schematic architectural diagram of a computer system provided by an embodiment of this application;
FIG. 2 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by an exemplary embodiment of this application;
FIG. 3 shows a schematic diagram of a process of controlling a virtual object to use a virtual prop according to an exemplary embodiment of this application;
FIG. 4 shows a schematic diagram of a process of determining a target throwing route according to a first operation position and a second operation position;
FIG. 5 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of this application;
FIG. 6 shows a schematic diagram of a prop equipment interface for an air-drop virtual prop according to an exemplary embodiment of this application;
FIG. 7 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of this application;
FIG. 8 shows a schematic diagram of a process of displaying the positions of virtual objects according to an exemplary embodiment of this application;
FIG. 9 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of this application;
FIG. 10 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throwing distance and a target throwing route according to an exemplary embodiment of this application;
FIG. 11 shows a schematic diagram of throwing air-drop virtual props in a virtual environment according to a preset throwing quantity and a target throwing route according to an exemplary embodiment of this application;
FIG. 12 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of this application;
FIG. 13 shows a schematic diagram of a throwing process of air-drop virtual props according to an exemplary embodiment of this application;
FIG. 14 shows a flowchart of a method for controlling a virtual object to use a virtual prop provided by another exemplary embodiment of this application;
FIG. 15 is a structural block diagram of an apparatus for controlling a virtual object to use a virtual prop provided by an exemplary embodiment of this application;
FIG. 16 shows a structural block diagram of a terminal provided by an exemplary embodiment of this application.
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
首先,对本申请实施例中涉及的名词进行介绍:
虚拟环境:是应用程序在终端上运行时显示(或提供)的虚拟环境。该虚拟环境可以是对真实世界的仿真环境,也可以是半仿真半虚构的环境,还可以是纯虚构的环境。虚拟环境可以是二维虚拟环境、2.5维虚拟环境和三维虚拟环境中的任意一种,本申请对此不加以限定。下述实施例以虚拟环境是三维虚拟环境为例进行说明。
虚拟对象:是指虚拟环境中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物等,比如:在三维虚拟环境中显示的人物、动物。可选地,虚拟对象是基于动画骨骼技术创建的三维立体模型。每个虚拟对象在三维虚拟环境中具有自身的形状和体积,占据三维虚拟环境中的一部分空间。
射击游戏:包括第一人称射击游戏和第三人称射击游戏。其中,第一人称射击游戏是指用户能够以第一人称视角进行的射击游戏,游戏中的虚拟环境的画面是以第一虚拟对象的视角对虚拟环境进行观察的画面。第三人称射击游戏则是指通过第三人称视角进行的射击游戏,游戏中的虚拟环境的画面是以第三人称视角(比如位于第一虚拟对象的头部后方)对虚拟环境进行观察的画面。
在游戏中,至少两个虚拟对象在虚拟环境中进行单局对战模式,虚拟对象通过躲避其他虚拟对象发起的伤害和虚拟环境中存在的危险(比如,毒气圈、沼泽地等)来达到在虚拟环境中存活的目的,当虚拟对象在虚拟环境中的生命值为零时,虚拟对象在虚拟环境中的生命结束,最后存活在虚拟环境中的虚拟对象是获胜方。可选地,该对战以第一个客户端加入对战的时刻作为开始时刻,以最后一个客户端退出对战的时刻作为结束时刻,每个客户端可以控制虚拟环境中的一个或多个虚拟对象。可选地,该对战的竞技模式可以包括单人对战模式、双人小组对战模式或者多人大组对战模式,本申请实施例对对战模式不加以限定。
虚拟道具:是指虚拟对象在虚拟环境中能够使用的道具,包括能够改变其他虚拟对象的属性值的虚拟武器,子弹等补给道具,盾牌、盔甲、装甲车等防御道具,虚拟光束、虚拟冲击波等用于虚拟对象释放技能时通过手部展示的虚拟道具,以及虚拟对象的部分身体躯干,比如手部、腿部。其中,能够改变其他虚拟对象的属性值的虚拟道具,包括手枪、步枪、狙击枪等远距离虚拟道具,匕首、刀、剑、绳索等近距离虚拟道具,飞斧、飞刀、手榴弹、闪光弹、烟雾弹等投掷类虚拟道具。
请参考图1,其示出了本申请一个实施例提供的计算机系统的架构示意图。该计算机系统可以包括:第一终端110、服务器120和第二终端130。
第一终端110运行有支持虚拟环境的应用程序111,该应用程序111可以是多人在线对战程序。当第一终端运行应用程序111时,第一终端110的屏幕上显示应用程序111的用户界面。该应用程序111可以是军事仿真程序、多人在线战术竞技(Multiplayer Online Battle Arena,MOBA)游戏、大逃杀射击游戏、模拟战略游戏(Simulation Game,SLG)的任意一种。在本实施例中,以该应用程序111是FPS游戏来举例说明。第一终端110是第一用户112使用的终端,第一用户112使用第一终端110控制位于虚拟环境中的第一虚拟对象进行活动,第一虚拟对象可以称为第一用户112的主控虚拟对象。第一虚拟对象的活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、飞行、跳跃、驾驶、拾取、射击、攻击、投掷、释放技能中的至少一种。示意性的,第一虚拟对象是第一虚拟人物,比如仿真人物或动漫人物。
第二终端130运行有支持虚拟环境的应用程序131,该应用程序131可以是多人在线对战程序。当第二终端130运行应用程序131时,第二终端130的屏幕上显示应用程序131的用户界面。该客户端可以是军事仿真程序、MOBA游戏、大逃杀射击游戏、SLG游戏中的任意一种,在本实施例中,以该应用程序131是FPS游戏来举例说明。第二终端130是第二用户132使用的终端,第二用户132使用第二终端130控制位于虚拟环境中的第二虚拟对象进行活动,第二虚拟对象可以称为第二用户132的主控虚拟角色。示意性的,第二虚拟对象是第二虚拟人物,比如仿真人物或动漫人物。
可选地,第一虚拟对象和第二虚拟对象处于同一虚拟世界中。可选地,第一虚拟对象和第二虚拟对象可以属于同一个阵营、同一个队伍、同一个组织、具有好友关系或具有临时性的通讯权限。可选的,第一虚拟对象和第二虚拟对象可以属于不同的阵营、不同的队伍、不同的组织或具有敌对关系。
可选地,第一终端110和第二终端130上运行的应用程序是相同的,或两个终端上运行的应用程序是不同操作系统平台(安卓或IOS)上的同一类型应用程序。第一终端110可以泛指多个终端中的一个,第二终端130可以泛指多个终端中的另一个,本实施例仅以第一终端110和第二终端130来举例说明。第一终端110和第二终端130的设备类型相同或不同,该设备类型包括:智能手机、平板电脑、电子书阅读器、动态影像专家压缩标准音频层面3(Moving Picture Experts Group Audio Layer III,MP3)播放器、动态影像专家压缩标准音频层面4(Moving Picture Experts Group Audio Layer IV,MP4)播放器、膝上型便携计算机和台式计算机中的至少一种。
图1中仅示出了两个终端,但在不同实施例中存在多个其它终端可以接入服务器120。可选地,还存在一个或多个终端是开发者对应的终端,在该终端上安装有支持虚拟环境的应用程序的开发和编辑平台,开发者可在该终端上对应用程序进行编辑和更新,并将更新后的应用程序安装包通过有线或无线网络传输至服务器120,第一终端110和第二终端130可从服务器120下载应用程序安装包实现对应用程序的更新。
第一终端110、第二终端130以及其它终端通过无线网络或有线网络与服务器120相连。
服务器120包括一台服务器、多台服务器组成的服务器集群、云计算平台和虚拟化中心中的至少一种。服务器120用于为支持三维虚拟环境的应用程序提供后台服务。可选地,服务器120承担主要计算工作,终端承担次要计算工作;或者,服务器120承担次要计算工作,终端承担主要计算工作;或者,服务器120和终端之间采用分布式计算架构进行协同计算。
在一个示意性的例子中,服务器120包括存储器121、处理器122、用户账号数据库123、对战服务模块124、面向用户的输入/输出接口(Input/Output Interface,I/O接口)125。其中,处理器122用于加载服务器120中存储的指令,处理用户账号数据库123和对战服务模块124中的数据;用户账号数据库123用于存储第一终端110、第二终端130以及其它终端所使用的用户账号的数据,比如用户账号的头像、用户账号的昵称、用户账号的战斗力指数,用户账号所在的服务区;对战服务模块124用于提供多个对战房间供用户进行对战,比如1V1对战、3V3对战、5V5对战等;面向用户的I/O接口125用于通过无线网络或有线网络和第一终端110和/或第二终端130建立通信交换数据。
请参考图2,其示出了本申请一个示例性实施例提供的控制虚拟对象使用虚拟道具的方法的流程图。本实施例以该方法用于图1所示实施环境中的第一终端110或第二终端130或该实施环境中的其它终端为例进行说明,该方法包括如下步骤:
步骤201,响应于对目标道具控件的触发操作,显示投掷路线设置控件,目标道具控件为空投类虚拟道具对应的使用控件,投掷路线设置控件展示有虚拟环境地图。
其中,空投类虚拟道具指示可以沿预设投掷路线进行攻击的虚拟道具,该预设投掷路线由用户通过手势操作确定。
在一种可能的实施方式中,当虚拟对象装备该空投类虚拟道具,并进入对局后,即会在用户界面中显示该空投类虚拟道具对应的目标道具控件,用户即可通过触发该目标道具控件来控制虚拟对象使用该空投类虚拟道具。
本申请实施例的方法应用于虚拟环境中,虚拟环境中包括第一虚拟对象和第二虚拟对象,第一虚拟对象和第二虚拟对象属于不同阵营。在一种可能的实施方式中,终端通过虚拟环境画面显示虚拟环境。可选地,虚拟环境画面是以虚拟对象的视角对虚拟环境进行观察的画面。视角是指以虚拟对象的第一人称视角或者第三人称视角在虚拟环境中进行观察时的观察角度。可选地,本申请的实施例中,视角是在虚拟环境中通过摄像机模型对虚拟对象进行观察 时的角度。
可选地,摄像机模型在虚拟环境中对虚拟对象进行自动跟随,即,当虚拟对象在虚拟环境中的位置发生改变时,摄像机模型跟随虚拟对象在虚拟环境中的位置同时发生改变,且该摄像机模型在虚拟环境中始终处于虚拟对象的预设距离范围内。可选地,在自动跟随过程中,摄像头模型和虚拟对象的相对位置不发生变化。
摄像机模型是指在虚拟环境中位于虚拟对象周围的三维模型,当采用第一人称视角时,该摄像机模型位于虚拟对象的头部附近或者位于虚拟对象的头部;当采用第三人称视角时,该摄像机模型可以位于虚拟对象的后方并与虚拟对象进行绑定,也可以位于与虚拟对象相距预设距离的任意位置,通过该摄像机模型可以从不同角度对位于虚拟环境中的虚拟对象进行观察,可选地,该第三人称视角为第一人称的过肩视角时,摄像机模型位于虚拟对象(比如,虚拟人物的头肩部)的后方。可选地,除第一人称视角和第三人称视角外,视角还包括其他视角,比如俯视视角;当采用俯视视角时,该摄像机模型可以位于虚拟对象头部的上空,俯视视角是以从空中俯视的角度进行观察虚拟环境的视角。可选地,该摄像机模型在虚拟环境中不会进行实际显示,即,在用户界面显示的虚拟环境中不显示该摄像机模型。
对该摄像机模型位于与虚拟对象相距预设距离的任意位置为例进行说明,可选地,一个虚拟对象对应一个摄像机模型,该摄像机模型可以以虚拟对象为旋转中心进行旋转,如:以虚拟对象的任意一点为旋转中心对摄像机模型进行旋转,摄像机模型在旋转过程中的不仅在角度上有转动,还在位移上有偏移,旋转时摄像机模型与该旋转中心之间的距离保持不变,即,将摄像机模型在以该旋转中心作为球心的球体表面进行旋转,其中,虚拟对象的任意一点可以是虚拟对象的头部、躯干、或者虚拟对象周围的任意一点,本申请实施例对此不加以限定。可选地,摄像机模型在对虚拟对象进行观察时,该摄像机模型的视角的中心指向为该摄像机模型所在球面的点指向球心的方向。
可选地,该摄像机模型还可以在虚拟对象的不同方向以预设的角度对虚拟对象进行观察。可选地,第一虚拟对象是用户通过终端控制的虚拟对象,第二虚拟对象包括其他用户所控制的虚拟对象和后台服务器控制的虚拟对象中的至少一种,且第一虚拟对象与第二虚拟对象属于不同阵营。
不同于相关技术中的定点投掷方式,本申请实施例提供的空投类虚拟道具可以对指定路线进行连续投掷,因此,在一种可能的实施方式中,当终端接收到对目标道具控件的触发操作时,会在当前用户界面中显示投掷路线设置控件,通过该投掷路线设置控件来展示虚拟环境地图,以便用户在该虚拟环境地图中设置该空投类虚拟道具的目标投掷路线。
其中,用户对该目标道具控件的触发操作可以是点击操作、长按操作、双击操作等,本申请实施例对此不构成限定。
如图3所示,其示出了本申请一个示例性实施例示出的控制虚拟对象使用虚拟道具的过程示意图,当进入对局后,在用户界面中显示有虚拟环境画面301和目标道具控件302,当用户点击该目标道具控件302后,终端即接收到对该目标道具控件302的触发操作,则在当前用户界面上层显示投掷路线设置控件303,该投掷路线设置控件303用于显示虚拟环境地图。可选的,该虚拟环境地图中显示有虚拟对象标识。
步骤202,响应于对投掷路线设置控件的手势操作,在虚拟环境地图中显示空投类虚拟道具对应的目标投掷路线,手势操作包括第一操作位置和第二操作位置,目标投掷路线经过第一操作位置和第二操作位置。
其中,该手势操作可以是单指滑动操作、双指滑动操作、双指点击操作、双指长按操作等,只需要根据该手势操作确定出两个操作位置即可,本申请实施例对手势操作的类型不构成限定。
在一种可能的实施方式中,用户需要在显示有虚拟环境地图的投掷路线设置控件中进行手势操作,对应的终端接收到对投掷路线设置控件的手势操作,确定出该手势操作作用的第 一操作位置和第二操作位置,即可以根据该第一操作位置和第二操作位置确定出目标投掷路线。
可选的,为了使得用户可以明确自己规划的目标投掷路线,可以基于手势操作所指示的目标投掷路线显示在虚拟环境地图中。示意性的,该目标投掷路线的显示形式可以是线段形式、单箭头形式或双箭头形式。
在一个示例性的例子中,如图3所示,以单指滑动操作为例,当用户在投掷路线设置控件303中进行滑动操作,即由第一操作位置304滑动至第二操作位置305,则终端确定第一操作位置304和第二操作位置305,则可以将线段306确定为目标投掷路线。
针对根据第一操作位置和第二操作位置确定目标投掷路线时,可以直接将第一操作位置和第二操作位置之间的路线确定为目标投掷路线,也可以将虚拟环境地图中经过第一操作位置和第二操作位置的直线确定为目标投掷路线,本申请实施例对此不构成限定。
在一个示例性的例子中,如图4所示,其示出了根据第一操作位置和第二操作位置确定目标投掷路线的过程示意图。在投掷路线设置控件401上显示有虚拟环境地图402,当终端接收到对虚拟环境地图的手势操作后,确定出第一操作位置403和第二操作位置404后,可以将第一操作位置403和第二操作位置404之间的路线确定为目标投掷路线,或位置405和位置406之间的路线确定为目标投掷路线(其中,位置405和位置406为经过第一操作位置403和第二操作位置404的直线与虚拟环境地图边界交界的位置),或将第一操作位置403和位置406之间路线确定为目标投掷路线;或将第二操作位置404和位置405之间的路线确定为目标投掷路线,本申请实施例对目标投掷路线不构成限定。
可选的,终端可以基于用户手势操作作用的第一操作位置和第二操作位置确定出至少一条候选投掷路线,再由用户从多条候选投掷路线中选择出目标投掷路线。示意性的,以图4为例,若终端确定出的候选投掷路线包括:第一操作位置403~第二操作位置404、第一操作位置403~位置406、位置405~位置406、位置405~第二操作位置404,可以在当前用户界面上层显示提示信息,以提示用户从多条候选投掷路线中选择出目标投掷路线;可选的,终端可以在提示框中显示各个候选投掷路线对应的选择控件,当终端接收到对目标选择控件的触发操作时,则将该目标选择控件对应的候选投掷路线确定为目标投掷路线。
步骤203,在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具,空投类虚拟道具用于改变虚拟对象的属性值。
在一种可能的实施方式中,当用户在投掷路线设置控件中规划好目标投掷路线后,对应终端确定目标投掷路线在虚拟环境地图中的位置,即可以根据虚拟环境地图和虚拟环境中各个位置的映射关系,按照该目标投掷路线在虚拟环境中投掷空投类虚拟道具,对应虚拟环境中显示有沿目标投掷路线投掷的空投类虚拟道具,该空投类虚拟道具用于改变该目标投掷路线上各个虚拟对象的属性值。
需要说明的是,在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具时,虚拟环境中可以不显示该目标投掷路线,当空投类虚拟道具的投掷位置位于目标投掷路线在虚拟环境中对应的实际投掷路线上,则表示该虚拟环境中显示有沿目标投掷路线投掷的空投类虚拟道具;当空投类虚拟道具对应至少两个投掷位置之间的连线,位于所述目标投掷路线在虚拟环境中对应实际投掷路线上,则表示该虚拟环境中显示有沿目标投掷路线投掷的空投类虚拟道具。
可选的,当终端确定出目标投掷路线后,也可以将目标投掷路线上报给服务器,由服务器基于目标投掷路线在虚拟环境中投掷空投类虚拟道具,并将空投类虚拟道具的投掷信息反馈给终端,使得终端可以在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具。
可选的,也可以由终端和服务器协同完成空投类虚拟道具的投掷,由终端将目标投掷路线上报给服务器,由服务器对该目标投掷路线进行校验,并在校验通过后,允许终端沿该目标投掷路线投掷空投类虚拟道具,对应的,终端在接收到校验通过指令后,基于所述目标投 掷路线在虚拟环境中投掷空投类虚拟道具,并将空投类虚拟道具的投掷信息反馈至服务器,由服务器将该投掷信息转发至其他终端。
其中,该属性值可以为虚拟对象的生命值、防御值、攻击力、速度等。
可选的,可以由虚拟载物道具装载该空投类虚拟道具,并按照目标投掷路线进行投掷。其中,虚拟载物道具可以为飞机、热气球等。
示意性的,如图3所示,当用户确定出目标投掷路线后则投掷路线设置控件收起或消失,虚拟载物道具307出现在目标投掷路线在虚拟环境中指示的投掷路线起点,并沿投掷路线投掷空投类虚拟道具308。
综上所述,通过在虚拟道具中引入空投类虚拟道具,且用户可以通过手势操作在虚拟环境地图中规划出一条投掷路线,使得空投类虚拟道具可以沿目标投掷路线进行投掷,相比于相关技术中仅能定点投掷虚拟道具,本申请实施例提供的空投类虚拟道具可以沿指定路线投掷,一方面扩大了虚拟道具的投掷范围,使得虚拟道具的投掷范围不易被其他虚拟对象躲避,从而提高了虚拟道具的命中率;另一方面,当某些虚拟对象采取蹲守或远程攻击策略时,使用该空投类虚拟道具可以对这类虚拟对象进行远程且大范围的攻击,提高对这类虚拟对象的命中率,从而加快对局进程,并有效控制单局时长,进而降低服务器的处理压力。
由于空投类虚拟道具属于连杀技能,即虚拟对象的连续击杀分数(或数量)达到预设分数阈值(或数量阈值)后,该空投类虚拟道具才可以使用,因此,当用户为虚拟对象装备该空投类虚拟道具,并进入对局后,虽然用户界面中显示有空投类虚拟道具对应的目标道具控件,但是该目标道具控件被设置为不可触发状态,只有在该虚拟对象的连续击杀分数满足预设分数阈值后,该目标道具控件才会处于可触发状态,即虚拟对象可以使用空投类虚拟道具。
请参考图5,其示出了本申请另一个示例性实施例提供的控制虚拟对象使用虚拟道具的方法的流程图。本实施例以该方法用于图1所示实施环境中的第一终端110或第二终端130或该实施环境中的其它终端为例进行说明,该方法包括如下步骤:
步骤501,获取第一虚拟对象在目标时间段内击败的第二虚拟对象的数量,第一虚拟对象和第二虚拟对象属于不同阵营。
可选的,由于空投类虚拟道具可以对较大范围内的虚拟对象产生伤害,具备较大攻击力,因此,将空投类虚拟道具设置为连杀技能武器、连杀技能道具或连续得分道具,对应,空投类虚拟道具具备使用条件,也即被控虚拟对象需要满足该使用条件,才可以使用该空投类虚拟道具。
可选的,基于空投类虚拟道具所属的道具种类,空投类虚拟道具的使用条件可以是第一虚拟对象(被控虚拟对象)在目标时间段内击败第二虚拟对象的数量满足数量阈值;也可以是第一虚拟对象在目标时间段内击败第二虚拟对象所得到的击败分数满足分数阈值。
其中,目标时间段可以由开发人员进行设置,示意性的,目标时间段可以是10min,也就是说,获取第一虚拟对象在任意连续10min内击败第二虚拟对象的数量,进而确定击败第二虚拟对象的数量是否可以满足空投类虚拟道具的使用条件。
在一种可能的实施方式中,终端在显示具有虚拟环境画面的用户界面之前,预先显示有道具装备界面,在该道具装备界面中,用户可以选择本次对局所需要携带的虚拟道具,在本申请实施例中,在道具装备界面中提供有连续得分道具界面,该连续得分道具界面中显示有至少一种连续得分道具,用户可以在该连续得分道具界面中选择空投类虚拟道具,点击装备控件后,进入对局,则用户界面中会显示该空投类虚拟道具对应的目标道具控件。
在一个示例性的例子中,如图6所示,其示出了本申请一个示例性实施例示出的空投类虚拟道具的道具装备界面示意图,在连续得分道具界面中,显示有道具选择栏601,该道具选择栏中包括至少一个连续得分道具对应的道具选择控件602,比如,无人先锋机、霹雳弹(即本申请实施例提供的空投类虚拟道具)、攻击直升机等,且每个连续得分道具中均显示有该道具的使用条件(即预设分数),比如,无人先锋机对应的是预设分数为750,即虚拟对象 若装备该技能,则需要在击杀分数达到700后,才可以使用该技能。当用户点击霹雳弹道具对应的道具选择控件602后,则在连续得分道具界面中显示该霹雳弹道具对应的道具介绍,即霹雳弹604对应的所需击杀分数(950)和功能(对指定路线进行爆炸物轰炸);当用户点击装备控件603后,即表示为虚拟对象装备霹雳弹道具。
由于空投类虚拟道具具有一定的使用条件,即进入对局后的预设时间段内用户需要控制第一虚拟对象击败一定数量的第二虚拟对象后,或击败第二虚拟对象获取到的分数达到一定值后,该目标道具控件才会变为可触发状态,即可以使用空投类虚拟道具,因此,在一种可能的实施方式中,当装备有空投类虚拟道具的第一虚拟对象进入对局后,终端会实时获取第一虚拟对象击败第二虚拟对象的数量,或击败第二虚拟对象后得到的分数,来确定目标道具控件的设置状态。
步骤502,响应于数量高于数量阈值,将目标道具控件设置为可触发状态。
可选的,对于连续得分道具,每个连续得分道具均设置有不同的使用条件,可以是击败数量,也可以是击败得分,比如,空投类虚拟道具对应的使用条件可以是连续击败一定数量(达到数量阈值)的虚拟对象,或通过连续击败虚拟对象得到一定分数(达到分数阈值),因此,当终端获取到第一虚拟对象击败第二虚拟对象的数量时,即将该数量与数量阈值进行比较,通过比较结果确定是否满足空投类虚拟道具的使用条件,进而基于条件判断结果来确定目标道具控件对应的设置状态。
针对空投类虚拟道具的使用规则,当用户控制第一虚拟对象刚进入对局,或目标时间段内第一虚拟对象击败第二虚拟对象的数量低于数量阈值时(也即不满足空投类虚拟道具对应的使用条件),终端会将该目标道具控件设置为不可触发状态,直至数量满足数量阈值。
其中,空投类虚拟道具对应的数量阈值可以是10,即当第一虚拟对象击败10个第二虚拟对象后,虚拟对象才可以使用空投类虚拟道具。
对应的,当终端确定出第一虚拟对象击败第二虚拟对象的数量达到或超过数量阈值后,可以将目标道具控件设置为可触发状态。
其中,不可触发状态可以是目标道具控件对应的图标为灰色、或黑色,则对应的可触发状态可以是目标道具控件对应的图标为高亮显示。
可选的,当用户控制第一虚拟对象击败第二虚拟对象时,可能会获取到一定分数,对应也可以设置分数阈值,设置空投类虚拟道具的使用条件为:第一虚拟对象在目标时间内获取到的分数高于分数阈值,示意性的,分数阈值可以为900分。
在一种可能的实施方式中,当第一虚拟对象在对局中击败第二虚拟对象后,得到的分数达到或超过分数阈值时,满足空投类虚拟道具的使用条件,表示第一虚拟对象可以在对局中使用该空投类虚拟道具;即当第一虚拟对象通过击败第二虚拟对象获得的分数低于该分数阈值时,终端将空投类虚拟道具对应的目标道具控件设置为不可触发状态;若第一虚拟对象通过击败第二虚拟对象获得的分数高于该分数阈值时,终端将空投类虚拟道具对应的目标道具控件设置为可触发状态。
可选的，考虑到第一虚拟对象通过挨个击败第二虚拟对象获得一定分数耗时较长，在其他可能的实施方式中，设置连杀概念，即第一虚拟对象在预定时间内连续地击败第二虚拟对象时，获得的击败分数会翻倍，连杀数量越多，翻倍越多，因此，第一虚拟对象可以更容易地达到预设分数阈值，从而提高了空投类虚拟道具的激活速率。其中，该预定时间可以是20min。
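在一个示例性的例子中，以下Python代码草图示意了基于数量阈值或分数阈值（含连杀翻倍）确定目标道具控件设置状态的一种可能逻辑，其中的阈值取值与翻倍规则均为示例性假设，并非对本申请实施例的限定：

```python
# 示例性草图：根据击败数量/击败分数判断空投类虚拟道具是否满足使用条件
KILL_COUNT_THRESHOLD = 10      # 数量阈值（示例取值）
KILL_SCORE_THRESHOLD = 950     # 分数阈值（示例取值，对应“霹雳弹”所需击杀分数）

def streak_score(base_score: int, streak: int) -> int:
    """连杀翻倍：连杀数量越多，单次击败得分翻倍越多（示例规则）。"""
    return base_score * max(1, 2 ** (streak - 1))

def is_prop_activatable(kills_in_window: int, score_in_window: int) -> bool:
    """任一条件满足即认为空投类虚拟道具满足使用条件。"""
    return (kills_in_window >= KILL_COUNT_THRESHOLD
            or score_in_window >= KILL_SCORE_THRESHOLD)

def update_prop_button(kills_in_window: int, score_in_window: int) -> str:
    # 满足使用条件时目标道具控件处于可触发状态（高亮），否则为不可触发状态（置灰）
    return "triggerable" if is_prop_activatable(kills_in_window, score_in_window) else "disabled"

if __name__ == "__main__":
    score = sum(streak_score(100, s) for s in range(1, 5))   # 连续击败4个第二虚拟对象的累计分数
    print(update_prop_button(kills_in_window=4, score_in_window=score))
```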
步骤503,响应于对目标道具控件的触发操作,显示投掷路线设置控件,目标道具控件为空投类虚拟道具对应的使用控件,投掷路线设置控件展示有虚拟环境地图。
步骤503的实施方式可以参考上文实施例,本实施例在此不做赘述。
可选的,当空投类虚拟道具对应的目标道具控件被触发后,目标道具控件即由可触发状态变为不可触发状态,若需要再次使用该空投类虚拟道具,需要再次满足该空投类虚拟道具对应的使用条件。
步骤504,响应于虚拟环境地图内的第一操作信号和第二操作信号,获取第一操作信号对应的第一操作位置,以及第二操作信号对应的第二操作位置。
为了区别于相关技术中的定点投掷方式,本申请实施例提供了一种双触点操作方式,即虚拟环境地图内可以同时接收到两个操作信号,并通过旋转或位移来调整目标投掷路线。
其中,当手势操作为双触点操作方式时,其可以是双指操作,或其他可以同时产生两个操作信号的手势操作。
在一种可能的实施方式中,用户通过对投掷路线设置控件的手势操作,则对应的终端接收到虚拟环境地图内的第一操作信号和第二操作信号,即获取到该第一操作信号对应的第一操作位置和第二操作信号对应的第二操作位置,并实时跟随第一操作信号和第二操作信号更改第一操作位置和第二操作位置。
步骤505,基于第一操作位置和第二操作位置,在虚拟环境地图中显示候选投掷路线。
由于用户可能无法立即在虚拟环境地图中确定出目标投掷路线,可能会实时调整手势操作,以便确定出最合适的目标投掷路线,因此,在一种可能的实施方式中,当终端接收到手势操作后,即根据手势操作所对应的第一操作位置和第二操作位置,确定候选投掷路线,并实时在虚拟环境地图中显示根据当前手势操作确定出的候选投掷路线,以便用户根据显示出的候选投掷路线来确定当前路线是否符合用户投掷需求。
可选的,与目标投掷路线相似,在基于手势操作对应的第一操作位置和第二操作位置,确定候选投掷路线时,可以将经过第一操作位置和第二操作位置的任意线段确定为候选投掷路线,比如,候选投掷路线可以以第一操作位置为路线起点,以第二操作位置为路线终点。
其中,根据第一操作位置和第二操作位置确定候选投掷路线的过程可以参考上文实施例,本实施例在此不做赘述。
步骤506,响应于第一操作信号和第二操作信号消失,根据信号消失时刻的第一操作位置和第二操作位置确定目标投掷路线,以及在虚拟环境地图中显示目标投掷路线。
为了提高确定出的目标投掷路线的准确性,在一种可能的实施方式中,当终端确定出第一操作信号和第二操作信号消失时,确定用户手势操作结束,已经确定出目标投掷路线,则根据操作信号消失时刻的第一操作位置(即手势操作中第一操作信号对应的最终操作位置)和第二操作位置(即手势操作中第二操作信号对应的最终操作位置)确定目标投掷路线。
可选的,在确定目标投掷路线的过程中,为了使得用户可以实时确定出手势操作所指示的候选投掷路线,终端会跟随用户手势操作的改变,实时改变手势操作在虚拟环境地图中所指示的候选投掷路线,直至终端检测到手势操作触控结束,也即第一操作信号和第二操作信号消失,则将信号消失时刻的第一操作位置和第二操作位置所对应的候选投掷路线确定为目标投掷路线,显示在虚拟环境地图中。
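在一个示例性的例子中，以下Python代码草图示意了根据两个操作信号实时更新候选投掷路线、并在操作信号消失时确定目标投掷路线的一种可能处理流程，其中的事件接口与数据结构均为示例性假设：

```python
# 示例性草图：双触点手势确定目标投掷路线（事件接口为假设）
from typing import Optional, Tuple

Point = Tuple[float, float]

class RoutePlanner:
    """跟随两个操作信号实时更新候选投掷路线，信号消失时确定目标投掷路线。"""

    def __init__(self):
        self.p1: Optional[Point] = None   # 第一操作位置
        self.p2: Optional[Point] = None   # 第二操作位置
        self.target_route: Optional[Tuple[Point, Point]] = None

    def on_touch_update(self, p1: Point, p2: Point) -> Tuple[Point, Point]:
        # 实时跟随第一/第二操作信号更新候选投掷路线（以两个操作位置为路线起点/终点）
        self.p1, self.p2 = p1, p2
        return (p1, p2)

    def on_touch_end(self) -> Optional[Tuple[Point, Point]]:
        # 两个操作信号均消失时，以信号消失时刻的操作位置确定目标投掷路线
        if self.p1 is not None and self.p2 is not None:
            self.target_route = (self.p1, self.p2)
        return self.target_route

planner = RoutePlanner()
planner.on_touch_update((0.2, 0.5), (0.7, 0.9))
print(planner.on_touch_end())
```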
可选的,若用户还需要修改目标投掷路线,可以重新在虚拟环境地图中进行手势操作;若用户无需修改目标投掷路线,可以关闭投掷路线设置控件,对应后续会基于用户规划出的目标投掷路线,投掷空投类虚拟道具。
可选的,当确定出目标投掷路线后(即手势操作对应的操作信号消失时),投掷路线设置控件也会收起,直至再次通过触发目标道具控件唤起该投掷路线设置控件。
可选的，当进行具有双操作信号的手势操作时，若终端检测到第一操作信号和第二操作信号中仅有一个操作信号消失，则无法据此确定目标投掷路线，投掷路线设置控件也不会收起，用户可以在投掷路线设置控件中继续进行手势操作。
步骤507,基于虚拟环境地图与虚拟环境之间位置映射关系,确定目标投掷路线在虚拟环境中对应的实际投掷路线。
由于目标投掷路线指示投掷路线设置控件中虚拟环境地图上的路线，若需要在实际虚拟环境中投掷空投类虚拟道具，则需要将确定出的目标投掷路线映射到实际虚拟环境中，因此，在一种可能的实施方式中，预设有虚拟环境地图中的位置与虚拟环境中的位置之间的位置映射关系，以便在投掷路线设置控件中确定出目标投掷路线后，可以基于位置映射关系，确定出该目标投掷路线在虚拟环境中所指示的实际投掷路线，以实现将目标投掷路线映射到虚拟环境中。
在一种可能的实施方式中,确定实际投掷路线的方法可以包括以下步骤:
一、获取目标投掷路线在虚拟环境地图中的路线起点和路线终点。
在一种可能的实施方式中，根据两点确定一条直线的原理，只要获取到目标投掷路线在虚拟环境地图中的路线起点和路线终点，并分别确定二者在虚拟环境中对应的实际位置坐标，即可以确定出该目标投掷路线在虚拟环境中对应的实际投掷路线。
二、基于位置映射关系,确定路线起点在虚拟环境中的第一位置坐标,以及路线终点在虚拟环境中的第二位置坐标。
在一种可能的实施方式中,可以在虚拟环境中预先标定三个点,并在虚拟环境地图中确定出该预先标定的三个点对应的坐标位置,以此建立起虚拟环境地图和虚拟环境之间的位置映射关系,当确定实际投掷路线时,可以将路线起点与虚拟环境地图中的三个点连起来,确定出三个方向线段,然后在虚拟环境中根据该三个方向线段,即可以确定出三个点,并将这三个点取平均值,即可以求得路线起点在虚拟环境中的第一位置坐标,同理,可以获得路线终点在虚拟环境中的第二位置坐标。
可选的,也可以是根据预先标定的三个点,确定出虚拟环境地图和虚拟环境中位置之间的线性或非线性关系(位置映射关系),则可以直接将路线起点对应的坐标带入该位置映射关系,即可以求得路线起点对应的第一位置坐标,同理,也可以求得路线终点对应的第二位置坐标。
三、根据第一位置坐标和第二位置坐标,确定虚拟环境中的实际投掷路线。
在一种可能的实施方式中,确定出路线起点在虚拟环境中的位置和路线终点在虚拟环境中的位置,即确定出实际投掷路线在虚拟环境中对应的方向和长度,从而确定出空投类虚拟道具在虚拟环境中的实际投掷路线。
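在一个示例性的例子中，以下Python代码草图示意了基于三个预先标定点求解位置映射关系（此处假设为线性/仿射映射），并据此确定路线起点的第一位置坐标与路线终点的第二位置坐标的一种可能实现，其中的标定点坐标均为示例性取值：

```python
# 示例性草图：由三组标定点求解虚拟环境地图到虚拟环境的位置映射（假设为仿射映射）
import numpy as np

# 三个标定点在虚拟环境地图中的坐标，及其在虚拟环境中的坐标（单位：米），取值为示例
MAP_PTS   = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
WORLD_PTS = np.array([[100.0, 200.0], [500.0, 220.0], [120.0, 600.0]])

def solve_affine(src, dst):
    """由三组对应点求仿射变换 dst = A @ src + b（每个坐标分量各解一个3x3线性方程组）。"""
    A_mat = np.hstack([src, np.ones((3, 1))])      # 每行为 [x, y, 1]
    coef_x = np.linalg.solve(A_mat, dst[:, 0])     # x' = a11*x + a12*y + b1
    coef_y = np.linalg.solve(A_mat, dst[:, 1])     # y' = a21*x + a22*y + b2
    return coef_x, coef_y

def map_to_world(p, coef_x, coef_y):
    x, y = p
    return (coef_x[0] * x + coef_x[1] * y + coef_x[2],
            coef_y[0] * x + coef_y[1] * y + coef_y[2])

coef_x, coef_y = solve_affine(MAP_PTS, WORLD_PTS)
route_start = map_to_world((0.25, 0.40), coef_x, coef_y)   # 路线起点的第一位置坐标
route_end   = map_to_world((0.80, 0.65), coef_x, coef_y)   # 路线终点的第二位置坐标
print(route_start, route_end)
```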
步骤508,按照实际投掷路线在虚拟环境中投掷空投类虚拟道具。
在一种可能的实施方式中,当终端确定出虚拟环境中的实际投掷起点和实际投掷终点后(也即实际投掷路线),终端可以控制虚拟载物道具出现在虚拟环境中的实际投掷起点,并沿实际投掷路线投掷空投类虚拟道具,直至达到实际投掷终点。
步骤509,在虚拟环境中显示沿实际投掷路线投掷的空投类虚拟道具。
终端控制虚拟载具在虚拟环境中沿实际投掷路线投掷空投类虚拟道具,对应虚拟环境中会显示沿该实际投掷路线投掷的空投类虚拟道具。
可选的,终端还可以将空投类虚拟道具的投掷信息上传至服务器,由服务器将该投掷信息转发至其他终端。
在另一种可能的应用场景中，可以由服务器投掷空投类虚拟道具，当终端确定出目标投掷路线后，可以基于虚拟环境地图和虚拟环境之间的位置映射关系，确定出目标投掷路线在虚拟环境中的实际投掷路线，进而将该实际投掷路线上报至服务器，由服务器基于该实际投掷路线，控制虚拟载具沿实际投掷路线在虚拟环境中投掷空投类虚拟道具；对应服务器会将投掷信息反馈给各个终端，使得终端可以基于该投掷信息在虚拟环境中显示沿实际投掷路线投掷的空投类虚拟道具。
本实施例中,通过获取第一虚拟对象击败第二虚拟对象的数量来确定空投类虚拟道具是否可以被使用,从而确定目标道具控件的设置状态;此外,通过位置映射关系,将用户在投掷路线设置控件中确定的目标投掷路线映射到虚拟环境中,从而确定出空投类虚拟道具在虚拟环境中的实际投掷路线,以便在虚拟环境中投掷该空投类虚拟道具。
由于用户在确定目标投掷路线时，是在路线设置控件中进行选择设置的，而为了避免在虚拟环境中投掷空投类虚拟道具时，对属于同一阵营的虚拟对象造成误伤，且尽可能的对其他阵营的虚拟对象造成更大范围的伤害，因此，在一种可能的实施方式中，该投掷路线设置控件可以获取到虚拟环境中所有虚拟对象（包括同一阵营和不同阵营）的位置，以便用户根据各个虚拟对象在虚拟环境地图中的位置来进行手势操作，以确定出合适的目标投掷路线。
请参考图7,其示出了本申请另一个示例性实施例提供的控制虚拟对象使用虚拟道具的方法的流程图。本实施例以该方法用于图1所示实施环境中的第一终端110或第二终端130或该实施环境中的其它终端为例进行说明,该方法包括如下步骤:
步骤701,响应于对目标道具控件的触发操作,显示投掷路线设置控件,目标道具控件为空投类虚拟道具对应的使用控件,投掷路线设置控件展示有虚拟环境地图。
本步骤的实施方式可以参考上文实施例,本实施例在此不做赘述。
步骤702,获取各个虚拟对象在虚拟环境中的地理位置,虚拟环境中包括第一虚拟对象和第二虚拟对象,第一虚拟对象和第二虚拟对象属于不同阵营。
当显示投掷路线设置控件后，为了便于用户基于虚拟环境中不同虚拟对象的位置分布情况来确定空投类虚拟道具对应的目标投掷路线，从而避免误伤队友，并对敌对方造成更高命中率，在一种可能的实施方式中，该投掷路线设置控件可以扫描虚拟环境中各个虚拟对象对应的地理位置，并将各个虚拟对象在虚拟环境中的位置映射到虚拟环境地图中。
可选的,第二虚拟对象可以是被其他用户控制的虚拟对象,也可以是由电脑控制的虚拟对象(人机)。
步骤703,基于地理位置,在虚拟环境地图中显示虚拟对象标识,其中,属于不同阵营的虚拟对象对应不同虚拟对象标识。
由于虚拟环境中包含不同阵营的虚拟对象,为了便于用户区分己方队友和敌方队友,在一种可能的实施方式中,可以将属于同一阵营的虚拟对象以相同的虚拟对象标识表示,将不同阵营的虚拟对象以不同的虚拟对象标识表示,并将不同虚拟对象标识根据其在虚拟环境中的地理位置显示在虚拟环境地图中。
可选的，不同虚拟对象标识可以采用不同形状的图形，比如，方形、圆形、三角形等；或，不同虚拟对象标识也可以采用不同颜色的图形，比如，第一阵营的虚拟对象采用红色圆形，第二阵营的虚拟对象采用黄色圆形等等，或，不同虚拟对象标识也可以采用各个虚拟对象对应的头像；本申请实施例对虚拟对象标识不构成限定。
在一个示例性的例子中,如图8所示,其示出了本申请一个示例性实施例示出的虚拟对象位置显示过程的示意图。当用户在虚拟环境画面801中点击空投类虚拟道具对应的目标道具控件802后,首先在用户界面中显示路线设置控件803,此时的路线设置控件803中可能仅显示有虚拟环境地图804(即虚拟环境中各个虚拟障碍物的位置),之后,路线设置控件803对虚拟环境中的各个虚拟对象的位置进行扫描获取,并基于各个虚拟对象在虚拟环境中的位置在虚拟环境地图804中显示虚拟对象标识805和虚拟对象标识806,其中,不同虚拟对象标识表示属于不同阵营的虚拟对象。
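在一个示例性的例子中，以下Python代码草图示意了基于各虚拟对象的地理位置在虚拟环境地图中显示不同阵营虚拟对象标识的一种可能实现，其中的坐标比例与标识样式均为示例性假设：

```python
# 示例性草图：将虚拟对象的地理位置映射到虚拟环境地图，并按阵营选择虚拟对象标识（命名为假设）
def world_to_map(pos, world_size=(1000.0, 1000.0), map_size=(200, 200)):
    """将虚拟环境中的地理位置按比例映射到虚拟环境地图坐标（示例比例）。"""
    x, y = pos
    return (x / world_size[0] * map_size[0], y / world_size[1] * map_size[1])

def build_markers(objects, my_camp):
    markers = []
    for obj in objects:
        # 属于不同阵营的虚拟对象对应不同虚拟对象标识（此处以不同图标名示意）
        icon = "circle_red" if obj["camp"] == my_camp else "triangle_yellow"
        markers.append({"id": obj["id"], "icon": icon, "map_pos": world_to_map(obj["pos"])})
    return markers

objects = [
    {"id": "ally_1",  "camp": "A", "pos": (120.0, 340.0)},
    {"id": "enemy_1", "camp": "B", "pos": (760.0, 580.0)},
]
print(build_markers(objects, my_camp="A"))
```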
上文实施例中,虚拟对象标识和虚拟环境地图并非同时展示在投掷路线设置控件中,当终端接收到对目标道具控件的触发操作时,首先在投掷路线设置控件中展示虚拟环境地图,同时终端会获取虚拟环境中各个虚拟对象对应的地理位置,并基于该地理位置在虚拟环境地图中显示对应的虚拟对象标识。可见,虚拟对象标识是在虚拟环境地图展示之后显示的。
可选的，针对并非同时展示的情景，为了降低终端功耗，可以在投掷路线设置控件周侧增设对象标识显示控件，该对象标识显示控件用于触发在虚拟环境地图中显示虚拟对象标识；在一种可能的实施方式中，当终端显示投掷路线控件后，仅在投掷路线控件中显示虚拟环境地图，若用户需要查看各个虚拟对象的所在位置，则可以触发对象标识显示控件，对应终端接收到对对象标识显示控件的触发操作，获取虚拟环境中各个虚拟对象对应的地理位置，进而基于该地理位置在虚拟环境地图中显示虚拟对象标识；反之，若用户无需在虚拟环境地图中展示虚拟对象标识，则可以不触发该对象标识显示控件，则可以减少终端获取各个虚拟对象对应地理位置的运算逻辑，进一步降低终端功耗。
可选的,虚拟对象标识可以与虚拟环境地图同时展示在投掷路线设置控件中,在一种可能的实施方式中,当用户点击目标道具控件后,在显示投掷路线设置控件的同时,终端获取虚拟环境中各个虚拟对象对应的地理位置,以便在投掷路线设置控件中展示虚拟环境地图的同时,基于虚拟对象对应的地理位置在虚拟环境地图中显示虚拟对象标识,使得虚拟环境地图和虚拟对象标识可以同时展示在投掷路线设置控件中,视觉上不存在显示延迟。
步骤704,响应于对投掷路线设置控件的手势操作,在虚拟环境地图中显示空投类虚拟道具对应的目标投掷路线,手势操作包括第一操作位置和第二操作位置,目标投掷路线经过第一操作位置和第二操作位置。
本步骤的实施方式可以参考上文实施例,本实施例在此不做赘述。
步骤705,获取第二虚拟对象在目标投掷路线上的分布数量。
其中，分布数量用于指示在目标投掷路线上各个投掷点对应预设区域中第二虚拟对象的数量情况，该预设区域可以是空投类虚拟道具的道具作用范围。
由于投掷路线设置控件中已经标识有各个第二虚拟对象的位置,则对应的在投掷空投类虚拟道具时,为了获得对第二虚拟对象更高的命中率,可以基于第二虚拟对象的分布数量来投掷空投类虚拟道具。
步骤706,根据分布数量在虚拟环境中投掷空投类虚拟道具,其中,空投类虚拟道具的投掷数量与分布数量呈正相关关系。
由于空投类虚拟道具具备一定的投掷数量限制,为了避免对空投类虚拟道具的浪费,以及在投掷数量限制内可以对第二虚拟对象产生更高的命中率,在一种可能的实施方式中,根据第二虚拟对象在目标投掷路线上的分布数量,来确定该位置处空投类虚拟道具的投掷数量,且设定空投类虚拟道具的投掷数量与分布数量呈正相关关系,也就是说,基于第二虚拟对象分布数量较多的区域,投放投掷数量更多的空投类虚拟道具;反之,对于第二虚拟对象数量较少或者没有第二虚拟对象的区域,可以选择投掷数量较少的空投类虚拟道具,或者不投放空投类虚拟道具。
在一个示例性的例子中,若目标投掷路线的第一个点处第二虚拟对象对应的数量为4个,第二个点处第二虚拟对象的数量为7个,则可以选择在第二个点处投掷5个空投类虚拟道具,而在第一个点处投掷3个空投类虚拟道具。
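在一个示例性的例子中，以下Python代码草图示意了按目标投掷路线上第二虚拟对象的分布数量正相关地分配投掷数量的一种可能规则，其中的作用半径、分配规则与数量上限均为示例性假设：

```python
# 示例性草图：按各投掷点附近第二虚拟对象的分布数量分配空投类虚拟道具的投掷数量（规则为假设）
import math

def enemies_near(point, enemies, radius):
    """统计某个投掷点的预设作用半径内第二虚拟对象的数量。"""
    return sum(1 for e in enemies if math.dist(point, e) <= radius)

def allocate_drops(drop_points, enemies, radius=30.0):
    counts = []
    for p in drop_points:
        n = enemies_near(p, enemies, radius)
        # 分布数量越多投掷数量越多（正相关）；无目标则不投放；上限 8 仅为示例
        counts.append(0 if n == 0 else min(n + 1, 8))
    return counts

drop_points = [(100.0, 100.0), (200.0, 100.0)]
enemies = [(105.0, 95.0), (110.0, 102.0), (198.0, 99.0), (201.0, 101.0), (205.0, 96.0)]
print(allocate_drops(drop_points, enemies))   # 例如输出 [3, 4]
```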
步骤707,在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具。
本步骤的实施方式可以参考上文实施例,实施例在此不做赘述。
本实施例中,通过投掷路线设置控件来扫描并在虚拟环境地图中显示各个虚拟对象的虚拟对象标识,使得用户可以基于各个虚拟对象在虚拟环境中的位置来确定目标投掷路线,避免对己方队友造成伤害,从而提高对敌方的命中率;此外,在虚拟环境中投掷空投类虚拟道具时,设定第二虚拟对象(不属于同一阵营)的数量与该投掷数量为正相关关系,既可以实现对第二虚拟对象较集中的区域投掷更多的空投类虚拟道具,从而提高空投类虚拟道具的命中率;也可以对第二虚拟对象较少的区域投掷较少的空投类虚拟道具,避免对空投类虚拟道具的浪费。
在另一种可能的应用场景中,空投类虚拟道具具有一定的投掷属性,比如,预设投掷间距,即只能按照该预设投掷间距在虚拟环境中投掷空投类虚拟道具;或预设投掷数量,即空投类虚拟道具并不能无限制的投放,具有投放数量限制,因此,在按照目标投掷路线在虚拟环境中投掷空投类虚拟道具时,也需要同时考虑空投类虚拟道具对应的投掷属性信息。
在图2的基础上,如图9所示,步骤203可以包括步骤203A~步骤203C。
步骤203A,获取空投类虚拟道具对应的投掷属性信息,投掷属性信息包括预设投掷间距和预设投掷数量中的至少一种。
其中，预设投掷间距指示需要每隔一定距离来投掷空投类虚拟道具，也就是说，相邻两个空投类虚拟道具对应投掷位置之间为预设投掷间距；可选的，该预设投掷间距可以由空投类虚拟道具的道具作用范围来设置，避免因道具作用范围重叠而造成对空投类虚拟道具的浪费。示意性的，预设投掷间距可以是10m。
可选的,预设投掷间距可以是实际虚拟环境中的间距,对应预设投掷间距可以是10m;或,预设投掷间距也可以是虚拟环境地图中的间距,对应预设投掷间距可以是1cm。
可选的,预设投掷数量指示空投类虚拟道具单次触发所能投放的空投类虚拟道具的总数量,示意性的,预设投掷数量可以为40个。
可选的,空投类虚拟道具可以同时具备预设投掷间距和预设投掷数量两个投掷属性信息,对应的,在空投类虚拟道具的使用过程中,需要同时受到该两个投掷属性信息的限制,也即当终端按照目标投掷路线投掷空投类虚拟道具时,需要同时考虑预设投掷间距和预设投掷数量。
可选的,若空投类虚拟道具同时具备预设投掷间距和预设投掷数量两个投掷属性信息,在空投类虚拟道具的实际使用过程中,可以由用户基于实际情况来切换使用至少一种投掷属性信息,可以选择仅使用预设投掷间距来投掷空投类虚拟道具,也可以选择仅使用预设投掷数量来投掷空投类虚拟道具,或可以选择使用预设投掷间距以及预设投掷数量来投掷空投类虚拟道具。
可选的,空投类虚拟道具仅具备单一投掷属性信息,若空投类虚拟道具仅具备预设投掷间距这一投掷属性信息时,则在空投类虚拟道具的使用过程中,按照预设投掷间距投掷空投类虚拟道具;若空投类虚拟道具仅具备预设投掷数量这一投掷属性信息时,则在空投类虚拟道具的使用过程中,按照预设投掷数量投掷空投类虚拟道具。
在一种可能的实施方式中,终端在确定出目标投掷路线后,即获取该空投类虚拟道具对应的投掷属性信息,从而基于该投掷属性信息在虚拟环境中投掷空投类虚拟道具。
步骤203B,根据投掷属性信息和目标投掷路线,在虚拟环境中投掷空投类虚拟道具。
在一种可能的实施方式中,当终端获取到空投类虚拟道具对应的投掷属性信息后,即可以根据该投掷属性信息和目标投掷路线在虚拟环境中投掷空投类虚拟道具。
其中,当投掷属性信息为预设投掷间距时,则根据投掷属性信息和目标投掷路线在虚拟环境中投掷空投类虚拟道具的过程可以包括以下步骤一~步骤二:
一、根据预设投掷间距以及目标投掷路线对应的路线长度,确定空投类虚拟道具对应的目标投掷数量,目标投掷数量与目标投掷路线的路线长度呈正相关关系。
其中,预设投掷间距可以是在虚拟环境地图中的间距,也可以是在虚拟环境中对应的实际间距。当预设投掷间距为虚拟环境地图中的间距时,预设投掷间距的取值可以为1cm,当预设投掷间距为虚拟环境中的实际间距时,预设投掷间距的取值可以为10m。
需要说明的是,当空投类虚拟道具的投掷属性信息为预设投掷间距时,每个投掷位置对应的投掷数量相同,单个投掷位置对应的投掷数量可以由开发人员预先设置,示意性的,单个投掷位置对应的投掷数量可以是3颗。
由于空投类虚拟道具对应的投掷间距是一定的,显然,目标投掷路线对应的路线长度越长,本次投掷所需要的空投类虚拟道具对应的投掷数量就越多,反之,目标投掷路线对应的路线长度越短,本次投掷所需要的空投类虚拟道具的投掷数量就越少,即目标投掷数量与目标投掷路线的路线长度呈正相关关系。
示例性的,目标投掷数量、目标投掷路线以及预设投掷间距之间的关系可以表示为:
N1 = (L / d1) * n1
其中，N1表示目标投掷数量，L表示目标投掷路线的路线长度，d1表示预设投掷间距，n1表示单个投掷位置对应的投掷数量。需要说明的是，L和d1需要为同一坐标系下对应的数值，也就是说，若d1为实际虚拟环境中的预设投掷间距，则L也应该为目标投掷路线在实际虚拟环境中对应的路线长度；若d1为虚拟环境地图中的预设投掷间距，则L也应该为目标投掷路线在虚拟环境地图中对应的路线长度。
在一个示例性的例子中,若目标投掷路线的长度为10cm,且每隔1cm即投掷一次空投类虚拟道具,需要投掷10次空投类虚拟道具,若每次投掷可能会投掷3个空投类虚拟道具,则对应的空投类虚拟道具的目标投掷数量为30个。若目标投掷路线的长度为5cm,且每隔1cm投掷一次空投类虚拟道具,需要投掷5次,每次投掷3颗,则空投类虚拟道具的目标投掷数量为15颗。
二、按照目标投掷路线在虚拟环境中投掷目标投掷数量的空投类虚拟道具。
在一种可能的实施方式中,当确定出目标投掷数量后,即可以在目标投掷路线中每隔预设投掷间距来投掷空投类虚拟道具,且总共投掷目标投掷数量的空投类虚拟道具。
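在一个示例性的例子中，以下Python代码草图按上述关系式N1 = (L / d1) * n1计算目标投掷数量，其中对投掷位置个数的取整方式为示例性假设：

```python
# 示例性草图：投掷属性为预设投掷间距时，按路线长度计算目标投掷数量 N1 = (L / d1) * n1
import math

def target_throw_count(route_len, preset_spacing, per_point_count):
    """L 与 d1 需位于同一坐标系；目标投掷数量与路线长度呈正相关（取整方式为示例）。"""
    positions = math.floor(route_len / preset_spacing)   # 投掷位置个数
    return positions * per_point_count

print(target_throw_count(route_len=10.0, preset_spacing=1.0, per_point_count=3))   # 30
print(target_throw_count(route_len=5.0,  preset_spacing=1.0, per_point_count=3))   # 15
```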
如图10所示,其示出了本申请一个示例性实施例示出的根据预设投掷间距和目标投掷路线在虚拟环境中投掷空投类虚拟道具的示意图。当确定出目标投掷路线在俯视平面图1001(即虚拟环境的俯视图)中指示的实际投掷路线如路线1002和路线1003所示,预设投掷间距为1004,可见,当按照预设投掷间距1004在实际投掷路线上投掷空投类虚拟道具时,由于路线1003的路线长度大于路线1002的路线长度,则对应的在路线1003上的投掷位置1005多于路线1002上的投掷位置1005,每个投掷位置1005均投掷一定数量的空投类虚拟道具,路线1003所需要的空投类虚拟道具对应的投掷数量大于路线1002所需要的空投类虚拟道具的投掷数量。
其中,当投掷属性信息为预设投掷数量时,则根据投掷属性信息和目标投掷路线在虚拟环境中投掷空投类虚拟道具的过程可以包括以下步骤三~步骤四:
三、根据目标投掷路线对应的路线长度以及预设投掷数量,确定空投类虚拟道具对应的目标投掷间距,目标投掷间距与目标投掷路线对应的路线长度呈正相关关系。
其中,预设投掷数量可以为40个。
为了实现在目标投掷路线上均匀投放空投类虚拟道具,避免在目标投掷路线开始时投放较多的空投类虚拟道具,导致后续空投类虚拟道具的剩余数量较少,从而无法覆盖整个目标投掷路线,因此,需要根据预设投掷数量来规划空投类虚拟道具在目标投掷路线上的目标投掷间距,从而使得空投类虚拟道具的投掷范围可以覆盖整个目标投掷路线。
由于空投类虚拟道具的投掷数量是一定的,因此,若目标投掷路线较长,则对应的空投类虚拟道具的投掷间距需要较大,才可以更大范围的覆盖目标投掷路线,反之,若目标投掷路线较短,则对应的空投类虚拟道具的投掷间距可以缩小,从而增加对目标投掷路线上虚拟对象的命中率,即目标投掷间距与目标投掷路线对应的路线长度呈正相关关系。
在一个示例性的例子中,预设投掷数量、目标投掷间距以及目标投掷路线之间的关系可以表示为:
d2 = L / (N2 / n2)
其中，d2表示目标投掷间距，L表示目标投掷路线的路线长度，N2表示预设投掷数量，n2表示单个投掷位置对应的投掷数量。需要说明的是，L和d2需要为同一坐标系下对应的数值，也就是说，若L为目标投掷路线在实际虚拟环境中对应的路线长度，则计算得到的d2也应该为实际虚拟环境中的目标投掷间距；若L为目标投掷路线在虚拟环境地图中对应的路线长度，则计算得到的d2也应该为虚拟环境地图中的目标投掷间距。
可选的,若终端确定出的目标投掷间距为虚拟环境地图中的间距,则在实际投掷过程中,需要将该目标投掷间距转换为实际虚拟环境中的目标投掷间距,进而按照该目标投掷间距在实际虚拟环境中进行投掷。
示意性的,若预设投掷数量为40,目标投掷路线在实际虚拟环境中对应的路线长度为200m,单个投掷位置对应的投掷数量为4,则目标投掷间距为20m;若目标投掷路线在实际虚拟环境中对应的路线长度为400m,则目标投掷间距为40m。
四、按照目标投掷间距在虚拟环境中投掷空投类虚拟道具。
在一种可能的实施方式中,当确定出目标投掷间距后,即可以在目标投掷路线上每隔目标投掷间距,投掷一次空投类虚拟道具,直至到达目标投掷路线的终点处。
在一个示例性的例子中,若目标投掷间距为15m,则在虚拟环境中的投掷路线上,每隔15m投掷一次空投类虚拟道具。
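在一个示例性的例子中，以下Python代码草图按上述关系式d2 = L / (N2 / n2)计算目标投掷间距，数值取值与上文示意性数值一致：

```python
# 示例性草图：投掷属性为预设投掷数量时，按路线长度计算目标投掷间距 d2 = L / (N2 / n2)
def target_spacing(route_len, preset_total, per_point_count):
    """目标投掷间距与路线长度呈正相关；L 的坐标系决定 d2 的坐标系。"""
    positions = preset_total / per_point_count   # 可投掷位置个数
    return route_len / positions

print(target_spacing(route_len=200.0, preset_total=40, per_point_count=4))   # 20.0（米）
print(target_spacing(route_len=400.0, preset_total=40, per_point_count=4))   # 40.0（米）
```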
如图11所示,其示出了本申请一个示例性实施例示出的根据预设投掷数量和目标投掷路线在虚拟环境中投掷空投类虚拟道具的示意图。当确定出目标投掷路线在俯视平面图1101(即虚拟环境的俯视图)中指示的实际投掷路线如路线1102和路线1103所示,预设投掷数量为8×5个(即包括8个投掷位置,每个投掷位置投掷5个空投类虚拟道具),可见,当按照预设投掷数量在实际投掷路线上投掷空投类虚拟道具时(即投掷位置1106的个数相同),由于路线1103的路线长度大于路线1102的路线长度,则对应的在路线1103上的投掷间距1105大于在路线1102上的投掷间距1104。
可选的,在其他可选的实施例中,若空投类虚拟道具同时具备预设投掷间距和预设投掷数量两个投掷属性,在一个示例性的例子中,根据投掷属性信息和目标投掷路线在虚拟环境中投掷空投类虚拟道具的过程可以包括以下步骤五~步骤六。
五、根据目标投掷路线对应的路线长度、预设投掷数量以及预设投掷间距,确定空投类虚拟道具对应的目标单位投掷数量,目标单位投掷数量与目标投掷路线对应的路线长度呈负相关关系。
当空投类虚拟道具的投掷数量以及投掷间距固定时，则目标投掷路线对应的路线长度仅会影响每个投掷位置对应的单位投掷数量；且当目标投掷路线对应的路线长度越长，对应的目标单位投掷数量越少，目标投掷路线对应的路线长度越短，对应的目标单位投掷数量越多，也就是说，目标单位投掷数量与目标投掷路线对应的路线长度呈负相关关系。
在一个示例性的例子中,目标单位投掷数量、预设投掷数量、预设投掷间距以及目标投掷路线之间的关系可以表示为:
n3 = N3 / (L / d3)
其中，n3表示目标单位投掷数量，也即单个投掷位置对应的投掷数量，N3表示预设投掷数量，L表示目标投掷路线对应的路线长度，d3表示预设投掷间距。需要说明的是，L和d3需要为同一坐标系下对应的数值，也就是说，若d3为实际虚拟环境中的预设投掷间距，则L也应该为目标投掷路线在实际虚拟环境中对应的路线长度；若d3为虚拟环境地图中的预设投掷间距，则L也应该为目标投掷路线在虚拟环境地图中对应的路线长度。
示意性的,若目标投掷路线对应的路线长度为400m,预设投掷间距为40,预设投掷数量为40,则对应的目标单位投掷数量为4;若目标投掷路线对应的路线长度为200m,则目标单位投掷数量为8。
六、按照目标单位投掷数量和预设投掷间距,在虚拟环境中投掷空投类虚拟道具。
在一种可能的实施方式中,当确定出目标单位投掷数量后,在虚拟环境中每隔预设投掷间距投掷目标单位投掷数量,直至将预设投掷数量的空投类虚拟道具投掷完成。
需要说明的是，当空投类虚拟道具仅具备单个投掷属性信息，或空投类虚拟道具仅使用单个投掷属性信息进行投掷时，对应的，空投类虚拟道具对应固定的单位投掷数量，该单位投掷数量由开发人员预先设置；若空投类虚拟道具具备两个投掷属性信息，且同时采用两个投掷属性信息进行投掷，则空投类虚拟道具对应动态的单位投掷数量，该单位投掷数量由目标投掷路线的路线长度确定。
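在一个示例性的例子中，以下Python代码草图按上述关系式n3 = N3 / (L / d3)计算目标单位投掷数量，其中的取整方式为示例性假设：

```python
# 示例性草图：同时具备预设投掷数量与预设投掷间距时，计算目标单位投掷数量 n3 = N3 / (L / d3)
def per_point_count(route_len, preset_total, preset_spacing):
    """目标单位投掷数量与路线长度呈负相关（此处取整方式为示例）。"""
    positions = route_len / preset_spacing   # 投掷位置个数
    return int(preset_total // positions)

print(per_point_count(route_len=400.0, preset_total=40, preset_spacing=40.0))   # 4
print(per_point_count(route_len=200.0, preset_total=40, preset_spacing=40.0))   # 8
```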
步骤203C,在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具。
本步骤的实施方式可以参考上文实施例,本实施例在此不做赘述。
本实施例中,在按照目标投掷路线投掷空投类虚拟道具时,增加空投类虚拟道具对应的投掷属性作为附加投掷依据,可以更准确在虚拟环境中投掷空投类虚拟道具,避免空投类虚拟道具浪费的同时,提高空投类虚拟道具的命中率。
上文实施例描述的均是空投类虚拟道具的投掷过程，本实施例着重描述空投类虚拟道具的触发场景。
请参考图12,其示出了本申请另一个示例性实施例提供的控制虚拟对象使用虚拟道具的方法的流程图。本实施例以该方法用于图1所示实施环境中的第一终端110或第二终端130或该实施环境中的其它终端为例进行说明,该方法包括如下步骤:
步骤1201,响应于对目标道具控件的触发操作,显示投掷路线设置控件,目标道具控件为空投类虚拟道具对应的使用控件,投掷路线设置控件展示有虚拟环境地图。
步骤1202,响应于对投掷路线设置控件的手势操作,在虚拟环境地图中显示空投类虚拟道具对应的目标投掷路线,手势操作包括第一操作位置和第二操作位置,目标投掷路线经过第一操作位置和第二操作位置。
步骤1203,在虚拟环境中显示沿目标投掷路线投掷的空投类虚拟道具,空投类虚拟道具用于改变虚拟对象的属性值。
步骤1201至步骤1203的实施方式可以参考上文实施例,本实施例在此不做赘述。
步骤1204,响应于空投类虚拟道具与虚拟障碍物发生碰撞,显示道具作用范围,道具作用范围是以空投类虚拟道具的碰撞点为中心,以预设距离为半径的圆形区域。
由于空投类虚拟道具是由虚拟环境的上空进行投掷,则在空投类虚拟道具下落过程中,可能会与虚拟对象发生碰撞,也可能与虚拟障碍物发生碰撞,针对这两种碰撞情况,设置有对应的触发机制。
其中,当空投类虚拟道具在下落过程中与虚拟对象发生碰撞时,则该虚拟对象对应的属性值(生命值)直接减为0,但空投类虚拟道具并不会被触发(或爆炸),仍然继续下落直至接触到虚拟障碍物。
在一种可能的实施方式中,当空投类虚拟道具在下落过程中与虚拟障碍物发生碰撞后,被触发(即空投类虚拟道具爆炸),并以碰撞点为中心产生燃烧区域(即道具作用范围)。其中,该道具作用范围是以碰撞点为中心,预设半径的圆形区域。
可选的,当空投类虚拟道具爆炸后,还会持续产生大量的烟雾,用于遮挡该区域内虚拟对象的视线。
可选的,虚拟障碍物可以是虚拟建筑物、地面等,本申请实施例对此不构成限定。
可选的,在空投类虚拟道具下落过程中,也会产生拖尾烟雾,用于遮挡虚拟对象的视线。
步骤1205,响应于虚拟对象位于道具作用范围内,改变虚拟对象的属性值。
在一种可能的实施方式中，终端实时检测附近虚拟对象与道具作用范围之间的关系，当确定出虚拟对象位于道具作用范围内，即减少该虚拟对象的生命值。
其中，判断虚拟对象是否位于道具作用范围的方式可以是：通过判断虚拟对象与碰撞点之间的距离，若该距离小于道具作用范围对应的预设距离，则确定该虚拟对象位于道具作用范围中，对该虚拟对象对应的生命值进行减少。
可选的,属性值减少值与该虚拟对象与碰撞点之间的距离呈负相关关系,即虚拟对象越接近碰撞点(距离越短),属性值减少越多,反之越少。
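在一个示例性的例子中，以下Python代码草图示意了判断虚拟对象是否位于道具作用范围内、并按与碰撞点距离负相关地减少属性值的一种可能实现，其中的半径取值与伤害衰减函数均为示例性假设：

```python
# 示例性草图：道具作用范围判断与按距离负相关的属性值减少（半径与衰减规则为假设）
import math

EFFECT_RADIUS = 50.0    # 预设距离（道具作用范围半径，示例取值）
MAX_DAMAGE = 80.0       # 碰撞点处的最大伤害（示例取值）

def apply_burn_damage(hp, obj_pos, hit_pos):
    dist = math.dist(obj_pos, hit_pos)
    if dist > EFFECT_RADIUS:
        return hp                                        # 不在道具作用范围内，属性值不变
    damage = MAX_DAMAGE * (1.0 - dist / EFFECT_RADIUS)   # 距碰撞点越近，属性值减少越多
    return max(0.0, hp - damage)

print(apply_burn_damage(hp=100.0, obj_pos=(10.0, 0.0), hit_pos=(0.0, 0.0)))   # 距离10，减少64 -> 36.0
print(apply_burn_damage(hp=100.0, obj_pos=(60.0, 0.0), hit_pos=(0.0, 0.0)))   # 范围外 -> 100.0
```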
在一个示例性的例子中,如图13所示,其示出了本申请一个示例性实施例示出的空投类虚拟道具的投掷过程示意图。当确定出目标投掷路线后,在虚拟环境画面1301上显示载物道具,该载物道具沿目标投掷路线1302投掷空投类虚拟道具1303,当空投类虚拟道具1303在投掷过程中与虚拟障碍物发生碰撞(比如,落地),则空投类虚拟道具1303被触发,产生燃烧区域1304,并产生烟雾。当虚拟对象1305进入燃烧区域1304后,会对该虚拟对象1305的生命值进行减少。
本实施例中,通过判断空投类虚拟道具在下落过程中的碰撞情况,即是否与虚拟障碍物发生碰撞,仅在空投类虚拟道具与虚拟障碍物发生碰撞时,被触发并显示道具作用区域,以便对位于该道具作用区域中的虚拟对象的属性值进行减少。
结合上述各个实施例,在一个示意性的例子中,控制虚拟对象使用虚拟道具的过程如图14所示。
步骤1401,为虚拟对象装备空投类虚拟道具。
通过为虚拟对象装备空投类虚拟道具,则虚拟对象可以在对局中使用该空投类虚拟道具。
步骤1402,空投类虚拟道具对应的目标道具控件是否满足激活条件。
其中,该激活条件可以是连续击败虚拟对象的数量,或击败虚拟对象获得的分数。该激活条件即上文实施例中空投类虚拟道具的使用条件。
步骤1403,目标道具控件高亮显示。
当空投类虚拟道具满足激活条件（使用条件）后，其对应的目标道具控件高亮显示，即表示该目标道具控件处于可触发状态。
当空投类虚拟道具不满足激活条件后,其对应的目标道具控件维持不可触发状态。
步骤1404,是否接收对目标道具控件的触发操作。
步骤1405,呼出笔记本,扫描并显示各个虚拟对象在虚拟环境中的位置。
其中,笔记本即上文实施例中的投掷路线设置控件。该笔记本中显示有虚拟环境地图,同时基于扫描到的各个虚拟对象的地理位置,将虚拟对象标识显示在虚拟环境地图中。
步骤1406,是否确定出目标投掷路线。
步骤1407,从目标投掷路线的路线起点投掷空投类虚拟道具。
步骤1408,空投类虚拟道具在落地过程中是否与虚拟对象发生碰撞。
在空投类虚拟道具落地过程中，若空投类虚拟道具直接与虚拟对象发生碰撞，则虚拟对象生命值降为0，但是空投类虚拟道具不会被触发，继续下落，直至与虚拟障碍物发生碰撞，才会被触发，同时产生燃烧区域和烟雾，进入燃烧区域的虚拟对象的生命值会被减少，而烟雾可以阻隔虚拟对象的视线范围。
可选的,空投类虚拟道具在下落过程中,也会产生拖尾烟雾。
步骤1409,虚拟对象生命值减至0。
步骤1410,空投类虚拟道具继续下落。
步骤1411,空投类虚拟道具在下落过程中是否与虚拟障碍物发生碰撞。
步骤1412,空投类虚拟道具被触发并产生燃烧区域和烟雾。
其中,燃烧区域即上文实施例中的道具作用范围。
步骤1413,虚拟对象是否进入燃烧区域。
步骤1414,对虚拟对象的生命值进行减少。
图15是本申请一个示例性实施例提供的控制虚拟对象使用虚拟道具的装置的结构框图,该装置包括:
第一显示模块1501,用于响应于对目标道具控件的触发操作,显示投掷路线设置控件,所述目标道具控件为空投类虚拟道具对应的使用控件,所述投掷路线设置控件展示有虚拟环境地图;
第二显示模块1502,用于响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,所述手势操作包括第一操作位置和第二操作位置,所述目标投掷路线经过所述第一操作位置和所述第二操作位置;
第三显示模块1503,用于在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,所述空投类虚拟道具用于改变虚拟对象的属性值。
可选的,所述第三显示模块1503,包括:
映射单元,用于基于所述虚拟环境地图与所述虚拟环境之间位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
第一投掷单元,用于按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具;
    第一显示单元，用于在所述虚拟环境中显示沿所述实际投掷路线投掷的所述空投类虚拟道具。
可选的,所述映射单元,还用于:
获取所述目标投掷路线在所述虚拟环境地图中的路线起点和路线终点;
基于所述位置映射关系,确定所述路线起点在所述虚拟环境中的第一位置坐标,以及所述路线终点在所述虚拟环境中的第二位置坐标;
根据所述第一位置坐标和所述第二位置坐标,确定所述虚拟环境中的所述实际投掷路线。
可选的,所述装置还包括:
确定模块,用于基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
发送模块,用于向服务器上报所述实际投掷路线,所述服务器用于按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具。
可选的,所述第二显示模块1502,包括:
第一获取单元,用于响应于所述虚拟环境地图内的第一操作信号和第二操作信号,获取所述第一操作信号对应的所述第一操作位置以及所述第二操作信号对应的所述第二操作位置;
第一确定单元，用于基于所述第一操作位置和所述第二操作位置，在所述虚拟环境地图中显示候选投掷路线；
第二确定单元,用于响应于所述第一操作信号和所述第二操作信号消失,根据信号消失时刻的所述第一操作位置和所述第二操作位置确定所述目标投掷路线,以及在所述虚拟环境地图中显示所述目标投掷路线。
可选的,所述装置还包括:
第一获取模块,用于获取各个所述虚拟对象在所述虚拟环境中的地理位置,所述虚拟环境中包括第一虚拟对象和第二虚拟对象,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
第四显示模块,用于基于所述地理位置,在所述虚拟环境地图中显示虚拟对象标识,其中,属于不同阵营的所述虚拟对象对应不同虚拟对象标识。
可选的,所述第三显示模块1503,包括:
第二获取单元,用于获取所述第二虚拟对象在所述目标投掷路线上的分布数量;
第二投掷单元,用于根据所述分布数量在所述虚拟环境中投掷所述空投类虚拟道具,其中,所述空投类虚拟道具的投掷数量与所述分布数量呈正相关关系;
第二显示单元,用于在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
可选的,所述装置还包括:
第二获取模块,用于获取第一虚拟对象在目标时间段内击败的第二虚拟对象的数量,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
第一设置模块,用于响应于所述数量低于数量阈值,将所述目标道具控件设置为不可触发状态;
第二设置模块,用于响应于所述数量高于所述数量阈值,将所述目标道具控件设置为可触发状态。
可选的,所述第三显示模块1503,还包括:
第三获取单元,用于获取所述空投类虚拟道具对应的投掷属性信息,所述投掷属性信息包括预设投掷间距和预设投掷数量中的至少一种;
第三投掷单元,用于根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具;
    第三显示单元，用于在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
可选的,所述投掷属性信息为所述预设投掷间距;
所述第三投掷单元,还用于:
根据所述预设投掷间距以及所述目标投掷路线对应的路线长度,确定所述空投类虚拟道具对应的目标投掷数量,所述目标投掷数量与所述目标投掷路线的路线长度呈正相关关系;
按照所述目标投掷路线在所述虚拟环境中投掷所述目标投掷数量的所述空投类虚拟道具。
可选的,所述投掷属性信息为所述预设投掷数量;
所述第三投掷单元,还用于:
根据所述目标投掷路线对应的路线长度以及所述预设投掷数量,确定所述空投类虚拟道具对应的目标投掷间距,所述目标投掷间距与所述目标投掷路线对应的路线长度呈正相关关系;
按照所述目标投掷间距在所述虚拟环境中投掷所述空投类虚拟道具。
可选的,所述投掷属性信息为所述预设投掷数量和所述预设投掷间距;
所述第三投掷单元,还用于:
根据所述目标投掷路线对应的路线长度、所述预设投掷数量以及所述预设投掷间距,确定所述空投类虚拟道具对应的目标单位投掷数量,所述目标单位投掷数量与所述目标投掷路线对应的路线长度呈负相关关系;
按照所述目标单位投掷数量和所述预设投掷间距在所述虚拟环境中投掷所述空投类虚拟道具。
可选的,所述装置还包括:
第五显示模块,用于响应于所述空投类虚拟道具与虚拟障碍物发生碰撞,显示道具作用范围,所述道具作用范围是以所述空投类虚拟道具的碰撞点为中心,以预设距离为半径的圆形区域;
控制模块,用于响应于所述虚拟对象位于所述道具作用范围内,改变所述虚拟对象的所述属性值。
综上所述,通过在虚拟道具中引入空投类虚拟道具,且用户可以通过手势操作在虚拟环境地图中规划出一条投掷路线,使得空投类虚拟道具可以沿目标投掷路线进行投掷,相比于相关技术中仅能定点投掷虚拟道具,本申请实施例提供的空投类虚拟道具可以沿指定路线投掷,一方面扩大了虚拟道具的投掷范围,使得虚拟道具的投掷范围不易被其他虚拟对象躲避,从而提高了虚拟道具的命中率;另一方面,当某些虚拟对象采取蹲守或远程攻击策略时,使用该空投类虚拟道具可以对这类虚拟对象进行远程且大范围的攻击,提高对这类虚拟对象的命中率,从而加快对局进程,并有效控制单局时长,进而降低服务器的处理压力。
请参考图16,其示出了本申请一个示例性实施例提供的终端1600的结构框图。该终端1600可以是便携式移动终端,比如:智能手机、平板电脑、MP3播放器、MP4播放器。终端1600还可能被称为用户设备、便携式终端等其他名称。
通常,终端1600包括有:处理器1601和存储器1602。
处理器1601可以包括一个或多个处理核心，比如4核心处理器、8核心处理器等。处理器1601可以采用数字信号处理（Digital Signal Processing，DSP）、现场可编程门阵列（Field-Programmable Gate Array，FPGA）、可编程逻辑阵列（Programmable Logic Array，PLA）中的至少一种硬件形式来实现。处理器1601也可以包括主处理器和协处理器，主处理器是用于对在唤醒状态下的数据进行处理的处理器，也称中央处理器（Central Processing Unit，CPU）；协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中，处理器1601可以集成有图像处理器（Graphics Processing Unit，GPU），GPU用于负责显示屏所需要显示的内容的渲染和绘制。在一些实施例中，处理器1601还可以包括人工智能（Artificial Intelligence，AI）处理器，该AI处理器用于处理有关机器学习的计算操作。
存储器1602可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器1602还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1602中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1601所执行以实现本申请实施例提供的方法。
在一些实施例中,终端1600还可选包括有:外围设备接口1603和至少一个外围设备。具体地,外围设备包括:射频电路1604、触摸显示屏1605、摄像头组件1606、音频电路1607、定位组件1608和电源1609中的至少一种。
在一些实施例中,终端1600还包括有一个或多个传感器1610。该一个或多个传感器1610包括但不限于:加速度传感器1611、陀螺仪传感器1612、压力传感器1613、指纹传感器1614、光学传感器1615以及接近传感器1616。
本领域技术人员可以理解,图16中示出的结构并不构成对终端1600的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请实施例还提供了一种计算机可读存储介质,该计算机可读存储介质存储有至少一条指令,所述至少一条指令由所述处理器加载并执行以实现如上各个实施例所述的控制虚拟对象使用虚拟道具的方法。
根据本申请的一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。终端的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该终端执行上述方面的各种可选实现方式中提供的控制虚拟对象使用虚拟道具的方法。
本领域技术人员应该可以意识到,在上述一个或多个示例中,本申请实施例所描述的功能可以用硬件、软件、固件或它们的任意组合来实现。当使用软件实现时,可以将这些功能存储在计算机可读存储介质中或者作为计算机可读存储介质上的一个或多个指令或代码进行传输。计算机可读存储介质包括计算机存储介质和通信介质,其中通信介质包括便于从一个地方向另一个地方传送计算机程序的任何介质。存储介质可以是通用或专用计算机能够存取的任何可用介质。
以上所述仅为本申请的可选实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (29)

  1. 一种控制虚拟对象使用虚拟道具的方法,其特征在于,所述方法应用于终端,所述方法包括:
    响应于对目标道具控件的触发操作,显示投掷路线设置控件,所述目标道具控件为空投类虚拟道具对应的使用控件,所述投掷路线设置控件展示有虚拟环境地图;
    响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,所述手势操作包括第一操作位置和第二操作位置,所述目标投掷路线经过所述第一操作位置和所述第二操作位置;
    在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,所述空投类虚拟道具用于改变虚拟对象的属性值。
  2. 根据权利要求1所述的方法,其特征在于,所述在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,包括:
    基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
    按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具;
    在所述虚拟环境中显示沿所述实际投掷路线投掷的所述空投类虚拟道具。
  3. 根据权利要求2所述的方法,其特征在于,所述基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线,包括:
    获取所述目标投掷路线在所述虚拟环境地图中的路线起点和路线终点;
    基于所述位置映射关系,确定所述路线起点在所述虚拟环境中的第一位置坐标,以及所述路线终点在所述虚拟环境中的第二位置坐标;
    根据所述第一位置坐标和所述第二位置坐标,确定所述虚拟环境中的所述实际投掷路线。
  4. 根据权利要求1所述的方法,其特征在于,所述在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具之前,所述方法还包括:
    基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
    向服务器上报所述实际投掷路线,所述服务器用于按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具。
  5. 根据权利要求1至3任一所述的方法,其特征在于,所述响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,包括:
    响应于所述虚拟环境地图内的第一操作信号和第二操作信号,获取所述第一操作信号对应的所述第一操作位置,以及所述第二操作信号对应的所述第二操作位置;
    基于所述第一操作位置和所述第二操作位置,在所述虚拟环境地图中显示候选投掷路线;
    响应于所述第一操作信号和所述第二操作信号消失,根据信号消失时刻的所述第一操作位置和所述第二操作位置确定所述目标投掷路线,以及在所述虚拟环境地图中显示所述目标投掷路线。
  6. 根据权利要求1至3任一所述的方法,其特征在于,所述响应于对目标道具控件的触发操作,显示投掷路线设置控件之后,所述方法还包括:
    获取各个所述虚拟对象在所述虚拟环境中的地理位置,所述虚拟环境中包括第一虚拟对象和第二虚拟对象,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
    基于所述地理位置,在所述虚拟环境地图中显示虚拟对象标识,其中,属于不同阵营的所述虚拟对象对应不同虚拟对象标识。
  7. 根据权利要求6所述的方法,其特征在于,所述在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,包括:
    获取所述第二虚拟对象在所述目标投掷路线上的分布数量;
    根据所述分布数量在所述虚拟环境中投掷所述空投类虚拟道具,其中,所述空投类虚拟道具的投掷数量与所述分布数量呈正相关关系;
    在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
  8. 根据权利要求1至3任一所述的方法,其特征在于,所述响应于对目标道具控件的触发操作之前,所述方法还包括:
    获取第一虚拟对象在目标时间段内击败的第二虚拟对象的数量,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
    响应于所述数量低于数量阈值,将所述目标道具控件设置为不可触发状态;
    响应于所述数量高于所述数量阈值,将所述目标道具控件设置为可触发状态。
  9. 根据权利要求1至3任一所述的方法,其特征在于,所述在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,还包括:
    获取所述空投类虚拟道具对应的投掷属性信息,所述投掷属性信息包括预设投掷间距和预设投掷数量中的至少一种;
    根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具;
    在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
  10. 根据权利要求9所述的方法,其特征在于,所述投掷属性信息为所述预设投掷间距;
    所述根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具,包括:
    根据所述预设投掷间距以及所述目标投掷路线对应的路线长度,确定所述空投类虚拟道具对应的目标投掷数量,所述目标投掷数量与所述目标投掷路线的路线长度呈正相关关系;
    按照所述目标投掷路线在所述虚拟环境中投掷所述目标投掷数量的所述空投类虚拟道具。
  11. 根据权利要求9所述的方法,其特征在于,所述投掷属性信息为所述预设投掷数量;
    所述根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具,还包括:
    根据所述目标投掷路线对应的路线长度以及所述预设投掷数量,确定所述空投类虚拟道具对应的目标投掷间距,所述目标投掷间距与所述目标投掷路线对应的路线长度呈正相关关系;
    按照所述目标投掷间距在所述虚拟环境中投掷所述空投类虚拟道具。
  12. 根据权利要求9所述的方法,其特征在于,所述投掷属性信息为所述预设投掷数量和所述预设投掷间距;
    所述根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具,包括:
    根据所述目标投掷路线对应的路线长度、所述预设投掷数量以及所述预设投掷间距,确定所述空投类虚拟道具对应的目标单位投掷数量,所述目标单位投掷数量与所述目标投掷路线对应的路线长度呈负相关关系;
    按照所述目标单位投掷数量和所述预设投掷间距,在所述虚拟环境中投掷所述空投类虚拟道具。
  13. 根据权利要求1至3任一所述的方法,其特征在于,所述在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具之后,所述方法还包括:
    响应于所述空投类虚拟道具与虚拟障碍物发生碰撞，显示道具作用范围，所述道具作用范围是以所述空投类虚拟道具的碰撞点为中心，以预设距离为半径的圆形区域；
    响应于所述虚拟对象位于所述道具作用范围内,改变所述虚拟对象的所述属性值。
  14. 一种控制虚拟对象使用虚拟道具的装置,其特征在于,所述装置包括:
    第一显示模块,用于响应于对目标道具控件的触发操作,显示投掷路线设置控件,所述目标道具控件为空投类虚拟道具对应的使用控件,所述投掷路线设置控件展示有虚拟环境地图;
    第二显示模块,用于响应于对所述投掷路线设置控件的手势操作,在所述虚拟环境地图中显示所述空投类虚拟道具对应的目标投掷路线,所述手势操作包括第一操作位置和第二操作位置,所述目标投掷路线经过所述第一操作位置和所述第二操作位置;
    第三显示模块,用于在虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具,所述空投类虚拟道具用于改变虚拟对象的属性值。
  15. 根据权利要求14所述的装置,其特征在于,所述第三显示模块,包括:
    映射单元,用于基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
    第一投掷单元,用于按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具;
    第一显示单元,用于在所述虚拟环境中显示沿所述实际投掷路线投掷的所述空投类虚拟道具。
  16. 根据权利要求15所述的装置,其特征在于,所述映射单元,还用于:
    获取所述目标投掷路线在所述虚拟环境地图中的路线起点和路线终点;
    基于所述位置映射关系,确定所述路线起点在所述虚拟环境中的第一位置坐标,以及所述路线终点在所述虚拟环境中的第二位置坐标;
    根据所述第一位置坐标和所述第二位置坐标,确定所述虚拟环境中的所述实际投掷路线。
  17. 根据权利要求14所述的装置,其特征在于,所述装置还包括:
    确定模块,用于基于所述虚拟环境地图与所述虚拟环境之间的位置映射关系,确定所述目标投掷路线在所述虚拟环境中对应的实际投掷路线;
    发送模块,用于向服务器上报所述实际投掷路线,所述服务器用于按照所述实际投掷路线在所述虚拟环境中投掷所述空投类虚拟道具。
  18. 根据权利要求14至16任一所述的装置,其特征在于,所述第二显示模块,包括:
    第一获取单元,用于响应于所述虚拟环境地图内的第一操作信号和第二操作信号,获取所述第一操作信号对应的所述第一操作位置,以及所述第二操作信号对应的所述第二操作位置;
    第一确定单元,用于基于所述第一操作位置和所述第二操作位置,在所述虚拟环境地图中显示候选投掷路线;
    第二确定单元,用于响应于所述第一操作信号和所述第二操作信号消失,根据信号消失时刻的所述第一操作位置和所述第二操作位置确定所述目标投掷路线,以及在所述虚拟环境地图中显示所述目标投掷路线。
  19. 根据权利要求14至16任一所述的装置,其特征在于,所述装置还包括:
    第一获取模块,用于获取各个所述虚拟对象在所述虚拟环境中的地理位置,所述虚拟环境中包括第一虚拟对象和第二虚拟对象,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
    第四显示模块,用于基于所述地理位置,在所述虚拟环境地图中显示虚拟对象标识,其中,属于不同阵营的所述虚拟对象对应不同虚拟对象标识。
  20. 根据权利要求19所述的装置,其特征在于,所述第三显示模块,包括:
    第二获取单元,用于获取所述第二虚拟对象在所述目标投掷路线上的分布数量;
    第二投掷单元，用于根据所述分布数量在所述虚拟环境中投掷所述空投类虚拟道具，其中，所述空投类虚拟道具的投掷数量与所述分布数量呈正相关关系；
    第二显示单元,用于在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
  21. 根据权利要求14至16任一所述的装置,其特征在于,所述装置还包括:
    第二获取模块,用于获取第一虚拟对象在目标时间段内击败的第二虚拟对象的数量,所述第一虚拟对象和所述第二虚拟对象属于不同阵营;
    第一设置模块,用于响应于所述数量低于数量阈值,将所述目标道具控件设置为不可触发状态;
    第二设置模块,用于响应于所述数量高于所述数量阈值,将所述目标道具控件设置为可触发状态。
  22. 根据权利要求14至16任一所述的装置,其特征在于,所述第三显示模块,还包括:
    第三获取单元,用于获取所述空投类虚拟道具对应的投掷属性信息,所述投掷属性信息包括预设投掷间距和预设投掷数量中的至少一种;
    第三投掷单元,用于根据所述投掷属性信息和所述目标投掷路线,在所述虚拟环境中投掷所述空投类虚拟道具;
    第三显示单元,用于在所述虚拟环境中显示沿所述目标投掷路线投掷的所述空投类虚拟道具。
  23. 根据权利要求22所述的装置,其特征在于,所述投掷属性信息为所述预设投掷间距;
    所述第三投掷单元,还用于:
    根据所述预设投掷间距以及所述目标投掷路线对应的路线长度,确定所述空投类虚拟道具对应的目标投掷数量,所述目标投掷数量与所述目标投掷路线的路线长度呈正相关关系;
    按照所述目标投掷路线在所述虚拟环境中投掷所述目标投掷数量的所述空投类虚拟道具。
  24. 根据权利要求22所述的装置,其特征在于,所述投掷属性信息为所述预设投掷数量;
    所述第三投掷单元,还用于:
    根据所述目标投掷路线对应的路线长度以及所述预设投掷数量,确定所述空投类虚拟道具对应的目标投掷间距,所述目标投掷间距与所述目标投掷路线对应的路线长度呈正相关关系;
    按照所述目标投掷间距在所述虚拟环境中投掷所述空投类虚拟道具。
  25. 根据权利要求22所述的装置,其特征在于,所述投掷属性信息为所述预设投掷数量和所述预设投掷间距;
    所述第三投掷单元,还用于:
    根据所述目标投掷路线对应的路线长度、所述预设投掷数量以及所述预设投掷间距,确定所述空投类虚拟道具对应的目标单位投掷数量,所述目标单位投掷数量与所述目标投掷路线对应的路线长度呈负相关关系;
    按照所述目标单位投掷数量和所述预设投掷间距在所述虚拟环境中投掷所述空投类虚拟道具。
  26. 根据权利要求14至16任一所述的装置,其特征在于,所述装置还包括:
    第五显示模块,用于响应于所述空投类虚拟道具与虚拟障碍物发生碰撞,显示道具作用范围,所述道具作用范围是以所述空投类虚拟道具的碰撞点为中心,以预设距离为半径的圆形区域;
    第二控制模块,用于响应于所述虚拟对象位于所述道具作用范围内,改变所述虚拟对象的所述属性值。
  27. 一种终端，其特征在于，所述终端包括处理器和存储器，所述存储器中存储有至少一段程序，所述至少一段程序由所述处理器加载并执行以实现如权利要求1至13任一所述的控制虚拟对象使用虚拟道具的方法。
  28. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有至少一段程序,所述至少一段程序由处理器加载并执行以实现如权利要求1至13任一所述的控制虚拟对象使用虚拟道具的方法。
  29. 一种计算机程序产品,其特征在于,所述计算机程序产品包括计算机程序或计算机指令,所述计算机程序或所述计算机指令被处理器执行以实现如权利要求1至13任一所述的控制虚拟对象使用虚拟道具的方法。
PCT/CN2021/116014 2020-09-17 2021-09-01 控制虚拟对象使用虚拟道具的方法、装置、终端及介质 WO2022057624A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/984,114 US20230068653A1 (en) 2020-09-17 2022-11-09 Method and apparatus for controlling virtual object to use virtual prop, terminal, and medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010983118.6 2020-09-17
CN202010983118.6A CN112076467B (zh) 2020-09-17 2020-09-17 控制虚拟对象使用虚拟道具的方法、装置、终端及介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/984,114 Continuation US20230068653A1 (en) 2020-09-17 2022-11-09 Method and apparatus for controlling virtual object to use virtual prop, terminal, and medium

Publications (1)

Publication Number Publication Date
WO2022057624A1 true WO2022057624A1 (zh) 2022-03-24

Family

ID=73737354

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/116014 WO2022057624A1 (zh) 2020-09-17 2021-09-01 控制虚拟对象使用虚拟道具的方法、装置、终端及介质

Country Status (3)

Country Link
US (1) US20230068653A1 (zh)
CN (1) CN112076467B (zh)
WO (1) WO2022057624A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024037559A1 (zh) * 2022-08-18 2024-02-22 北京字跳网络技术有限公司 信息交互方法、人机交互方法、装置、电子设备和存储介质

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112076467B (zh) * 2020-09-17 2023-03-10 腾讯科技(深圳)有限公司 控制虚拟对象使用虚拟道具的方法、装置、终端及介质
CN112587927B (zh) * 2020-12-29 2023-07-07 苏州幻塔网络科技有限公司 道具的控制方法和装置、电子设备和存储介质
CN113101648B (zh) * 2021-04-14 2023-10-24 北京字跳网络技术有限公司 一种基于地图的交互方法、设备及存储介质
CN113318438B (zh) * 2021-06-30 2023-08-15 北京字跳网络技术有限公司 虚拟道具控制方法、装置、设备和计算机可读存储介质
CN113633972B (zh) * 2021-08-31 2023-07-21 腾讯科技(深圳)有限公司 虚拟道具的使用方法、装置、终端及存储介质
CN113680061B (zh) * 2021-09-03 2023-07-25 腾讯科技(深圳)有限公司 虚拟道具的控制方法、装置、终端及存储介质
CN114939275A (zh) * 2022-05-24 2022-08-26 北京字跳网络技术有限公司 对象交互的方法、装置、设备和存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060094502A1 (en) * 2004-11-02 2006-05-04 Nintendo Co., Ltd. Video game device and video game program
CN108245888A (zh) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置及计算机设备
CN110585712A (zh) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 在虚拟环境中投掷虚拟爆炸物的方法、装置、终端及介质
CN111135566A (zh) * 2019-12-06 2020-05-12 腾讯科技(深圳)有限公司 虚拟道具的控制方法和装置、存储介质及电子装置
CN112076467A (zh) * 2020-09-17 2020-12-15 腾讯科技(深圳)有限公司 控制虚拟对象使用虚拟道具的方法、装置、终端及介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6581341B2 (ja) * 2014-10-15 2019-09-25 任天堂株式会社 情報処理装置、情報処理プログラム、情報処理方法、および情報処理システム
CN109364475A (zh) * 2017-12-15 2019-02-22 鲸彩在线科技(大连)有限公司 虚拟角色控制方法、装置、终端、系统及介质
CN108351652A (zh) * 2017-12-26 2018-07-31 深圳市道通智能航空技术有限公司 无人飞行器路径规划方法、装置和飞行管理方法、装置
CN108295466B (zh) * 2018-03-08 2021-09-07 网易(杭州)网络有限公司 虚拟对象运动控制方法、装置、电子设备及存储介质
CN109200582A (zh) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 控制虚拟对象与投掷物交互的方法、装置及存储介质
CN109911405B (zh) * 2019-02-22 2024-04-19 广东佰合包装科技有限公司 用于低空空投的货物包装装置、包装方法
CN110507990B (zh) * 2019-09-19 2021-08-06 腾讯科技(深圳)有限公司 基于虚拟飞行器的互动方法、装置、终端及存储介质
CN111111218A (zh) * 2019-12-19 2020-05-08 腾讯科技(深圳)有限公司 虚拟无人机的控制方法和装置、存储介质及电子装置

Also Published As

Publication number Publication date
CN112076467A (zh) 2020-12-15
US20230068653A1 (en) 2023-03-02
CN112076467B (zh) 2023-03-10

Similar Documents

Publication Publication Date Title
WO2022057624A1 (zh) 控制虚拟对象使用虚拟道具的方法、装置、终端及介质
WO2021213026A1 (zh) 虚拟对象的控制方法、装置、设备及存储介质
WO2022017063A1 (zh) 控制虚拟对象恢复属性值的方法、装置、终端及存储介质
WO2022083449A1 (zh) 虚拟投掷道具的使用方法、装置、终端及存储介质
WO2021244322A1 (zh) 瞄准虚拟对象的方法、装置、设备及存储介质
US20220168647A1 (en) Virtual prop control method and apparatus, storage medium and electronic device
US9833712B2 (en) Game system, server system, processing method, and information storage medium
WO2022156486A1 (zh) 虚拟道具的投放方法、装置、终端、存储介质及程序产品
WO2022242400A1 (zh) 虚拟对象的技能释放方法、装置、设备、介质及程序产品
US20220305384A1 (en) Data processing method in virtual scene, device, storage medium, and program product
US20230057151A1 (en) Virtual object control method and apparatus, device, storage medium, and program product
US20230330530A1 (en) Prop control method and apparatus in virtual scene, device, and storage medium
WO2022105480A1 (zh) 虚拟对象控制方法、装置、终端、存储介质及程序产品
JP2023164787A (ja) 仮想環境の画面表示方法、装置、機器及びコンピュータプログラム
CN111202983A (zh) 虚拟环境中的道具使用方法、装置、设备及存储介质
US20230030619A1 (en) Method and apparatus for displaying aiming mark
US20230016383A1 (en) Controlling a virtual objectbased on strength values
CN114042309B (zh) 虚拟道具的使用方法、装置、终端及存储介质
CN117298580A (zh) 虚拟对象的互动方法、装置、设备、介质及程序产品
CN114210062A (zh) 虚拟道具的使用方法、装置、终端、存储介质及程序产品
CN118022330A (zh) 虚拟对象的互动方法、装置、设备、介质及程序产品
CN113680061A (zh) 虚拟道具的控制方法、装置、终端及存储介质
CN117224945A (zh) 一种信息展示的方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21868451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03/08/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21868451

Country of ref document: EP

Kind code of ref document: A1