CN112933601B - Virtual throwing object operation method, device, equipment and medium

Info

Publication number: CN112933601B
Application number: CN202110227863.2A
Authority: CN (China)
Prior art keywords: virtual, throwing, obstacle, environment, projectile
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN112933601A
Inventor: 刘智洪
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; priority to CN202110227863.2A
Publication of CN112933601A; application granted; publication of CN112933601B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/573: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game, using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/5372: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F13/837: Special adaptations for executing a specific game genre or game mode; shooting of targets
    • A63F2300/8076: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game; shooting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method, an apparatus, a device, and a medium for operating a virtual throwing object, and relates to the field of virtual environments. The method comprises the following steps: displaying a virtual object holding a virtual projectile, the virtual object being in a virtual environment; receiving a pre-throwing operation for aiming the throw of the virtual projectile in the virtual environment; displaying trajectory indication information of the virtual projectile in the virtual environment based on the pre-throwing operation, the trajectory indication information being used to indicate the flight trajectory of the virtual projectile in the virtual environment; in response to receiving a throwing operation, throwing the virtual projectile to a target position in the virtual environment; and displaying a virtual obstacle at the target position, the virtual obstacle being used to block virtual injury to the virtual object from a second side when the virtual object is located on a first side of the virtual obstacle. By using the virtual throwing object to place a virtual obstacle in the virtual environment that protects the virtual object, the operational diversity of the virtual throwing object is improved.

Description

Virtual throwing object operation method, device, equipment and medium
Technical Field
The present disclosure relates to the field of virtual environments, and in particular, to a method, an apparatus, a device, and a medium for operating a virtual throwing object.
Background
In applications that include a virtual environment, a user typically performs activities in the virtual environment by controlling a virtual object, such as walking, driving, swimming, fighting, and picking up items. The virtual object can use certain virtual props to perform specific functions; for example, it can achieve different effects by throwing a carried virtual grenade, virtual flash bomb, virtual smoke bomb, and the like.
In the related art, virtual throwing props include combat props and tactical props. A combat prop is a virtual throwing prop that can cause virtual injury to an enemy virtual object, such as a virtual grenade, whereas a tactical prop does not injure the enemy virtual object but produces a certain tactical effect, such as a virtual flash bomb.
However, both types of virtual throwing props have a single mode of use and achieve only a single effect.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a device, and a medium for operating a virtual throwing object, which can improve the diversity of ways in which the virtual throwing object can be operated. The technical solution is as follows:
in one aspect, there is provided a method of operating a virtual projectile, the method comprising:
displaying a virtual object holding a virtual projectile, the virtual object being in a virtual environment;
receiving a pre-throwing operation for aiming the throw of the virtual projectile in the virtual environment;
displaying track indication information of the virtual projectile in the virtual environment based on the pre-projectile operation, the track indication information being used for indicating a flight track of the virtual projectile in the virtual environment;
in response to receiving a throwing operation, throwing the virtual projectile to a target location in the virtual environment;
displaying a virtual obstacle at the target location, the virtual obstacle to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
In another aspect, there is provided an operating device for a virtual projectile, the device comprising:
a display module, configured to display a virtual object holding the virtual throwing object, the virtual object being in a virtual environment;
a receiving module, configured to receive a pre-throwing operation for aiming the throw of the virtual throwing object in the virtual environment;
The display module is further used for displaying track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, wherein the track indication information is used for indicating the flight track of the virtual throwing object in the virtual environment;
the display module is further configured to throw the virtual throwing object to a target position in the virtual environment in response to receiving a throwing operation;
the display module is further configured to display a virtual obstacle at the target location, the virtual obstacle configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for operating a virtual projectile according to any one of the embodiments of the present application.
In another aspect, a computer readable storage medium having at least one piece of program code stored therein is provided, the program code loaded and executed by a processor to implement the method of operating a virtual projectile in accordance with any of the embodiments of the present application.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method of operating the virtual projectile according to any one of the above embodiments.
The technical scheme provided by the application at least comprises the following beneficial effects:
in a virtual environment that includes a virtual object, the virtual object holding a virtual throwing object is controlled to throw the virtual throwing object to a target position in the virtual environment, so that a virtual obstacle is displayed at the target position. The virtual obstacle protects a virtual object located on its first side from virtual injury coming from the second side, the first side being opposite to the second side, which improves the operational diversity of the virtual throwing object and enriches its functions.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of operation of a virtual projectile provided in one exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of an equipment interface provided in accordance with an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a method of operation of a virtual projectile provided in accordance with another exemplary embodiment of the present application;
FIG. 6 is a first preset range schematic diagram of a virtual obstacle provided in an exemplary embodiment of the present application;
FIG. 7 is a flowchart of a method of operation of a virtual projectile provided in accordance with another exemplary embodiment of the present application;
FIG. 8 is a schematic illustration of a throwing trace provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of a virtual obstacle provided in an exemplary embodiment of the present application;
FIG. 10 is a schematic diagram of virtual obstacle orientation determination provided by an exemplary embodiment of the present application;
FIG. 11 is a flow chart corresponding to a method of operating a virtual projectile provided in an exemplary embodiment of the present application;
FIG. 12 is a block diagram of the operating device for a virtual projectile provided in one exemplary embodiment of the present application;
FIG. 13 is a block diagram of the operating device for a virtual projectile in accordance with another exemplary embodiment of the present application;
fig. 14 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the terms involved in the embodiments of the present application will be briefly described:
Virtual environment: the virtual environment displayed (or provided) by an application when it runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments take a three-dimensional virtual environment as an example, but are not limited thereto. Optionally, the virtual environment is also used for a battle between at least two virtual characters. Optionally, the battle between the at least two virtual characters is fought with virtual firearms. Optionally, the battle is fought with virtual firearms within a target area that keeps shrinking over time in the virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, a cartoon character, and the like, such as a character, animal, plant, oil drum, wall, or stone displayed in the three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment.
Virtual throwing object: a prop whose target function is triggered by a virtual object throwing it in the virtual environment. Illustratively, virtual throwing objects are divided by function into combat props and tactical props. A combat prop is a throwing prop that can cause virtual injury to a virtual object, for example a virtual torch, a virtual incendiary bomb, or a virtual sticky bomb. A tactical prop is a throwing prop that does not cause virtual injury to a virtual object but produces a functional effect, for example a virtual smoke bomb or a virtual shock bomb. Optionally, a throwing prop triggers its target function when its thrown duration reaches a preset duration, or when it is thrown and a collision occurs. Taking a virtual grenade as a combat prop and a virtual flash bomb as a tactical prop for illustration: the virtual grenade is a prop that triggers its detonation function when its thrown duration in the virtual environment reaches the preset duration; the player controls the target virtual object to throw the virtual grenade, and when the thrown duration reaches the preset duration, the virtual grenade detonates in the virtual environment and injures virtual objects located within a preset distance of the detonation point. The virtual flash bomb is a prop that triggers its flash function when it is thrown in the virtual environment and a collision event occurs; the player controls the target virtual object to throw the virtual flash bomb, the flash function is triggered when the virtual flash bomb lands in the virtual environment, and the vision of virtual objects within the flash range is blocked.
The methods provided in the present application may be applied to virtual reality applications, three-dimensional map programs, first-person shooting games (FPS), third-person shooting games (TPS), multiplayer online battle arena games (MOBA), and the like; the following embodiments take application in games as an example.
Games based on a virtual environment usually consist of one or more maps of the game world. The virtual environment in the game simulates real-world scenes, and the user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch between virtual weapons, attack other virtual objects with a virtual weapon, and so on in the virtual environment, which is highly interactive, and multiple users can form online teams for competitive play. When the user controls the virtual object to attack a target virtual object with a virtual weapon, the user selects a suitable virtual weapon according to the position of the target virtual object or the user's operating habits. The virtual weapon includes at least one of a virtual firearm, a virtual melee weapon, and a virtual throwing weapon, where the virtual firearm includes firearms of the rifle, sniper rifle, pistol, or shotgun type, the virtual melee weapon includes at least one of a dagger, knife, axe, sword, stick, and pot (e.g., a pan), and the virtual throwing weapon includes a virtual grenade, a virtual sticky grenade, a virtual flash bomb, a virtual smoke bomb, and the like.
The terminal in the present application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and the like. An application supporting a virtual environment, such as an application supporting a three-dimensional virtual environment, is installed and runs on the terminal. The application may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, and a MOBA game. Optionally, the application may be a stand-alone application, such as a stand-alone 3D game, or an online network application.
Fig. 1 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 100 includes: an operating system 120 and application programs 122.
Operating system 120 is the underlying software that provides applications 122 with secure access to computer hardware.
The application 122 is an application supporting a virtual environment. Alternatively, the application 122 is an application that supports a three-dimensional virtual environment. The application 122 may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, a MOBA game, and a multiplayer warfare survival game. The application 122 may be a stand-alone application, such as a stand-alone 3D game program.
FIG. 2 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 200 includes: a terminal 210, a server 220, and a communication network 230.
The terminal 210 installs and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map program, a TPS game, an FPS game, a MOBA game, and a multiplayer warfare survival game. The user may perform an activity by controlling a virtual object located in the virtual environment using the terminal 210. Such activities include, but are not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, throwing. Illustratively, the virtual object is a virtual character, such as a simulated character or a cartoon character.
The terminal 210 is connected to the server 220 through a communication network 230. The communication network 230 includes a wireless network or a wired network. The device types corresponding to the terminal 210 include: at least one of a game console, a desktop computer, a smart phone, a tablet computer, an electronic book reader, an MP3 player, an MP4 player, and a laptop portable computer.
Server 220 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Server 220 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 220 takes on primary computing work and the terminal 210 takes on secondary computing work; alternatively, the server 220 takes on secondary computing work and the terminal 210 takes on primary computing work; alternatively, the server 220 and the terminal 210 perform cooperative computing by using a distributed computing architecture.
The user logs in to the application corresponding to the virtual environment through the terminal 210, and the terminal 210 establishes a long connection with the server 220. The server 220 authenticates a request sent by the terminal 210; if the request is legal, the server 220 processes the request and returns the processing result to the terminal 210. Schematically, the user controls the virtual object in the virtual environment to use the virtual throwing object, the terminal 210 generates a corresponding use request and sends it to the server 220, the server 220 calculates the drop point of the virtual throwing object according to the use request and returns the logic processing result to the terminal 210, and after receiving the result the terminal 210 displays the target function of the virtual throwing object accordingly.
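The following minimal sketch illustrates the terminal-to-server exchange described above; the message fields, function names, and the simplified drop-point calculation are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ThrowUseRequest:
    account_id: str          # target account controlling the virtual object
    throw_direction: tuple   # horizontal direction chosen in the pre-throwing stage
    throw_angle: float       # throwing angle in degrees

@dataclass
class ThrowUseResponse:
    legal: bool              # whether the request passed authentication
    target_position: tuple   # drop point computed by the server

def handle_use_request(req: ThrowUseRequest) -> ThrowUseResponse:
    # Server side: authenticate the request, then resolve the drop point.
    if not req.account_id:
        return ThrowUseResponse(legal=False, target_position=(0.0, 0.0, 0.0))
    # Placeholder landing estimate; a real server would run the trajectory simulation.
    drop_point = (req.throw_direction[0] * 10.0, 0.0, req.throw_direction[1] * 10.0)
    return ThrowUseResponse(legal=True, target_position=drop_point)

# Terminal side: build the request and display the returned target position.
response = handle_use_request(ThrowUseRequest("player-001", (1.0, 0.0), 45.0))
print(response.target_position)
```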
Referring to fig. 3, a flowchart of a method for operating a virtual throwing object according to an exemplary embodiment is shown. The method is described, by way of example, as being applied to a terminal, and includes:
step 301, a virtual object holding a virtual projectile is displayed.
The terminal displays a virtual environment interface, and the virtual environment interface includes a picture in which the virtual environment is observed from the perspective of the virtual object. Optionally, the picture observes the virtual environment from the first-person perspective of the virtual object, or from a third-person perspective of the virtual object. The user can control the virtual object in the virtual environment through the terminal to carry out various activities in the virtual environment, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, using skills, and attacking. The virtual object may be a simulated character or a cartoon character.
The virtual object is in the virtual environment and has a virtual throwing object. The virtual throwing object may be picked up from the virtual environment, or may be equipped in a preset interface before the virtual object enters the virtual environment, in which case the virtual object carries the virtual throwing object when it enters the virtual environment. In one example, as shown in fig. 4, before starting a game the user enters the equipment interface 400, in which a virtual prop list 410 is displayed. The virtual prop list 410 includes virtual props that the virtual object can be equipped with, including the virtual throwing object 411. The user can select the virtual throwing object 411 in the virtual prop list 410 and equip it on the virtual object corresponding to the user account through the equipment control 420; when the user controls the virtual object to enter the game, the virtual object carries the virtual throwing object 411.
The virtual throwing object is a prop for displaying a virtual obstacle in a virtual environment after being thrown, wherein the virtual throwing object can display the virtual obstacle when the throwing duration reaches a preset duration, or can display the virtual obstacle when a collision event exists in the virtual environment after being thrown. In the embodiment of the application, the virtual throwing prop can achieve a tactical effect and a combat effect.
Optionally, the virtual throwing object may be a prop thrown on the ground in the virtual environment, or may be a prop thrown on a designated position, or may be a prop thrown on any other position. Illustratively, the virtual projectile may be thrown in any one of a virtual ground, a virtual desktop, a surface of a virtual object, etc. in a virtual environment.
In the embodiment of the present application, the virtual object holding the virtual throwing object is displayed by the terminal. Illustratively, when the virtual throwing object is in the pre-equipped state (it is displayed at the waist position of the virtual object), it is switched to the equipped state through the throwing-object equipment control or a shortcut key, that is, the virtual object holding the virtual throwing object is displayed in the virtual environment picture. When the virtual throwing object is in the virtual backpack, the virtual object holding the virtual throwing object is displayed after the user performs a use operation on the virtual throwing object in the backpack interface corresponding to the virtual backpack, or the virtual throwing object is switched to the equipped state through a virtual throwing object switching control on the virtual environment interface.
Optionally, the virtual throwing object triggers its target function when the thrown duration reaches a preset duration, or when it is thrown and a collision occurs; the case of triggering the target function when the thrown duration reaches the preset duration is explained here. Optionally, the thrown duration is divided into two stages: a pre-throwing stage and a throwing stage. The pre-throwing stage is the stage in which the virtual object holds the virtual throwing object and determines its throwing direction, and the throwing stage is the stage in which the virtual object throws the virtual throwing object along the throwing direction, after which the virtual throwing object lands at the corresponding position and triggers the target function. The thrown duration may be counted from the start of the pre-throwing stage or from the start of the throwing stage.
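The two-stage thrown duration described above can be sketched as a small timer; the class and field names below are assumptions, and the timer may be configured to count from either stage.

```python
import enum

class ThrowPhase(enum.Enum):
    PRE_THROW = 1   # the virtual object holds the projectile and aims
    THROWING = 2    # the projectile has been released and is in flight

class ProjectileTimer:
    def __init__(self, preset_duration: float, count_from_pre_throw: bool = False):
        self.preset_duration = preset_duration
        self.count_from_pre_throw = count_from_pre_throw
        self.phase = ThrowPhase.PRE_THROW
        self.elapsed = 0.0

    def start_throw(self):
        # Called when the throwing operation is received and the projectile is released.
        self.phase = ThrowPhase.THROWING

    def tick(self, dt: float) -> bool:
        """Advance the timer; return True once the target function should trigger."""
        if self.phase is ThrowPhase.THROWING or self.count_from_pre_throw:
            self.elapsed += dt
        return self.elapsed >= self.preset_duration

# Example: counting from the throwing stage only.
timer = ProjectileTimer(preset_duration=3.0)
timer.tick(1.0)            # still aiming, timer does not advance
timer.start_throw()
print(timer.tick(3.0))     # True: thrown duration reached the preset duration
```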
Step 302, a pre-roll operation is received.
The pre-throwing operation is used for aiming the throw of the virtual throwing object in the virtual environment, and corresponds to the pre-throwing stage of the thrown duration. That is, through the pre-throwing operation the user can determine where the virtual throwing object would land if thrown in the current throwing direction. The drop point position in the virtual environment may be the virtual ground, a virtual wall surface, the surface of a virtual object, and so on, and may be the same as or different from the position at which the virtual throwing object is triggered. The user can predict the target position at which the virtual throwing object triggers its function based on the drop point position.
Illustratively, after the pre-throwing operation is received, the user may also cancel the aiming of the throw through a throw cancellation operation, in which case the virtual throwing object is not consumed.
The pre-throwing operation and the throwing cancellation operation can be realized through a preset control, can also be realized through a preset shortcut key, and are not limited herein.
Step 303, displaying track indication information of the virtual projectile in the virtual environment based on the pre-projectile operation.
The trajectory indication information is used to indicate the flight trajectory of the virtual projectile in the virtual environment. Illustratively, the trajectory indication information is determined by the throwing direction and throwing angle of the virtual projectile under the pre-throwing operation. In one example, the trajectory indication information is displayed as a virtual parabola with a preset transparency in the virtual environment.
Illustratively, the drop point indication information of the virtual throwing object is determined from the trajectory indication information, and the drop point indication information is used to indicate where the virtual throwing object will land in the virtual environment. In one example, the drop point position is determined from the collision point between the virtual parabola corresponding to the trajectory indication information and a virtual object in the virtual environment, and the drop point indication information is displayed in the virtual environment based on the drop point position; for example, when the virtual parabola corresponding to the virtual throwing object under the pre-throwing operation collides with a virtual wall, the drop point indication information is displayed at the collision point.
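A minimal sketch of how the trajectory indication and the drop point indication could be derived from the throwing direction and throwing angle is given below; the gravity constant, sampling step, and function names are assumptions, and a real implementation would query the engine's collision system rather than a flat ground plane.

```python
import math

GRAVITY = 9.8  # assumed constant; a real game uses its own tuned value

def trajectory_points(origin, throw_dir_xz, throw_angle_deg, speed, step=0.05, ground_y=0.0):
    """Sample the throw parabola until the first collision with the virtual ground."""
    angle = math.radians(throw_angle_deg)
    vx = speed * math.cos(angle) * throw_dir_xz[0]
    vz = speed * math.cos(angle) * throw_dir_xz[1]
    vy = speed * math.sin(angle)
    points, t = [], 0.0
    while True:
        x = origin[0] + vx * t
        y = origin[1] + vy * t - 0.5 * GRAVITY * t * t
        z = origin[2] + vz * t
        points.append((x, y, z))
        if t > 0 and y <= ground_y:      # first collision point: the drop point indication
            break
        t += step
    return points

# Example: aiming at 45 degrees along +x from shoulder height; the last sample is the drop point.
path = trajectory_points(origin=(0.0, 1.6, 0.0), throw_dir_xz=(1.0, 0.0), throw_angle_deg=45.0, speed=12.0)
print(path[-1])
```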
Step 304, in response to receiving the throwing operation, throwing the virtual projectile to a target location in the virtual environment.
The throwing operation is used to instruct the virtual object to throw the virtual throwing object to a target position in the virtual environment.
Illustratively, the virtual environment interface includes a throwing control, through which the user can control the virtual object to use the virtual throwing object; the user may also control the virtual object to use the virtual throwing object through a preset shortcut key, which is not limited herein. Taking control through the throwing control as an example, the terminal displays the throwing control superimposed on the virtual environment picture, and controls the virtual object to throw the virtual throwing object in response to receiving a trigger operation on the throwing control. The terminal receives the throwing operation through the throwing control, generates a throwing signal according to the throwing operation, and sends the throwing signal to the server; the server determines the final drop point of the virtual throwing object, that is, the target position, according to the throwing signal, and returns feedback information including the target position to the terminal; the terminal then displays the corresponding virtual environment picture according to the feedback information, that is, the virtual object throwing the virtual throwing object to the target position in the virtual environment.
Step 305, displaying a virtual obstacle at a target location.
Schematically, the virtual throwing object corresponds to a target trigger duration: after the virtual throwing object is thrown into the virtual environment, the virtual obstacle is displayed at the target position when the thrown duration reaches the target trigger duration. Alternatively, the virtual obstacle may be displayed when the virtual throwing object is thrown and a collision occurs; or its display may be triggered when a virtual object approaches, within a certain range, the virtual throwing object that has been thrown to the target position in the virtual environment; or the virtual obstacle may be displayed at the target position when the virtual throwing object has been thrown into the virtual environment and an obstacle display operation is received (for example, a trigger operation on an obstacle display control). This is not limited herein.
The functions of the virtual obstacle displayed by the virtual throwing object in the virtual environment include an attack blocking function, that is, when the virtual object is located on a first side of the virtual obstacle, the virtual obstacle is used to block virtual injury to the virtual object coming from a second side, where the first side and the second side are opposite. Schematically, the virtual obstacle is a virtual object of a preset shape, which may be a virtual wall, a virtual steel plate, a virtual explosion-proof shield, and the like, and the preset shape may be a cuboid, a cube, a sphere, or another irregular shape, which is not limited herein.
When the virtual object is on the first side of the virtual obstacle, the virtual obstacle can protect it from virtual injury coming from the second side, that is, the virtual obstacle can block attacks from a preset direction. Optionally, the virtual obstacle blocks the attack operations of hostile virtual objects (virtual objects in a hostile faction relationship with the virtual object that threw the virtual throwing object) or of teammate virtual objects (virtual objects in the same faction relationship with that virtual object); in one example, a hostile virtual object cannot penetrate the virtual obstacle with a virtual bullet fired by a virtual firearm, while a teammate virtual object can. Or the virtual obstacle blocks all attack operations in the virtual environment; in one example, neither hostile virtual objects nor teammate virtual objects can penetrate the virtual obstacle with virtual bullets fired by a virtual firearm. Or the virtual obstacle allows attack operations from one direction but blocks those from the opposite direction; in one example, the virtual obstacle includes an attack face and a defending face, and a virtual bullet can pass through the virtual obstacle from the attack face but cannot pass through it from the defending face.
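The three blocking policies described above can be summarised in a small decision function; the policy names and parameters below are assumptions for illustration.

```python
def bullet_passes(policy: str, shooter_team: str, owner_team: str,
                  from_attack_face: bool = False) -> bool:
    """Return True if the virtual bullet penetrates the virtual obstacle."""
    if policy == "block_hostile":        # teammates shoot through, hostile objects are blocked
        return shooter_team == owner_team
    if policy == "block_all":            # nobody shoots through
        return False
    if policy == "one_way":              # attack face lets bullets out, defending face blocks
        return from_attack_face
    raise ValueError(f"unknown policy: {policy}")

# Example: a hostile bullet against an obstacle using the "block_hostile" policy is blocked.
assert bullet_passes("block_hostile", shooter_team="red", owner_team="blue") is False
```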
The virtual obstacle also corresponds to a virtual life value. In one example, an attack operation of a hostile virtual object on the virtual obstacle causes damage to the virtual obstacle, that is, reduces its virtual life value; when the virtual life value drops to 0, the virtual obstacle disappears from the virtual environment. Schematically, the virtual obstacle may disappear directly from the virtual environment, or an explosion effect may be triggered when the virtual life value is cleared, and the explosion effect causes virtual injury to virtual objects within a certain range. That is, the virtual life value of the virtual obstacle is reduced in response to an attack operation of a hostile virtual object on the virtual obstacle, and the virtual obstacle is controlled to disappear from the virtual environment in response to the virtual life value being cleared.
The virtual obstacle also corresponds to a target display duration. In one example, if the virtual life value of the virtual obstacle has not been cleared before the target display duration ends, the virtual obstacle remains displayed in the virtual environment; if the virtual life value is cleared, the virtual obstacle disappears from the virtual environment even if its display duration has not reached the target display duration. Or the virtual obstacle corresponds only to the target display duration, that is, the virtual obstacle is displayed in the virtual environment until its display duration reaches the target display duration, and is controlled to disappear from the virtual environment in response to the display duration reaching the target display duration. Or, in response to the display duration of the virtual obstacle reaching the target display duration, an attack function of the virtual obstacle in the virtual environment is triggered, the attack function causing virtual injury to virtual objects within a first preset range; such a virtual object may be a hostile virtual object, a teammate virtual object, or the virtual object that threw the virtual throwing object.
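A combined lifecycle sketch of the virtual life value and the target display duration described above is given below; the class and field names are assumptions.

```python
class VirtualObstacle:
    def __init__(self, life_value: int, target_display_duration: float,
                 explode_on_expire: bool = False):
        self.life_value = life_value
        self.target_display_duration = target_display_duration
        self.explode_on_expire = explode_on_expire
        self.displayed_for = 0.0
        self.visible = True

    def take_damage(self, amount: int):
        self.life_value = max(0, self.life_value - amount)
        if self.life_value == 0:
            self.visible = False            # cleared life value removes the obstacle

    def tick(self, dt: float) -> bool:
        """Advance display time; return True if an expiry attack should be triggered."""
        if not self.visible:
            return False
        self.displayed_for += dt
        if self.displayed_for >= self.target_display_duration:
            self.visible = False            # display duration reached, obstacle disappears
            return self.explode_on_expire
        return False
```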
The virtual obstacle is also provided with a corresponding cancel control. When the virtual obstacle is successfully displayed at the target position in the virtual environment, the cancel control is displayed on the virtual environment interface; when the cancel control receives a trigger signal, the virtual obstacle in the virtual environment is controlled to disappear. That is, after the virtual obstacle has been displayed through the virtual throwing object, the user can cancel it manually.
The functions of the virtual obstacle in the virtual environment also include an attack function. After the virtual obstacle is displayed in the virtual environment, an attack trigger signal for the virtual obstacle is received, and the attack function of the virtual obstacle in the virtual environment is triggered based on the attack trigger signal. The attack function is used to indicate virtual injury to virtual objects within a first preset range of the virtual obstacle; such a virtual object may be a hostile virtual object, a teammate virtual object, or the virtual object that threw the virtual throwing object. Illustratively, the attack trigger signal may be issued by triggering a control, or triggered in response to the virtual obstacle disappearing, or triggered when a virtual object is within a certain preset range of the virtual obstacle.
Triggering the attack function of the virtual obstacle through a control means that the terminal displays a control in the virtual environment interface, the control being superimposed on the virtual environment picture in which the virtual environment is displayed, and the control being used to trigger the attack function of the virtual obstacle; a trigger operation on the control is received, and an attack trigger signal for the virtual obstacle is generated based on the trigger operation, so as to realize the attack function of the virtual obstacle in the virtual environment. Optionally, the control may be displayed in the virtual environment interface when the virtual object approaches the virtual obstacle, that is, in response to the virtual object moving within a second preset range of the virtual obstacle; or the control may be displayed on the virtual environment interface after the virtual object has thrown the virtual throwing object and the virtual obstacle has been successfully displayed in the virtual environment, that is, in response to the virtual obstacle being displayed at the target position.
Schematically, when the display duration of the virtual barrier in the virtual environment reaches the target display duration, the attack function of the virtual barrier is automatically triggered. That is, in response to the display duration of the virtual obstacle reaching the target display duration, an attack trigger signal to the virtual obstacle is generated to implement an attack function of the virtual obstacle in the virtual environment.
Illustratively, when a virtual object approaches the virtual obstacle located in the virtual environment, the attack function of the virtual obstacle is triggered automatically. That is, in response to a virtual object being within a third preset range of the virtual obstacle, an attack trigger signal for the virtual obstacle is generated to realize the attack function of the virtual obstacle in the virtual environment, where the virtual object may be a hostile virtual object, a teammate virtual object, or the virtual object that threw the virtual throwing object.
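The three sources of the attack trigger signal described above (a control operation, the display duration reaching the target display duration, and a virtual object entering the proximity range) can be sketched as a single check; the parameter names and the planar distance test are assumptions.

```python
import math

def attack_trigger_signal(control_pressed: bool,
                          displayed_for: float, target_display_duration: float,
                          obstacle_pos, nearby_object_positions, third_preset_range: float) -> bool:
    """Return True if any of the three trigger conditions is met this frame."""
    if control_pressed:
        return True                                           # trigger via the control
    if displayed_for >= target_display_duration:
        return True                                           # trigger via display duration
    for pos in nearby_object_positions:                       # trigger via proximity
        if math.hypot(pos[0] - obstacle_pos[0], pos[1] - obstacle_pos[1]) <= third_preset_range:
            return True
    return False

# Example: no control press, duration not reached, but a virtual object is 2 m away.
print(attack_trigger_signal(False, 5.0, 30.0, (0.0, 0.0), [(2.0, 0.0)], 3.0))  # True
```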
In summary, in the method for operating a virtual throwing object provided in the embodiments of the present application, in a virtual environment that includes a virtual object, the virtual object is controlled to throw the virtual throwing object to a target position in the virtual environment, so that a virtual obstacle is displayed at the target position. The virtual obstacle can protect a virtual object located on its first side from virtual injury coming from the second side, the first side being opposite to the second side, which improves the operational diversity of the virtual throwing object and enriches its functions.
Referring to fig. 5, a method for operating a virtual throwing object according to another exemplary embodiment of the present application is illustrated, where in the embodiment of the present application, an attack function may be further implemented by the virtual throwing object, and the method includes:
Step 501, a virtual object holding a virtual projectile is displayed.
The virtual object is in a virtual environment, and the virtual object holds a virtual projectile. The virtual projectile is capable of displaying a virtual obstacle after being thrown, and the functions of the virtual obstacle in the virtual environment include an attack blocking function and an attack function.
Step 502, a pre-roll operation is received.
The pre-throwing operation is used for aiming the throw of the virtual throwing object in the virtual environment. Through the pre-throwing operation the user can determine where the virtual projectile would land in the virtual environment if thrown in the current throwing direction, and can predict, based on the drop point position, the target position at which the virtual projectile triggers its function.
In step 503, trajectory indication information of the virtual projectile is displayed in the virtual environment based on the pre-throwing operation.
The trajectory indication information is used to indicate the flight trajectory of the virtual projectile in the virtual environment. Illustratively, the trajectory indication information is determined by the throwing direction and throwing angle of the virtual projectile under the pre-throwing operation.
In response to receiving the throwing operation, the virtual projectile is thrown to a target location in the virtual environment, step 504.
The throwing operation is used to instruct the virtual object to throw the virtual throwing object to a target position in the virtual environment. The terminal determines that the user's throwing operation has been received through the trigger signal of the throwing control, generates a corresponding throwing signal according to the throwing direction and throwing angle determined in the pre-throwing stage, and transmits it to the server. The server determines the final drop point of the virtual throwing object, that is, the target position, according to the throwing direction and throwing angle in the throwing signal, and returns feedback information corresponding to the target position to the terminal. The terminal then displays the corresponding virtual environment picture according to the feedback information, that is, the virtual object throwing the virtual throwing object to the target position in the virtual environment.
Step 505, displaying a virtual obstacle at a target location.
The virtual obstacle is schematically a virtual object of a preset shape, which may be a virtual wall, a virtual steel plate, a virtual explosion-proof shield, etc., and the preset shape may be a cuboid, a cube, a sphere, or other irregularly shaped object, which is not limited herein. The functions of the virtual barrier include an attack blocking function and an attack function.
Steps 506 to 508 (step 506 includes steps 5061 and 5062, and step 507 includes steps 5071 and 5072) describe three functional effects that the virtual obstacle can achieve in the virtual environment.
Step 5061, an attack trigger signal for a virtual obstacle is received.
The attack trigger signal may be issued by triggering a control, triggered when the virtual obstacle disappears, or triggered when a virtual object approaches the virtual obstacle within a certain preset range. After receiving the attack trigger signal for the virtual obstacle, the terminal generates a corresponding attack trigger request and sends it to the server; the server determines, according to the attack trigger request, the virtual objects in the virtual environment that are virtually injured by the attack function of the virtual obstacle, generates effect feedback information, and feeds it back to the terminal; the terminal displays the corresponding virtual environment picture according to the effect feedback information.
Step 5062, triggering an attack function of the virtual barrier in the virtual environment based on the attack trigger signal.
The attack function is used to indicate virtual injury to virtual objects within a first preset range of the virtual obstacle. In an example, as shown in fig. 6, a virtual obstacle 610 is displayed in a virtual environment picture 600, where the virtual obstacle 610 is a rectangular wall surface, and the first preset range is determined by the length of the virtual obstacle 610: a circle is drawn with the length of the virtual obstacle 610 as its diameter, and the resulting circular range 620 is the first preset range. When the attack function of the virtual obstacle is triggered, the virtual obstacle produces an explosion effect, and hostile virtual objects within the first preset range take a preset amount of virtual injury.
Illustratively, the magnitude of the virtual injury is related to the distance between the hostile virtual object and the virtual obstacle; in one example, the closer the hostile virtual object is to the virtual obstacle, the higher the virtual injury it receives, and the farther away it is, the lower the virtual injury it receives.
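A sketch of the explosion damage rule described above, with the first preset range taken as a circle whose diameter equals the obstacle's length; the linear falloff formula is an assumption.

```python
import math

def explosion_damage(obstacle_center, obstacle_length, target_pos, max_damage):
    """Return the virtual injury dealt to a target inside the first preset range."""
    radius = obstacle_length / 2.0                        # circle drawn with length as diameter
    dx = target_pos[0] - obstacle_center[0]
    dz = target_pos[1] - obstacle_center[1]
    distance = math.hypot(dx, dz)
    if distance > radius:
        return 0                                          # outside the first preset range
    # Linear falloff (assumed): full damage at the center, approaching zero at the edge.
    return int(max_damage * (1.0 - distance / radius))

# Example: a 6 m wall, target 1 m away, 100 base damage -> 66 damage.
print(explosion_damage((0.0, 0.0), 6.0, (1.0, 0.0), 100))
```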
In step 5071, in response to receiving an attack operation of the virtual object in the virtual environment on the virtual obstacle, the virtual life value of the virtual obstacle is reduced.
The virtual obstacle also corresponds to a virtual life value. Illustratively, when the virtual object in the virtual environment includes a hostile virtual object, an attack operation of the hostile virtual object on the virtual obstacle may cause damage to the virtual obstacle, that is, reduce a corresponding virtual life value of the virtual obstacle, and when the virtual life value is reduced to 0, the virtual obstacle disappears from the virtual environment.
In one example, there is a correspondence between the attack operation of the hostile virtual object and the virtual damage its virtual prop causes to the virtual obstacle: the higher the attack power of the virtual prop, the higher the virtual damage to the virtual obstacle. For example, if the virtual firearm held by the hostile virtual object is a virtual pistol, the damage to the virtual obstacle on a hit is low; if the virtual firearm held by the hostile virtual object is a virtual mortar, the corresponding virtual shell causes high damage to the virtual obstacle on a hit. If the virtual damage is higher than the total or remaining virtual life value of the virtual obstacle, the virtual obstacle is destroyed directly and can no longer provide the attack blocking function for the virtual object.
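The correspondence between the attacking prop's attack power and the damage dealt to the virtual obstacle can be sketched as a lookup table; the weapon names follow the example above, but the damage values are assumptions.

```python
WEAPON_DAMAGE_TO_OBSTACLE = {
    "virtual_pistol": 5,       # low attack power, low damage per hit
    "virtual_rifle": 12,
    "virtual_mortar": 400,     # shell damage high enough to destroy most obstacles outright
}

def apply_hit(obstacle_life: int, weapon: str) -> int:
    """Return the obstacle's remaining virtual life value after one hit from the given weapon."""
    damage = WEAPON_DAMAGE_TO_OBSTACLE.get(weapon, 0)
    return max(0, obstacle_life - damage)   # 0 means the obstacle is destroyed and stops blocking

# Example: a 300-point wall survives a pistol hit but not a mortar shell.
print(apply_hit(300, "virtual_pistol"))   # 295
print(apply_hit(300, "virtual_mortar"))   # 0
```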
In response to the virtual life value being cleared, the virtual obstacle is controlled to disappear from the virtual environment, step 5072.
When the virtual life value of the virtual obstacle is cleared, the virtual obstacle disappears from the virtual environment, and the attack blocking function cannot be provided for the virtual object. Illustratively, the virtual obstacle may disappear directly from the virtual environment, or trigger an explosion effect when the virtual life value is cleared, which may cause virtual damage to the virtual object within a certain range.
In step 508, in response to the display duration of the virtual obstacle reaching the target display duration, the virtual obstacle is controlled to disappear from the virtual environment.
The virtual obstacle also corresponds to a target display duration. Illustratively, if the virtual life value of the virtual obstacle has not been cleared before the target display duration ends, the virtual obstacle remains displayed in the virtual environment; if the virtual life value is cleared, the virtual obstacle disappears from the virtual environment even if its display duration has not reached the target display duration. Or the virtual obstacle corresponds only to the target display duration, that is, it is displayed in the virtual environment until its display duration reaches the target display duration and is controlled to disappear from the virtual environment in response to the display duration reaching the target display duration. Schematically, the attack function of the virtual obstacle may also be triggered after the display duration reaches the target display duration, that is, after the virtual obstacle has been displayed in the virtual environment for the target display duration, it produces an explosion effect and causes virtual injury to virtual objects within the second preset range.
Schematically, when the display duration of the virtual obstacle does not reach the target display duration, the user may control the virtual obstacle to disappear through the cancel control, that is, when the cancel control receives the trigger signal, the virtual obstacle directly disappears.
In summary, according to the operation method of the virtual throwing object provided by the embodiment of the application, the virtual obstacle is displayed in the virtual environment through the virtual throwing object, and the functions of the virtual obstacle in the virtual environment include the attack blocking function and the attack function, so that the fight effect can be realized, the tactical effect can also be realized, the operation diversity of the virtual throwing object is improved, and the functions of the virtual throwing object are enriched.
Referring to fig. 7, there is shown a method for operating a virtual projectile according to another exemplary embodiment of the present application, in which a throwing process of the virtual projectile is described, the method includes:
step 701, displaying a virtual object holding a virtual projectile.
The user can control the virtual object in the virtual environment through the terminal so as to carry out various activities in the virtual environment through the virtual object, including using the virtual projectile held by the virtual object.
In response to receiving the pre-cast operation, track indication information of the virtual projectile is displayed in the virtual environment, step 702.
Illustratively, in response to receiving a signal from the target account to use the virtual projectile, the throwing trajectory of the virtual projectile in the virtual environment is determined. That is, the terminal generates a use signal according to the received pre-throwing operation, determines the throwing trajectory of the virtual throwing object in the virtual environment according to the use signal, and displays the trajectory indication information in the virtual environment picture according to the throwing trajectory. The target account corresponds to the virtual object: the user logs in to the target account in an application on the terminal, the application provides the virtual environment, and the user can control the virtual object in the virtual environment through the terminal.
In this embodiment of the present application, referring to fig. 8, the virtual environment interface 800 displayed by the terminal includes a throwing control 810, and the user throws the virtual throwing object by clicking or pressing the throwing control 810. Illustratively, when the user clicks the throwing control 810, the virtual object displays a throwing action, the terminal displays the trajectory indication information 820 corresponding to the current throwing direction and throwing angle in the virtual environment picture, and the user can roughly determine the drop point of the virtual throwing object through the trajectory indication information 820.
Illustratively, the throwing trajectory may be determined by the terminal or by the server, without limitation. Taking the determination by the server as an example, the terminal generates a throwing signal corresponding to the throwing direction and the throwing angle, the throwing signal is sent to the server, and the server performs logic processing according to the throwing signal.
Step 703, displaying drop point indication information of the virtual projectile in the virtual environment.
Drop point indication information of the virtual projectile is determined from the trajectory indication information, the drop point indication information being used to indicate the drop point situation of the virtual projectile in the virtual environment. Illustratively, the drop point indication information may be the position of the first collision of the virtual projectile in the virtual environment when it is thrown with the throwing angle and throwing direction corresponding to the current pre-throwing operation; in one example, if the position of the first collision is on a virtual wall, the drop point indication information is displayed on the virtual wall. Alternatively, the drop point indication information may be the final drop point in the virtual environment, that is, the target position, when the virtual projectile is thrown with the throwing angle and throwing direction corresponding to the current pre-throwing operation; in one example, the position of the first collision is on a virtual wall, the virtual projectile bounces off the virtual wall, falls onto the virtual ground and collides a second time, and finally comes to rest at the point of the second collision, where the drop point indication information is displayed.
Step 704, in response to receiving the throwing operation, throwing the virtual projectile to a target location in the virtual environment.
The virtual projectile flies and collides in the virtual environment and then lands at the target position. Illustratively, the target position is determined by the server through logic processing: the server determines the collision point of the virtual projectile in the virtual environment according to the throwing signal; if the collision point lies on a trigger plane, the collision point is determined as the target position; if the collision point lies on a non-trigger plane, the virtual projectile rebounds off the non-trigger plane, continues to fly and collides at the next collision point, and so on until it lands on a trigger plane. The trigger planes include virtual planes displayed horizontally in the virtual environment, such as a virtual ground or a virtual desktop, and the non-trigger planes include virtual planes displayed vertically or obliquely in the virtual environment, such as a virtual wall surface. The server generates a feedback signal from the determined target position and returns the feedback signal to the terminal, and the terminal displays the corresponding virtual environment picture according to the feedback signal, that is, the throwing trajectory of the virtual projectile in the virtual environment and the target position corresponding to the drop point.
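For illustration only, a minimal Python sketch of the drop-point resolution just described: the projectile rebounds off non-trigger planes (vertical or slanted surfaces) until it lands on a trigger plane (a horizontal surface), and that hit point becomes the target position. The raycast callback, the surface-classification threshold, and the damping factor are assumptions rather than details taken from the patent.

from typing import Callable, Optional, Tuple

Vec3 = Tuple[float, float, float]
# raycast(position, velocity) -> (hit_point, hit_normal) of the first surface hit, or None
Raycast = Callable[[Vec3, Vec3], Optional[Tuple[Vec3, Vec3]]]

def is_trigger_plane(normal: Vec3, up_threshold: float = 0.7) -> bool:
    """Treat near-horizontal surfaces (normal pointing mostly up) as trigger planes."""
    return normal[2] >= up_threshold

def resolve_target_position(position: Vec3, velocity: Vec3, raycast: Raycast,
                            damping: float = 0.5, max_bounces: int = 8) -> Optional[Vec3]:
    for _ in range(max_bounces):
        hit = raycast(position, velocity)
        if hit is None:
            return None                      # no further collision found
        point, normal = hit
        if is_trigger_plane(normal):
            return point                     # final drop point: the target position
        # non-trigger plane: reflect the velocity about the surface normal, damp it, continue
        dot = sum(v * n for v, n in zip(velocity, normal))
        velocity = tuple(damping * (v - 2.0 * dot * n) for v, n in zip(velocity, normal))
        position = point
    return position

In a real engine the raycast callback would come from the physics system; the sketch only captures the trigger-plane versus non-trigger-plane decision.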
Step 705, displaying a virtual obstacle at the target position. When the virtual projectile has been thrown into the virtual environment and the corresponding thrown duration reaches the target trigger duration, the virtual obstacle is displayed at the target position.
As shown in fig. 9, a virtual obstacle 910 is displayed at the target position in the virtual environment interface 900, where the virtual obstacle 910 corresponds to a virtual wall whose placement direction is perpendicular to the current orientation of the virtual object. That is, as shown in fig. 10, for the virtual obstacle 1010 in the virtual environment 1000, the orientation 1020 of the virtual object is parallel to, or on the same line as, the normal direction 1030 of the virtual obstacle. In other words, the drop point of the virtual projectile determines the target position at which the virtual obstacle is generated, while the orientation of the virtual object determines the orientation of the virtual obstacle.
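For illustration only, a small Python sketch of the placement rule just described: the obstacle's normal direction is taken parallel to the virtual object's facing, so the wall itself extends perpendicular to that facing; the function name and vector conventions are assumptions.

import math

def obstacle_orientation(facing_yaw_deg: float):
    """Return (normal, along) unit vectors in the horizontal plane for the obstacle."""
    yaw = math.radians(facing_yaw_deg)
    normal = (math.cos(yaw), math.sin(yaw))    # parallel to the virtual object's orientation
    along = (-math.sin(yaw), math.cos(yaw))    # direction in which the wall extends
    return normal, along

# usage: normal, along = obstacle_orientation(30.0)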
Illustratively, the size of the virtual obstacle is determined by the surrounding environment. In one example, in the upward direction the height of the virtual obstacle generally takes its maximum value; if another obstacle, such as a roof, exists above the target position, the distance from that obstacle to the ground is taken as the height of the virtual obstacle. The lengths on the two sides are determined in a similar way: illustratively, rays are emitted from the left and right sides of the target position, and after the rays collide with other obstacles the terminal obtains the longest transverse length; if this length is greater than the preset length of the virtual obstacle, the virtual obstacle is displayed directly with the preset length, and if it is smaller than the preset length, this length is taken as the length of the virtual obstacle. Finally, the virtual obstacle is displayed with the determined height and length.
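For illustration only, a Python sketch of the sizing rule described above: the height is clamped by anything overhead (such as a roof), and the length is clamped by ray casts to the left and right of the target position, both limited by preset values; the raycast callback and the default values are assumptions.

from typing import Callable, Optional, Tuple

Vec3 = Tuple[float, float, float]
# raycast_distance(origin, direction) -> distance to the first obstacle hit, or None
RaycastDistance = Callable[[Vec3, Vec3], Optional[float]]

def size_obstacle(target: Vec3, raycast_distance: RaycastDistance,
                  left_dir: Vec3, right_dir: Vec3,
                  preset_height: float = 3.0, preset_length: float = 4.0):
    """Return (height, length) for the virtual obstacle placed at target."""
    up = (0.0, 0.0, 1.0)
    overhead = raycast_distance(target, up)
    # height: the preset maximum unless something (e.g. a roof) is closer overhead
    height = preset_height if overhead is None else min(preset_height, overhead)
    # length: cast rays to the left and right; the free transverse span caps the preset length
    half = preset_length / 2.0
    left = raycast_distance(target, left_dir)
    right = raycast_distance(target, right_dir)
    span = (left if left is not None else half) + (right if right is not None else half)
    length = min(preset_length, span)
    return height, length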
In one example, referring to fig. 11, a flowchart corresponding to the operation method of the virtual projectile is shown. The flow includes: controlling the virtual object to equip the virtual projectile 1101; determining whether the throwing control is triggered 1102; if so, throwing the virtual projectile into the virtual environment 1103; determining whether a collision with a non-trigger plane has occurred 1104; if so, rebounding and continuing to fly 1105; if not, landing at the target position, exploding and generating a virtual obstacle 1106; judging whether the virtual obstacle is destroyed 1107; if so, the virtual obstacle disappears 1108; if not, continuing to resist attack operations of hostile virtual objects 1109; judging whether the attack function of the virtual obstacle is triggered 1110; if so, the virtual obstacle generates an explosion effect and causes virtual injury to hostile virtual objects within the first preset range 1111.
In summary, in the virtual projectile operation method provided by this embodiment of the present application, in a virtual environment containing a virtual object, trajectory indication information is displayed through a pre-throwing operation, the virtual projectile is thrown to a target position in the virtual environment through a throwing operation, and a virtual obstacle is displayed at the target position. The virtual obstacle provides an attack-blocking function for the virtual object; that is, if the virtual object is located on a first side of the virtual obstacle, the virtual obstacle can block virtual injury to the virtual object coming from a second side, thereby improving the operational diversity of the virtual projectile and enriching its functions.
Referring to fig. 12, a block diagram of an operation device for a virtual throwing object according to an exemplary embodiment of the present application is shown. The device includes:
a display module 1210, configured to display a virtual object holding a virtual projectile, the virtual object being in a virtual environment;
a receiving module 1220, configured to receive a pre-throwing operation, the pre-throwing operation being used for aiming the throwing of the virtual projectile in the virtual environment;
the display module 1210 is further configured to display, based on the pre-throwing operation, trajectory indication information of the virtual projectile in the virtual environment, where the trajectory indication information is used to indicate a flight trajectory of the virtual projectile in the virtual environment;
the display module 1210 is further configured to, in response to receiving a throwing operation, throw the virtual projectile to a target location in the virtual environment;
the display module 1210 is further configured to display a virtual obstacle at the target location, the virtual obstacle configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite.
In an alternative embodiment, the virtual projectile corresponds to a target trigger duration;
the display module 1210 is further configured to display the virtual obstacle at the target location in response to the thrown duration of the virtual projectile reaching the target trigger duration.
In an alternative embodiment, the function of the virtual barrier in the virtual environment comprises an attack function;
the receiving module 1220 is further configured to receive an attack trigger signal for the virtual obstacle;
referring to fig. 13, the apparatus further includes:
the triggering module 1230 is configured to trigger the attack function of the virtual obstacle in the virtual environment based on the attack trigger signal, where the attack function is used to indicate virtual damage to a virtual object that is within a first preset range of the virtual obstacle.
In an optional embodiment, the display module 1210 is further configured to display a control in a virtual environment interface, where the control is configured to trigger the attack function of the virtual obstacle;
the receiving module 1220 is further configured to receive a triggering operation on the control;
The apparatus further comprises:
a generating module 1340 is configured to generate the attack trigger signal for the virtual obstacle based on the trigger operation.
In an optional embodiment, the display module 1210 is further configured to display the control in response to the virtual object moving to be within a second preset range of the virtual obstacle.
In an alternative embodiment, the virtual obstacle corresponds to a virtual life value;
the apparatus further comprises:
a control module 1250 for reducing the virtual life value of the virtual obstacle in response to receiving an attack operation of the virtual object in the virtual environment on the virtual obstacle;
the control module 1250 is further configured to control the disappearance of the virtual obstacle from the virtual environment in response to the virtual life value being cleared.
In an alternative embodiment, the virtual barrier corresponds to a target display duration;
the control module 1250 is further configured to control the virtual obstacle to disappear from the virtual environment in response to the display duration of the virtual obstacle reaching the target display duration;
or, the triggering module 1230 is further configured to trigger the attack function of the virtual obstacle in the virtual environment in response to the display duration of the virtual obstacle reaching the target display duration.
In an alternative embodiment, the display module 1210 further includes:
a determining unit 1211 for determining, based on the trajectory indication information, drop point indication information of the virtual projectile, the drop point indication information being for indicating a drop point condition of the virtual projectile in the virtual environment;
the display module 1210 is further configured to display drop point indication information of the virtual projectile in the virtual environment.
In summary, in the virtual projectile operation device provided by this embodiment of the present application, in a virtual environment containing a virtual object, the virtual object holding the virtual projectile is controlled to throw the virtual projectile to a target position in the virtual environment, so that a virtual obstacle is displayed at the target position. The virtual obstacle can protect the virtual object on a first side from virtual injury coming from a second side, the first side being opposite to the second side, thereby improving the operational diversity of the virtual projectile and enriching its functions.
It should be noted that the virtual projectile operation device provided in the above embodiment is illustrated only by the division into the functional modules described above. In practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the virtual projectile operation device provided in the above embodiment and the embodiments of the virtual projectile operation method belong to the same concept; the detailed implementation process is described in the method embodiments and is not repeated here.
Fig. 14 shows a block diagram of a terminal 1400 provided by an exemplary embodiment of the present invention. The terminal 1400 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1400 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, or the like.
In general, terminal 1400 includes: a processor 1401 and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1401 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1401 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1402 is used to store at least one instruction, the at least one instruction being executed by processor 1401 to implement the virtual projectile operation method provided by the method embodiments of the present application.
In some embodiments, terminal 1400 may optionally further include: a peripheral interface 1403 and at least one peripheral. The processor 1401, memory 1402, and peripheral interface 1403 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1403 via buses, signal lines or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a display screen 1405, a camera assembly 1406, audio circuitry 1407, and a power source 1409.
Peripheral interface 1403 may be used to connect at least one Input/Output (I/O) related peripheral to processor 1401 and memory 1402. In some embodiments, processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of processor 1401, memory 1402, and peripheral interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1404 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1404 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1404 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the world wide web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1401 as a control signal for processing. At this time, the display screen 1405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1405, providing the front panel of the terminal 1400; in other embodiments, there may be at least two display screens 1405, respectively disposed on different surfaces of the terminal 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 1400. The display screen 1405 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly shaped screen. The display screen 1405 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1406 is used to capture images or video. Optionally, the camera assembly 1406 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function by fusing the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting functions by fusing the main camera and the wide-angle camera, or other fused shooting functions. In some embodiments, the camera assembly 1406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuitry 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing, or inputting the electric signals to the radio frequency circuit 1404 for voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be provided at different portions of the terminal 1400, respectively. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuitry 1407 may also include a headphone jack.
A power supply 1409 is used to power the various components in terminal 1400. The power supply 1409 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 1409 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyro sensor 1412, pressure sensor 1413, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitudes of accelerations on three coordinate axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1401 may control the touch display 1405 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1412 may detect the body direction and rotation angle of the terminal 1400, and may cooperate with the acceleration sensor 1411 to collect the user's 3D actions on the terminal 1400. The processor 1401 may implement the following functions based on the data collected by the gyro sensor 1412: motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
Pressure sensor 1413 may be disposed on a side frame of terminal 1400 and/or on an underlying layer of touch screen 1405. When the pressure sensor 1413 is provided at a side frame of the terminal 1400, a grip signal of the terminal 1400 by a user can be detected, and the processor 1401 performs right-and-left hand recognition or quick operation according to the grip signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed at the lower layer of the touch screen 1405, the processor 1401 realizes control of the operability control on the UI interface according to the pressure operation of the user on the touch screen 1405. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1415 is used to collect the ambient light intensity. In one embodiment, the processor 1401 may control the display brightness of the touch screen 1405 based on the intensity of ambient light collected by the optical sensor 1415. Specifically, when the intensity of the ambient light is high, the display brightness of the touch display screen 1405 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 1405 is turned down. In another embodiment, the processor 1401 may also dynamically adjust the shooting parameters of the camera assembly 1406 based on the ambient light intensity collected by the optical sensor 1415.
A proximity sensor 1416, also referred to as a distance sensor, is typically provided on the front panel of terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front surface of the terminal 1400 gradually decreases, the processor 1401 controls the touch display 1405 to switch from the bright screen state to the off screen state; when the proximity sensor 1416 detects that the distance between the user and the front surface of the terminal 1400 gradually increases, the touch display 1405 is controlled by the processor 1401 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 14 is not limiting and that terminal 1400 may include more or less components than those illustrated, or may combine certain components, or employ a different arrangement of components.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing related hardware, and the program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium included in the memory of the above embodiments, or may be a computer-readable storage medium that exists separately and is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the virtual projectile operation method described in any of the above embodiments.
Alternatively, the computer-readable storage medium may include a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), a solid state drive (SSD, Solid State Drive), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM, Resistance Random Access Memory) and a dynamic random access memory (DRAM, Dynamic Random Access Memory). The foregoing serial numbers of the embodiments of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit the present application; any modifications, equivalent replacements, improvements, and the like made within the spirit and principle of the present application shall fall within the protection scope of the present application.

Claims (13)

1. A method of operating a virtual projectile, the method comprising:
displaying a virtual object holding a virtual projectile, the virtual object being in a virtual environment;
receiving a pre-throwing operation for aiming a throw of the virtual projectile in the virtual environment;
displaying track indication information of the virtual projectile in the virtual environment based on the pre-throwing operation, the track indication information being used for indicating a flight track of the virtual projectile in the virtual environment;
in response to receiving a throwing operation, throwing the virtual projectile to a target location in the virtual environment;
displaying a virtual obstacle at the target location, the virtual obstacle for blocking virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side and the second side are opposite, the function of the virtual obstacle in the virtual environment comprising an attack function;
receiving an attack trigger signal for the virtual obstacle, wherein the sending of the attack trigger signal comprises at least one of: triggering through a control, triggering when the virtual obstacle disappears, and triggering when the virtual object approaches within a preset range of the virtual obstacle;
Triggering the attack function of the virtual barrier in the virtual environment based on the attack trigger signal, wherein the attack function is used for indicating virtual damage to the virtual object in the first preset range of the virtual barrier.
2. The method of claim 1, wherein the virtual projectile corresponds to a target trigger time period;
the displaying a virtual obstacle at the target location includes:
and displaying the virtual obstacle at the target position in response to the thrown time of the virtual projectile reaching the target trigger time.
3. The method of claim 1, wherein the receiving an attack trigger signal for the virtual obstacle comprises:
displaying the control in a virtual environment interface, wherein the control is used for triggering the attack function of the virtual barrier;
receiving a triggering operation on the control;
the attack trigger signal to the virtual barrier is generated based on the trigger operation.
4. The method of claim 3, wherein the displaying the control in a virtual environment interface comprises:
displaying the control in response to the virtual object moving into the second preset range of the virtual obstacle.
5. The method according to any one of claims 1 to 4, wherein the virtual obstacle corresponds to a virtual life value;
after the virtual obstacle is displayed at the target position in the virtual environment, the method further comprises:
in response to receiving an attack operation of a virtual object in the virtual environment on the virtual obstacle, reducing the virtual life value of the virtual obstacle;
and controlling the virtual obstacle to disappear from the virtual environment in response to the virtual life value being cleared.
6. The method of any one of claims 1 to 4, wherein the virtual obstacle corresponds to a target display duration;
after the virtual obstacle is displayed at the target position, the method further comprises:
controlling the virtual obstacle to disappear from the virtual environment in response to the display duration of the virtual obstacle reaching the target display duration;
or, in response to the display duration of the virtual obstacle reaching the target display duration, triggering the attack function of the virtual obstacle in the virtual environment.
7. The method of claim 1, wherein the displaying the track indication information of the virtual projectile in the virtual environment based on the pre-throwing operation further comprises:
determining drop point indication information of the virtual throwing object based on the track indication information, wherein the drop point indication information is used for indicating the drop point condition of the virtual throwing object in the virtual environment;
and displaying the drop point indication information of the virtual throwing object in the virtual environment.
8. An apparatus for manipulating a virtual projectile, the apparatus comprising:
a display module, used for displaying a virtual object holding a virtual throwing object, the virtual object being in a virtual environment;
a receiving module, used for receiving a pre-throwing operation for aiming a throwing of the virtual throwing object in the virtual environment;
the display module is further used for displaying track indication information of the virtual throwing object in the virtual environment based on the pre-throwing operation, wherein the track indication information is used for indicating the flight track of the virtual throwing object in the virtual environment;
the display module is further used for throwing the virtual throwing object to a target position in the virtual environment in response to receiving throwing operation;
The display module is further configured to display a virtual obstacle at the target location, the virtual obstacle being configured to block virtual injury to the virtual object from a second side in response to the virtual object being located on a first side of the virtual obstacle, wherein the first side is opposite the second side, and wherein the function of the virtual obstacle in the virtual environment comprises an attack function;
receiving an attack trigger signal for the virtual obstacle, wherein the sending of the attack trigger signal comprises at least one of: triggering through a control, triggering when the virtual obstacle disappears, and triggering when the virtual object approaches within a preset range of the virtual obstacle;
triggering the attack function of the virtual barrier in the virtual environment based on the attack trigger signal, wherein the attack function is used for indicating virtual damage to the virtual object in the first preset range of the virtual barrier.
9. The apparatus of claim 8, wherein the virtual projectile corresponds to a target trigger time period;
the display module is further configured to display the virtual obstacle at the target position in response to the thrown duration of the virtual throwing object reaching the target trigger duration.
10. The apparatus of claim 8, wherein
the display module is further configured to display the control in a virtual environment interface, where the control is configured to trigger the attack function of the virtual obstacle;
the receiving module is also used for receiving triggering operation on the control;
the apparatus further comprises:
and the generation module is used for generating the attack trigger signal to the virtual barrier based on the trigger operation.
11. A computer device comprising a processor and a memory, the memory having stored therein at least one program that is loaded and executed by the processor to implement the method of operating a virtual projectile in accordance with any one of claims 1 to 7.
12. A computer readable storage medium having stored therein at least one program code loaded and executed by a processor to implement the method of operating a virtual projectile in accordance with any one of claims 1 to 7.
13. A computer program product comprising a computer program or instructions which when executed by a processor implements a method of operating a virtual projectile as claimed in any one of claims 1 to 7.
CN202110227863.2A 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium Active CN112933601B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110227863.2A CN112933601B (en) 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110227863.2A CN112933601B (en) 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN112933601A CN112933601A (en) 2021-06-11
CN112933601B true CN112933601B (en) 2023-05-16

Family

ID=76247043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110227863.2A Active CN112933601B (en) 2021-03-01 2021-03-01 Virtual throwing object operation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112933601B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113318438B (en) * 2021-06-30 2023-08-15 北京字跳网络技术有限公司 Virtual prop control method, device, equipment and computer readable storage medium
CN113546424A (en) * 2021-08-04 2021-10-26 网易(杭州)网络有限公司 Virtual resource use control method and device, computer equipment and storage medium
CN116899220A (en) * 2021-09-24 2023-10-20 腾讯科技(深圳)有限公司 Track display method and device, storage medium and electronic equipment
CN114939275A (en) * 2022-05-24 2022-08-26 北京字跳网络技术有限公司 Object interaction method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111443857B (en) * 2020-03-12 2021-05-25 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN112933601A (en) 2021-06-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40045962

Country of ref document: HK

GR01 Patent grant